2019 PHM Experimental Design for Effective State Separation Using Jensen–Shannon Divergence
- Journal: Reliability Engineering & System Safety
- Date: 2019-10
- Citation Index: SCIE (IF: 9.4, Rank: 3.3%)
- Vol./Page: Vol. 190, Article 106503
- Year: 2019
- Link: http://doi.org/10.1016/j.ress.2019.106503
Abstract
The most critical aspect of system condition diagnosis is to detect how the signal changes as the system degrades and to classify the signal into the proper state. Most condition diagnosis studies to date have focused on accurate classification of the tangled data set; these studies have paid less attention to experimental design, which determines the quality of the signal. This study deals with experimental design for condition diagnosis that allows data separation between different health conditions at the sensing step. Data separation helps reduce the cost of follow-up data processing, such as classifier development. For this purpose, an experimental design method is proposed from an information-theoretic point of view to find the sensing locations that give the best information for data separation. To measure the goodness of a design point, the Jensen–Shannon (JS) divergence is adopted as the utility function. The JS divergence is relevant to condition diagnosis in three ways. First, the JS divergence effectively measures the separation between two distributions, which is the desired property for condition diagnosis. Second, the boundedness of the JS divergence enables balanced separation between the states. Third, the symmetry of the JS divergence makes it suitable for measuring the overall separation between states. With the JS divergence as the utility function, a greedy algorithm combined with a Gaussian process is used for efficient optimization. The greedy algorithm chooses one design point at a time, so the required computation increases linearly with the number of sensors. At each optimization iteration, the joint distribution is required to calculate the JS divergence; the Gaussian process allows the JS divergence to be obtained by calculating only the univariate distribution for the newly added measurement location, reusing the previously constructed joint distribution of the existing measurement locations. Finally, the proposed method is validated through two case studies: the first is a simple mathematical problem that shows the general effectiveness of the JS divergence and the overall process of the proposed method; the second applies the method to gearbox condition diagnosis to demonstrate the use of the JS divergence in a practical application.
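To make the role of the utility function concrete, the sketch below (not from the paper) scores a few hypothetical candidate sensing locations by the JS divergence between two assumed health-state response distributions and greedily picks the most separating one. The location names, the univariate Gaussian state distributions, and the grid-based numerical integration are illustrative assumptions; the paper itself works with Gaussian-process joint distributions over the selected measurement locations rather than independent per-location Gaussians.

```python
# Illustrative sketch only: score hypothetical sensing locations by the
# Jensen-Shannon divergence between assumed "healthy" and "faulty" response
# distributions, then greedily pick the best one. The paper's method instead
# uses Gaussian-process joint distributions and adds one location per iteration.
import numpy as np
from scipy.stats import norm

def js_divergence(p, q, grid):
    """JS divergence (in nats) between two densities evaluated on a uniform grid."""
    dx = grid[1] - grid[0]
    m = 0.5 * (p + q)
    eps = 1e-12  # guards against log(0) where a density vanishes
    kl_pm = np.sum(p * np.log((p + eps) / (m + eps))) * dx
    kl_qm = np.sum(q * np.log((q + eps) / (m + eps))) * dx
    return 0.5 * kl_pm + 0.5 * kl_qm  # bounded above by log(2) ~= 0.693 nats

# Hypothetical candidate sensing locations, each with an assumed response
# distribution under a healthy state and a faulty state.
candidates = {
    "loc_A": (norm(0.0, 1.0), norm(0.5, 1.0)),  # small mean shift
    "loc_B": (norm(0.0, 1.0), norm(2.0, 1.2)),  # large mean shift
    "loc_C": (norm(0.0, 1.0), norm(0.0, 3.0)),  # variance change only
}

grid = np.linspace(-12.0, 12.0, 4001)
scores = {
    name: js_divergence(healthy.pdf(grid), faulty.pdf(grid), grid)
    for name, (healthy, faulty) in candidates.items()
}

# Greedy step: pick the location whose two state distributions are most
# separated; repeating such a step adds one design point at a time.
best = max(scores, key=scores.get)
print(scores)
print("selected:", best)
```

The two properties highlighted in the abstract can be checked directly with this function: swapping the two densities returns the same value (symmetry), and even for widely separated distributions the score never exceeds log 2 in nats (boundedness), which is what keeps the separation between states balanced.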