Entropy Measures in Machine Fault Diagnosis: Insights and Applications
Top 1% of 2020 papers
Abstract
Entropy, as a complexity measure, has been widely applied to time series analysis. One preeminent example is the design of machine condition monitoring and industrial fault-diagnostic systems. The occurrence of failures in a machine typically introduces nonlinear characteristics into the measurements, caused by instantaneous variations, which increase the complexity of the system response. Entropy measures are well suited to quantifying such dynamic changes in the underlying process and thereby distinguishing between different system conditions. However, notions of entropy are defined differently in various contexts (e.g., information theory and dynamical systems theory), which may confound researchers in the applied sciences. In this article, we systematically review the theoretical development of several fundamental entropy measures and clarify the relations among them. We then summarize typical entropy-based applications in machine fault-diagnostic systems. Furthermore, we explain where and how these measures can be useful in future data-driven fault-diagnosis methodologies. Finally, we discuss potential research trends in this area, with the intent of improving online entropy estimation and expanding its applicability to a wider range of intelligent fault-diagnostic systems.
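To make the central idea concrete, the following is a minimal sketch (not taken from the reviewed paper) of one widely used measure in this family, sample entropy: a regular, predictable signal yields a low value, while an irregular signal, such as one containing fault-induced transients, yields a higher value. The parameter choices `m = 2` and `r = 0.2` times the standard deviation are conventional defaults, assumed here for illustration.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy: -ln(A/B), where B counts pairs of length-m templates
    and A counts pairs of length-(m+1) templates whose Chebyshev distance
    is below the tolerance r (self-matches excluded)."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()  # conventional tolerance: 20% of the signal's std
    n = len(x)

    def count_matches(mm):
        # All overlapping templates of length mm.
        templates = np.array([x[i:i + mm] for i in range(n - mm)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance to every later template only,
            # so each unordered pair is counted once and i != j.
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(d < r))
        return count

    B = count_matches(m)
    A = count_matches(m + 1)
    # Undefined (infinite) when no matches are found at either length.
    return -np.log(A / B) if A > 0 and B > 0 else float("inf")

# A periodic signal is far more predictable than white noise,
# so its sample entropy should be markedly lower.
rng = np.random.default_rng(0)
t = np.linspace(0, 8 * np.pi, 400)
se_periodic = sample_entropy(np.sin(t))
se_noise = sample_entropy(rng.standard_normal(400))
```

In a fault-diagnosis pipeline, such a value computed over sliding windows of a vibration signal can serve as a feature that separates healthy from faulty operating conditions.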