Journal of Process Control, Vol. 59, pp. 28-36, 2017
A modified Kullback divergence for direct fault detection in large scale systems
The present paper deals with the problem of fault detection in highly coupled, large-scale industrial systems that typically operate in noisy environments. The detection algorithm is based on a non-parametric approximation of a modified Kullback-Leibler divergence. Because the dissimilarity between multidimensional probability densities can be quantified by a scalar quantity belonging to the f-divergence family, multidimensional process monitoring reduces to the simpler task of detecting any deviation from normal operation in this one-dimensional signal. With the modified Kullback-Leibler distance, faults can be detected directly, without a normality assumption or the joint monitoring of related test statistics in different subspaces. However, such metrics depend strongly on density estimates, which are inherently difficult and cumbersome to obtain in high-dimensional problems and may therefore yield unreliable measures; an alternative way to tackle the problem is to estimate the ratio of the densities rather than the densities themselves. The objective of this work is to exploit this approach for detecting abnormalities in real industrial systems and to confirm its applicability on the industrial benchmark of the Tennessee Eastman process. (C) 2017 Elsevier Ltd. All rights reserved.
Keywords: FDI; Change-point detection; Kernel methods; Density ratio; Kullback-Leibler; Tennessee Eastman process; Machine learning
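As a rough illustration of the density-ratio idea described in the abstract, the following Python sketch estimates a Kullback-Leibler divergence between a test window and a fault-free reference window by modelling the density ratio directly with a KLIEP-style Gaussian kernel expansion, so the individual densities are never estimated. The function names, kernel width, number of centres, and optimisation settings are illustrative assumptions and do not reproduce the authors' implementation.

import numpy as np

def gaussian_kernel(X, C, sigma):
    # Pairwise Gaussian kernel values between rows of X and centres C.
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def kliep_kl_divergence(X_test, X_ref, sigma=1.0, n_centres=50,
                        lr=1e-3, n_iter=2000, seed=0):
    """Estimate KL(p_test || p_ref) via direct density-ratio estimation.

    The ratio r(x) = p_test(x) / p_ref(x) is modelled as a non-negative
    combination of Gaussian kernels centred on test samples, following
    the KLIEP approach (a sketch, not the paper's exact algorithm).
    """
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X_test), size=min(n_centres, len(X_test)),
                     replace=False)
    centres = X_test[idx]
    Phi_te = gaussian_kernel(X_test, centres, sigma)  # basis on test window
    Phi_re = gaussian_kernel(X_ref, centres, sigma)   # basis on reference data
    alpha = np.ones(centres.shape[0])

    for _ in range(n_iter):
        # Projected gradient ascent on the mean log-ratio over the test window.
        r_te = Phi_te @ alpha
        grad = Phi_te.T @ (1.0 / r_te) / len(X_test)
        alpha = np.maximum(alpha + lr * grad, 0.0)
        # Enforce the normalisation constraint E_ref[r(x)] = 1.
        alpha /= (Phi_re @ alpha).mean()

    # Estimated KL divergence: mean log-ratio over the test window.
    return float(np.mean(np.log(Phi_te @ alpha + 1e-12)))

In use, this scalar would be computed on sliding windows of the multivariate process data: values near zero are consistent with normal operation, while a sustained increase of the statistic flags a deviation, which is the one-dimensional monitoring signal the abstract refers to.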