Computational Intelligence for Condition Monitoring

Computational Intelligence for Condition Monitoring

Condition monitoring techniques are described in this chapter. Two aspects of condition monitoring process are considered: (1) feature extraction; and (2) condition classification. Feature extraction methods described and implemented are fractals, Kurtosis and Mel-frequency Cepstral Coefficients. Classification methods described and implemented are support vector machines (SVM), hidden Markov models (HMM), Gaussian mixture models (GMM) and extension neural networks (ENN). The effectiveness of these features were tested using SVM, HMM, GMM and ENN on condition monitoring of bearings and are found to give good results.


💡 Research Summary

The chapter presents a comprehensive study on condition monitoring of rolling‑element bearings using computational intelligence techniques. It focuses on two essential stages: feature extraction and fault classification. Three feature extraction methods are investigated: fractal dimension, kurtosis, and Mel‑frequency cepstral coefficients (MFCC). Fractal dimension quantifies the signal’s self‑similarity and captures non‑linear complexity associated with defect progression. Kurtosis, a fourth‑order statistical moment, highlights impulsive peaks that arise from localized damage. MFCC, originally devised for speech processing, provides a compact spectral representation by applying a Mel‑scaled filter bank to short‑time Fourier transforms and then taking the logarithm and discrete cosine transform of the filter‑bank energies. The authors evaluate each feature individually and in combination, demonstrating that a fused feature vector improves discrimination among normal, outer‑race, inner‑race, and lubrication‑deficiency conditions.

Four classification algorithms are implemented and compared: Support Vector Machines (SVM) with a radial‑basis‑function kernel, Hidden Markov Models (HMM) that model temporal state transitions, Gaussian Mixture Models (GMM) that approximate the data distribution with multiple Gaussian components, and Extension Neural Networks (ENN), an enhanced multilayer perceptron with adaptive learning‑rate and weight‑initialization strategies. For each classifier, the training procedure, hyper‑parameter selection (e.g., C and γ for SVM, number of states for HMM, number of mixtures for GMM, hidden‑layer size for ENN), and computational considerations are described in detail.

Experimental validation uses a publicly available bearing data set collected from an industrial test rig. The data comprise four classes—healthy, outer‑race fault, inner‑race fault, and insufficient lubrication—with 200 vibration recordings per class. The recordings are segmented into frames, pre‑processed, and then transformed into the three feature sets. A 70/15/15 split is employed for training, validation, and testing, and performance is assessed using accuracy, precision, recall, F1‑score, and ROC‑AUC.

Results show that the combination of fractal dimension and kurtosis yields a 4–5 % improvement over using either feature alone, and adding MFCC provides an additional 2 % gain. Among classifiers, SVM achieves the highest overall accuracy (≈95.3 %) and AUC (0.987), reflecting its strong margin‑maximization capability in high‑dimensional feature spaces. ENN follows closely with 93.8 % accuracy and demonstrates fast convergence suitable for real‑time monitoring. HMM excels in detecting early‑stage faults, especially lubrication deficiencies, by leveraging temporal continuity, achieving the highest recall (≈92 %) for that class. GMM, while slightly less accurate due to overlapping class distributions, offers probabilistic outputs that can be used to construct confidence intervals for maintenance decisions.

The discussion highlights the trade‑offs of each method. Fractal analysis is powerful for non‑linear damage but computationally intensive; kurtosis is simple but sensitive to noise; MFCC efficiently reduces dimensionality but requires careful selection of frame length and filter‑bank parameters. SVM provides robust classification at the cost of higher training time for large data sets; HMM captures sequential dynamics but demands extensive model tuning; ENN offers lightweight, adaptive learning suitable for embedded platforms; GMM supplies a probabilistic framework but may overfit in high‑dimensional spaces. The authors conclude that the integrated feature‑classifier framework outperforms traditional vibration‑only approaches and is ready for deployment in predictive maintenance systems.

Future work is outlined to include deep‑learning‑based automatic feature learning, online adaptive model updating, and multimodal sensor fusion (e.g., acoustic emission, temperature) to further enhance fault detection reliability and to enable a comprehensive health‑monitoring ecosystem for rotating machinery.