Academic Paper

Deep Residual Multiscale Convolutional Neural Network With Attention Mechanism for Bearing Fault Diagnosis Under Strong Noise Environment
Document Type
Periodical
Source
IEEE Sensors Journal, 24(6):9073-9081, Mar. 2024
Subject
Signal Processing and Analysis
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Robotics and Control Systems
Convolution
Feature extraction
Convolutional neural networks
Fault diagnosis
Vibrations
Sensors
Computer architecture
Attention mechanism
bearing fault diagnosis
convolutional neural networks
intelligent fault diagnosis
strong noise
Language
English
ISSN
1530-437X
1558-1748
2379-9153
Abstract
In recent years, deep learning (DL) methods have achieved considerable success in intelligent fault diagnosis. However, because working conditions vary and noise is inevitable, the performance of previous models degrades severely. To address the challenge of bearing fault detection in strong noise environments, this article proposes a novel antinoise deep residual multiscale convolutional neural network with an attention mechanism, named Attention-MSCNN. First, dynamic dropout is used to improve noise robustness by introducing artificial noise into the training process. In addition, we design a residual connection between the input and the convolved features to fully capture the characteristics of the initial input. Finally, a novel denoised multihead attention mechanism is applied to remove excess noise from the raw input and to capture relationships across long time series. The experimental results show that Attention-MSCNN achieves robust performance under strong noise, with over 85% accuracy on the Case Western Reserve University (CWRU) dataset. On a self-collected two-stage gear-drive test bench, the model achieves over 99% accuracy in a strong noise environment. Thus, Attention-MSCNN addresses the low detection accuracy of previous models under strong noise.
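To make the three ideas summarized in the abstract concrete, the following is a minimal sketch, assuming a PyTorch implementation, of a generic multiscale 1-D convolutional block with a residual connection to the input, multi-head self-attention, and artificial noise injection during training. All layer names, kernel sizes, and the noise level are illustrative assumptions, not the authors' actual Attention-MSCNN code.

```python
# Illustrative sketch only: multiscale 1-D convolutions, an input residual
# connection, multi-head self-attention, and training-time noise injection.
# Kernel sizes, channel counts, and noise_std are assumptions for illustration.
import torch
import torch.nn as nn


class MultiscaleAttentionBlock(nn.Module):
    def __init__(self, in_channels: int = 1, channels: int = 16, heads: int = 4):
        super().__init__()
        # Parallel convolutions with different kernel sizes extract
        # multiscale features from the raw vibration signal.
        self.branches = nn.ModuleList(
            [nn.Conv1d(in_channels, channels, k, padding=k // 2) for k in (3, 7, 15)]
        )
        # 1x1 convolution projects the raw input so it can be added back
        # to the convolved features (residual connection to the input).
        self.skip = nn.Conv1d(in_channels, 3 * channels, 1)
        self.attn = nn.MultiheadAttention(3 * channels, heads, batch_first=True)

    def forward(self, x: torch.Tensor, noise_std: float = 0.0) -> torch.Tensor:
        # Artificial noise is injected only during training, loosely
        # mimicking the paper's idea of improving noise robustness.
        if self.training and noise_std > 0:
            x = x + noise_std * torch.randn_like(x)
        feats = torch.cat([b(x) for b in self.branches], dim=1) + self.skip(x)
        seq = feats.transpose(1, 2)          # (batch, time, channels)
        out, _ = self.attn(seq, seq, seq)    # self-attention over the sequence
        return out


# Usage: a batch of 8 single-channel vibration segments of length 1024.
signal = torch.randn(8, 1, 1024)
block = MultiscaleAttentionBlock()
features = block(signal, noise_std=0.05)
print(features.shape)  # torch.Size([8, 1024, 48])
```

In this sketch, the 1x1 projection stands in for whatever mapping the paper uses to match dimensions between the raw input and the multiscale feature maps; the denoising behavior of the paper's attention mechanism is not reproduced here.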