Academic Paper

Deep Transfer Learning With Self-Attention for Industry Sensor Fusion Tasks
Document Type
Periodical
Source
IEEE Sensors Journal, 22(15):15235-15247, Aug. 2022
Subject
Signal Processing and Analysis
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Robotics and Control Systems
Sensors
Sensor fusion
Feature extraction
Deep learning
Task analysis
Transfer learning
Transformers
deep learning
natural language processing
sensor fusion
sensor data processing
smart manufacturing
Language
English
ISSN
1530-437X
1558-1748
2379-9153
Abstract
Monitoring of complex industrial processes can be achieved by acquiring process data through various sensing modalities. The recent emergence of deep learning provides a new route for processing multi-sensor information. However, the learning ability of shallow neural networks is insufficient, while the amount of data required by deep networks is often too large for industrial scenarios. This paper proposes a novel deep transfer learning method that retains the stronger learning ability of deep networks without requiring a large amount of training data. It shows how a Transformer with self-attention, pretrained on natural language, can be transferred to the sensor fusion task. The proposed method is evaluated on three datasets: condition monitoring of a hydraulic system, a bearing dataset, and a gearbox dataset. The results show that a Transformer pretrained on natural language can effectively reduce the amount of data required to apply deep learning to industrial sensor fusion while achieving high prediction accuracy. Laborious and uncertain manual feature engineering can also be eliminated, since the deep network extracts features automatically. In addition, the self-attention mechanism of the Transformer helps identify critical sensors, improving the interpretability of deep learning in industrial sensor fusion.
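The abstract's interpretability claim rests on reading self-attention weights as sensor-importance scores. The following is a minimal sketch of that idea only, not the paper's actual architecture: treating each sensor's embedding as one token, computing single-head self-attention, and ranking sensors by the average attention they receive. All names, shapes, and the random inputs are illustrative assumptions.

```python
import numpy as np

def self_attention_weights(X, Wq, Wk):
    """Single-head self-attention matrix over sensor tokens.

    X  : (n_sensors, d) per-sensor feature embeddings (one token per sensor)
    Wq : (d, d) query projection; Wk : (d, d) key projection (hypothetical)
    Returns A of shape (n_sensors, n_sensors); each row sums to 1.
    """
    Q, K = X @ Wq, X @ Wk
    scores = Q @ K.T / np.sqrt(K.shape[1])        # scaled dot-product scores
    e = np.exp(scores - scores.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)       # row-wise softmax

rng = np.random.default_rng(0)
n_sensors, d = 6, 8                               # illustrative sizes
X = rng.normal(size=(n_sensors, d))
Wq = rng.normal(size=(d, d))
Wk = rng.normal(size=(d, d))

A = self_attention_weights(X, Wq, Wk)
# Average attention each sensor receives across all queries,
# used here as a crude importance score:
importance = A.mean(axis=0)
ranking = importance.argsort()[::-1]              # most- to least-attended
print(ranking)
```

In a trained model the projections would be learned and the embeddings would come from real sensor readings; averaging received attention is one common (but not the only) way to turn attention maps into per-sensor relevance scores.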