Academic Article

Boosting Inertial-Based Human Activity Recognition With Transformers
Document Type
Periodical
Source
IEEE Access, vol. 9, pp. 53540-53547, 2021
Subject
Aerospace
Bioengineering
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Computing and Processing
Engineered Materials, Dielectrics and Plasmas
Engineering Profession
Fields, Waves and Electromagnetics
General Topics for Engineers
Geoscience
Nuclear Engineering
Photonics and Electrooptics
Power, Energy and Industry Applications
Robotics and Control Systems
Signal Processing and Analysis
Transportation
Legged locomotion
Task analysis
Activity recognition
Belts
Stairs
Accelerometers
Magnetic heads
Human activity recognition
smartphone location recognition
inertial sensors
pedestrian dead reckoning
convolutional neural networks
Transformers
sequence analysis
Language
English
ISSN
2169-3536
Abstract
Activity recognition problems, such as human activity recognition and smartphone location recognition, can improve the accuracy of navigation and healthcare tasks that rely solely on inertial sensors. Current learning-based approaches for activity recognition from inertial data employ convolutional neural networks or long short-term memory architectures. Recently, Transformers were shown to outperform these architectures on sequence analysis tasks. This work presents an activity recognition model based on Transformers, which offers an improved and general framework for learning activity recognition tasks. For evaluation purposes, several datasets, comprising more than 27 hours of inertial data recorded by 91 users, are employed. These datasets represent different user activity scenarios of varying difficulty. The proposed approach consistently achieves better accuracy and generalizes better across all examined datasets and scenarios. A codebase implementing the described framework is available at: https://github.com/yolish/har-with-imu-transformer.
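The abstract describes classifying windows of inertial (accelerometer/gyroscope) data with a Transformer rather than a CNN or LSTM. As a rough illustration of the core idea, the sketch below runs a single self-attention layer over an IMU window and pools it for classification. This is a minimal NumPy sketch, not the authors' implementation (which is at the linked repository); all sizes, variable names, and the random weights are hypothetical, and a real model would add positional encodings, multi-head attention, feed-forward blocks, and trained parameters.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, Wq, Wk, Wv):
    # x: (seq_len, d_model) -- embedded IMU window.
    # Scaled dot-product self-attention, single head.
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = softmax(q @ k.T / np.sqrt(k.shape[-1]))
    return scores @ v  # (seq_len, d_model)

# Hypothetical sizes: 128-sample window of 6 channels
# (3-axis accelerometer + 3-axis gyroscope), 5 activity classes.
rng = np.random.default_rng(0)
seq_len, d_in, d_model, n_classes = 128, 6, 16, 5
window = rng.standard_normal((seq_len, d_in))

# Random (untrained) weights, for shape illustration only.
W_embed = rng.standard_normal((d_in, d_model))
Wq, Wk, Wv = (rng.standard_normal((d_model, d_model)) for _ in range(3))
W_out = rng.standard_normal((d_model, n_classes))

h = self_attention(window @ W_embed, Wq, Wk, Wv)  # (seq_len, d_model)
logits = h.mean(axis=0) @ W_out  # mean-pool over time, then classify
probs = softmax(logits)          # per-class probabilities, shape (5,)
```

Unlike an LSTM, every time step here attends to every other step in one operation, which is what makes Transformers attractive for such sequence analysis tasks.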