Journal Article

MarNASNets: Toward CNN Model Architectures Specific to Sensor-Based Human Activity Recognition
Document Type
Periodical
Source
IEEE Sensors Journal, 23(16):18708-18717, Aug. 2023
Subject
Signal Processing and Analysis
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Robotics and Control Systems
Convolutional neural networks
Human activity recognition
Computer architecture
Computational modeling
Sensors
Smart phones
Image recognition
Convolutional neural network (CNN)
deep learning (DL)
human activity recognition (HAR)
neural architecture search (NAS)
Language
English
ISSN
1530-437X
1558-1748
2379-9153
Abstract
Deep learning (DL) models for sensor-based human activity recognition (HAR) are still in their nascent stages compared with those for image recognition. HAR inference is generally performed on edge devices such as smartphones to preserve privacy. However, lightweight DL models for HAR that meet these hardware constraints are lacking. In this study, we used neural architecture search (NAS) to investigate effective DL model architectures that can run inference on smartphones. We designed multiple search spaces covering the convolution type, convolution kernel size, skip-operation type, number of layers, and number of output filters, and explored them with Bayesian optimization. We propose the resulting models, mobile-aware convolutional neural networks (CNNs) for sensor-based HAR by NAS (MarNASNets). We constructed four MarNASNet networks, MarNASNet-A to -D, each with a different model size, using four patterns of parameter search spaces. Experimental results show that MarNASNets achieve the same accuracy as existing CNN architectures with fewer parameters and are effective model architectures for on-device, sensor-based HAR. We also developed Activitybench, an iOS app for measuring model performance on smartphones, and evaluated the on-device performance of each model. The MarNASNet search achieved accuracy comparable to that of existing CNN models with smaller model sizes. MarNASNet-C achieved accuracies of 92.60%, 94.52%, and 88.92% on the Human Activity Sensing Consortium (HASC), UCI, and Wireless Sensor Data Mining (WISDM) datasets, respectively. For HASC and UCI in particular, MarNASNet-C achieved the highest accuracy despite its small model size. The MarNASNets' latency was also comparable to that of the existing CNN models, enabling real-time on-device inference.
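The abstract only outlines the approach, but the search it describes, choosing the convolution type, kernel size, skip operation, number of layers, and number of output filters via Bayesian optimization, can be illustrated with a short sketch. The sketch below uses Optuna's TPE sampler and Keras 1-D convolutions; the library choice, value ranges, input window shape, and synthetic data are assumptions made for illustration, not the authors' actual MarNASNet search configuration.

import numpy as np
import optuna
import tensorflow as tf

# Synthetic stand-in for HAR sensor windows: 128 windows of 256 time steps x 3 accelerometer axes.
x_train = np.random.randn(128, 256, 3).astype("float32")
y_train = np.random.randint(0, 6, size=128)

def build_model(trial, num_classes=6, input_shape=(256, 3)):
    # Search-space dimensions named in the abstract; the concrete value ranges are illustrative.
    conv_type = trial.suggest_categorical("conv_type", ["standard", "separable"])
    kernel_size = trial.suggest_categorical("kernel_size", [3, 5, 7])
    skip_op = trial.suggest_categorical("skip_op", ["none", "add"])
    num_layers = trial.suggest_int("num_layers", 2, 6)
    filters = trial.suggest_categorical("filters", [16, 32, 64])

    Conv = (tf.keras.layers.Conv1D if conv_type == "standard"
            else tf.keras.layers.SeparableConv1D)
    inputs = tf.keras.Input(shape=input_shape)
    x = inputs
    for _ in range(num_layers):
        y = Conv(filters, kernel_size, padding="same", activation="relu")(x)
        if skip_op == "add" and y.shape[-1] == x.shape[-1]:
            y = tf.keras.layers.Add()([x, y])  # residual-style skip when channel counts match
        x = y
    x = tf.keras.layers.GlobalAveragePooling1D()(x)
    outputs = tf.keras.layers.Dense(num_classes, activation="softmax")(x)
    return tf.keras.Model(inputs, outputs)

def objective(trial):
    # Each trial samples one architecture, trains it briefly, and reports validation accuracy.
    model = build_model(trial)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    history = model.fit(x_train, y_train, validation_split=0.2, epochs=3, verbose=0)
    return max(history.history["val_accuracy"])

# TPE is a Bayesian-optimization-style sampler; the paper's exact optimizer may differ.
study = optuna.create_study(direction="maximize", sampler=optuna.samplers.TPESampler())
study.optimize(objective, n_trials=20)
print(study.best_params)

In the paper, each MarNASNet variant (A to D) would correspond to running such a search under a different model-size budget and search-space pattern; the sketch above shows only the mechanics of a single search.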