Academic Article

Multimodal Monitoring of Activities of Daily Living for Elderly Care
Document Type
Periodical
Source
IEEE Sensors Journal, 24(7):11459-11471, Apr. 2024
Subject
Signal Processing and Analysis
Communication, Networking and Broadcast Technologies
Components, Circuits, Devices and Systems
Robotics and Control Systems
Sensors
Monitoring
Wearable devices
Cameras
Older adults
Microphones
Activity recognition
Activity monitoring
elderly care
multimodality
wearable computing
Language
English
ISSN
1530-437X
1558-1748
2379-9153
Abstract
In this article, we present a multimodal approach to monitoring older adults' activities of daily living (ADLs) using a wearable device in combination with a companion robot. A dynamic Bayesian network (DBN) model was developed for activity recognition, fusing multiple data sources: location, object, sound event, body action, and time. The walking action is detected as the transition between consecutive activities, which helps capture the onset of each activity and saves energy on the wearable device. Three tests were conducted to evaluate the proposed approach. First, multiple daily activities were simulated and the approach was evaluated on a public ADL dataset. Second, the approach was tested on an offline dataset collected in our smart home testbed, containing image, sound event, motion, and time data. Third, the approach was tested in real time, and a web-based interface was developed to help caregivers better monitor the ADLs of older adults and provide further assistance. In the offline and real-time tests, the system achieved activity detection ratios of 91% and 93%, respectively, significantly outperforming the baseline periodic sampling methods. In addition, the number of camera and microphone trigger events was reduced from 1537 to 140 and 78, respectively, yielding energy reductions of 36.0% and 37.6% on the wearable device.
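The abstract describes fusing location, object, sound event, body action, and time evidence into an activity estimate. As a rough illustration of that idea (not the authors' DBN implementation), the sketch below combines per-modality likelihoods into a posterior over activities with a naive Bayes update; the activity labels, observation values, and all probability values are made-up placeholders.

```python
# Toy sketch of multimodal evidence fusion for activity recognition.
# All activities, observations, and probabilities are illustrative
# assumptions, not values from the paper.

ACTIVITIES = ["cooking", "watching_tv", "sleeping"]

# Hypothetical likelihood tables P(observation | activity), one per modality.
LIKELIHOODS = {
    "location": {
        "kitchen":     {"cooking": 0.80, "watching_tv": 0.10, "sleeping": 0.05},
        "living_room": {"cooking": 0.10, "watching_tv": 0.80, "sleeping": 0.10},
        "bedroom":     {"cooking": 0.10, "watching_tv": 0.10, "sleeping": 0.85},
    },
    "sound": {
        "sizzling": {"cooking": 0.70, "watching_tv": 0.10, "sleeping": 0.05},
        "speech":   {"cooking": 0.20, "watching_tv": 0.70, "sleeping": 0.10},
        "silence":  {"cooking": 0.10, "watching_tv": 0.20, "sleeping": 0.85},
    },
}

def fuse(observations, prior=None):
    """Combine one observation per modality into a posterior over activities.

    `prior` can carry the previous time step's posterior, which is what gives
    the DBN its temporal dimension; here it defaults to uniform.
    """
    post = dict(prior) if prior else {a: 1.0 / len(ACTIVITIES) for a in ACTIVITIES}
    for modality, value in observations.items():
        table = LIKELIHOODS[modality][value]
        for a in ACTIVITIES:
            post[a] *= table[a]
    z = sum(post.values())  # normalize so the posterior sums to 1
    return {a: p / z for a, p in post.items()}

posterior = fuse({"location": "kitchen", "sound": "sizzling"})
print(max(posterior, key=posterior.get))  # -> cooking
```

In the paper's setting, the posterior from one time step would feed forward as the prior of the next, and the walking-transition detector would decide when to reset it and re-trigger the camera and microphone.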