Academic Paper

Visualizing Hand Force with Wearable Muscle Sensing for Enhanced Mixed Reality Remote Collaboration
Document Type
Periodical
Source
IEEE Transactions on Visualization and Computer Graphics, 29(11):4611-4621, Nov. 2023
Subject
Computing and Processing
Bioengineering
Signal Processing and Analysis
Force
Collaboration
Virtual reality
Task analysis
Force measurement
Mixed reality
Data visualization
Remote collaboration
mixed reality
sensing
visualization
remote assistance
Language
English
ISSN
Print ISSN: 1077-2626
Electronic ISSN: 1941-0506
CD-ROM ISSN: 2160-9306
Abstract
In this paper, we present a prototype system for sharing a user's hand force in mixed reality (MR) remote collaboration on physical tasks, where hand force is estimated using a wearable surface electromyography (sEMG) sensor. In a remote collaboration between a worker and an expert, hand activity plays a crucial role. However, the force exerted by the worker's hand has not been extensively investigated. Our sEMG-based system reliably captures the worker's hand force during physical tasks and conveys this information to the expert through hand force visualization, overlaid on the worker's view or on the worker's avatar. A user study was conducted to evaluate the impact of visualizing a worker's hand force on collaboration, employing three distinct visualization methods across two view modes. Our findings demonstrate that sensing and sharing hand force in MR remote collaboration improves the expert's awareness of the worker's task, significantly enhances the expert's perception of the collaborator's hand force and the weight of the interacting object, and promotes a heightened sense of social presence for the expert. Based on these findings, we provide design implications for future mixed reality remote collaboration systems that incorporate hand force sensing and visualization.
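
The abstract describes estimating hand force from a wearable sEMG sensor and mapping it to a visualization for the remote expert. The paper itself is not reproduced here, so the following is only a minimal illustrative sketch of one common way such an estimate could be derived: rectify the sEMG signal, low-pass filter it into a linear envelope, and normalize against a maximum-voluntary-contraction (MVC) calibration value so it can drive a visual cue such as a color or bar overlay. The function name estimate_force_envelope, the 5 Hz cutoff, and the MVC normalization are assumptions for illustration, not the authors' pipeline.

import numpy as np
from scipy.signal import butter, filtfilt

def estimate_force_envelope(emg, fs, cutoff_hz=5.0, mvc_envelope=1.0):
    """Estimate a normalized hand-force proxy from one raw sEMG channel.

    emg          : 1-D array of raw sEMG samples (arbitrary units)
    fs           : sampling rate in Hz
    cutoff_hz    : low-pass cutoff for the linear envelope
    mvc_envelope : envelope value recorded at maximum voluntary
                   contraction, used for normalization
    """
    # Remove the DC offset, then full-wave rectify.
    rectified = np.abs(emg - np.mean(emg))

    # Low-pass filter the rectified signal to obtain a linear envelope.
    b, a = butter(4, cutoff_hz / (fs / 2.0), btype="low")
    envelope = filtfilt(b, a, rectified)

    # Normalize to [0, 1] against the MVC calibration value so the result
    # can drive a visualization (e.g., overlay color or bar length).
    return np.clip(envelope / mvc_envelope, 0.0, 1.0)

if __name__ == "__main__":
    fs = 1000  # Hz
    t = np.arange(0, 2.0, 1.0 / fs)
    # Synthetic sEMG: noise whose amplitude grows with simulated grip effort.
    emg = np.random.randn(t.size) * (0.2 + 0.8 * t / t[-1])
    force = estimate_force_envelope(emg, fs, mvc_envelope=1.0)
    print(f"peak normalized force: {force.max():.2f}")

In a system like the one described, the resulting normalized value would be streamed to the expert's MR client and rendered over the worker's view or avatar; the streaming and rendering details are specific to the paper and are not shown here.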