Academic Paper

Eyes on the Task: Gaze Analysis of Situated Visualization for Collaborative Tasks
Document Type
Conference
Source
2024 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), pp. 785-795, Mar. 2024
Subject
Computing and Processing
Visualization
Three-dimensional displays
Collaboration
Gaze tracking
User interfaces
User experience
Task analysis
Human-centered computing
Visualization techniques
Visualization design and evaluation methods
Language
English
ISSN
2642-5254
Abstract
The use of augmented reality technology to support humans with situated visualization in complex tasks such as navigation or assembly has gained increasing importance in research and industrial applications. One important line of research concerns supporting and understanding collaborative tasks. Collaboration patterns are usually analyzed through observations and interviews. To complement these methods, we argue that eye tracking can be used to extract further insights and to quantify behavior. To this end, we contribute a study that uses eye tracking to investigate participants' strategies for solving collaborative sorting and assembly tasks. We compare participants' visual attention during situated instructions in AR with traditional paper-based instructions as a baseline. Investigating the participants' performance and gaze behavior reveals different strategies for solving the given tasks. Our results show that with situated visualization, participants focus more on task-relevant areas and require less discussion between collaboration partners to solve the task at hand.