Modelling and Measuring Trust in Human–Robot Collaboration
Document Type
article
Source
Applied Sciences, Vol 14, Iss 5, p 1919 (2024)
Subject
Human–Robot Collaboration (HRC)
trust dimensions
trust dynamics
experimental process
Technology
Engineering (General). Civil engineering (General): TA1-2040
Biology (General): QH301-705.5
Physics: QC1-999
Chemistry: QD1-999
Language
English
ISSN
2076-3417
Abstract
Recognizing trust as a pivotal element for success within Human–Robot Collaboration (HRC) environments, this article examines its nature, exploring the different dimensions of trust, analyzing the factors affecting each of them, and proposing alternatives for trust measurement. To do so, we designed an experimental procedure involving 50 participants interacting with a modified ‘Inspector game’ while we monitored their brain, electrodermal, respiratory, and ocular activities. This procedure allowed us to map the dispositional (static, individual baseline) and learned (dynamic, based on prior interactions) dimensions of trust, considering both demographic and psychophysiological aspects. Our findings challenge traditional assumptions regarding the dispositional dimension of trust and establish clear evidence that the first interactions are critical for the trust-building process and the temporal evolution of trust. By identifying the psychophysiological features most significant for trust detection and underscoring the importance of individualized trust assessment, this research contributes to understanding the nature of trust in HRC. Such insights are crucial for enabling more seamless human–robot interaction in collaborative environments.