Academic Paper

Learning Hand-eye Coordination for a Humanoid Robot using SOMs
Document Type
Conference
Source
2014 9th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 192-193, Mar. 2014
Subject
Robotics and Control Systems
Robot sensing systems
Robot kinematics
Neurons
Self-organizing feature maps
Humanoid robots
Manipulators
cognitive robotics
HRI
sensorimotor coupling
Language
English
Abstract
Hand-eye coordination is an important motor skill acquired in infancy that precedes pointing behavior. Pointing facilitates social interaction by directing the attention of engaged participants, and it is thus essential for the natural flow of human-robot interaction. Here, we attempt to explain how pointing emerges from sensorimotor learning of hand-eye coordination in a humanoid robot. During a body babbling phase with a random walk strategy, the robot learned mappings between joints for different arm postures. The resulting arm joint configurations were used to train biologically inspired models consisting of SOMs. We show that such a model, implemented on a robotic platform, accounts for pointing behavior when humans present objects out of reach of the robot's hand.
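The abstract describes training self-organizing maps (SOMs) on arm joint configurations collected through random-walk body babbling. As a rough illustration of that idea (not the authors' implementation; the map size, learning schedule, and 4-joint arm are assumptions), a minimal SOM trained on synthetic random-walk joint data might look like this:

```python
import numpy as np

def train_som(samples, grid_h=10, grid_w=10, epochs=50,
              lr0=0.5, sigma0=3.0, seed=0):
    """Train a 2-D self-organizing map on joint-angle samples."""
    rng = np.random.default_rng(seed)
    dim = samples.shape[1]
    # Initialize neuron weights uniformly within the data range
    weights = rng.uniform(samples.min(0), samples.max(0),
                          size=(grid_h, grid_w, dim))
    # Grid coordinates, used for neighborhood distances on the map
    gy, gx = np.mgrid[0:grid_h, 0:grid_w]
    n_steps = epochs * len(samples)
    step = 0
    for _ in range(epochs):
        for x in rng.permutation(samples):
            t = step / n_steps
            lr = lr0 * (1 - t)              # decaying learning rate
            sigma = sigma0 * (1 - t) + 0.5  # shrinking neighborhood
            # Best-matching unit: the neuron closest to the input
            d = np.linalg.norm(weights - x, axis=2)
            by, bx = np.unravel_index(d.argmin(), d.shape)
            # Gaussian neighborhood centered on the BMU
            h = np.exp(-((gy - by) ** 2 + (gx - bx) ** 2) / (2 * sigma ** 2))
            # Pull neighboring neurons toward the input sample
            weights += lr * h[..., None] * (x - weights)
            step += 1
    return weights

# Synthetic "body babbling": random-walk trajectories for 4 arm joints
# (joint count and ranges are hypothetical, for illustration only)
rng = np.random.default_rng(1)
steps = rng.normal(0.0, 0.05, size=(500, 4))
joints = np.clip(np.cumsum(steps, axis=0), -1.0, 1.0)
som = train_som(joints)
```

After training, each neuron's weight vector represents a prototypical arm posture, and nearby neurons on the map represent similar postures; looking up the best-matching unit for a target then yields a posture directed toward it.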