Academic Paper

A Bio-Inspired Framework for Joint Angle Estimation from Non-Collocated Sensors in Tendon-driven Systems
Document Type
Conference
Source
2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 7778-7783, Oct. 2020
Subject
Robotics and Control Systems
Training
Estimation
Artificial neural networks
Robot sensing systems
Sensor systems
Task analysis
Tendons
Language
English
ISSN
2153-0866
Abstract
Estimates of limb posture are critical for the control of robotic systems. This is generally accomplished with on-location joint angle encoders, which may complicate the design, increase limb inertia, and add noise to the system. Conversely, some innovative or smaller robotic morphologies can benefit from non-collocated sensors when encoder size becomes prohibitively large or the joints are less accessible or subject to damage (e.g., distal joints of a robotic hand, or foot sensors subject to repeated impact). These concerns are especially important for tendon-driven systems, where motors (and their sensors) are not placed at the joints. Here we create a framework for joint angle estimation in which artificial neural networks (ANNs) use limited experience from motor babbling to predict joint angles. We draw inspiration from Nature, where (i) muscles and tendons have mechanoreceptors, (ii) there are no dedicated joint-angle sensors, and (iii) dedicated neural networks perform sensory fusion. We simulated an inverted pendulum driven by an agonist-antagonist pair of motors that pull on tendons with nonlinear elasticity. We then compared the contributions of different sets of non-collocated sensory information when training ANNs to predict joint angle. By comparing performance across different movement tasks, we determined how well each ANN (trained on a different sensory set of babbling data) generalizes to tasks it has not been exposed to (sinusoidal and point-to-point movements). Lastly, we evaluated performance as a function of the amount of babbling data. We find that ANNs trained on actuator states (i.e., motor positions/velocities/accelerations) together with tendon tension data produce more accurate joint angle estimates than ANNs trained without tendon tension data.
Moreover, we show that ANNs trained on motor positions/velocities and tendon tensions (i.e., the bio-inspired set) (i) can reliably estimate joint angles with as little as 2 minutes of motor babbling and (ii) generalize well across tasks. We demonstrate a novel framework that uses limited experience to provide accurate and efficient joint angle estimation during dynamical tasks from non-collocated actuator and tendon tension measurements. This enables novel designs of versatile and data-efficient robots that do not require on-location joint angle sensors.
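The estimation scheme the abstract describes can be illustrated with a toy sketch: generate "babbling" data pairing non-collocated sensor readings (motor positions/velocities and tendon tensions) with the true joint angle, then fit a small ANN as the estimator. Everything below is an illustrative assumption — the pendulum dynamics, the tendon elasticity, and the network size are stand-ins, not the paper's simulation or architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def babble(n):
    """Toy babbling data: sensor features X and joint angle theta.

    The angle is a hidden nonlinear function of motor positions and
    tendon tensions (illustrative stand-in for the paper's pendulum).
    """
    m = rng.uniform(-1.0, 1.0, (n, 2))        # agonist/antagonist motor positions
    v = np.gradient(m, axis=0)                # crude motor "velocities"
    t = np.exp(0.5 * np.abs(m)) - 1.0         # nonlinear-elastic tendon tensions
    theta = 0.8 * (m[:, 0] - m[:, 1]) + 0.3 * (t[:, 0] - t[:, 1])
    return np.hstack([m, v, t]), theta        # bio-inspired sensor set -> angle

X, y = babble(4000)

# One-hidden-layer ANN (tanh), trained by full-batch gradient descent on MSE.
W1 = rng.normal(0.0, 0.5, (X.shape[1], 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, 16);               b2 = 0.0
lr = 0.05
for _ in range(3000):
    h = np.tanh(X @ W1 + b1)                  # hidden activations
    err = (h @ W2 + b2) - y                   # prediction error
    gW2 = h.T @ err / len(y); gb2 = err.mean()
    dh = np.outer(err, W2) * (1.0 - h**2)     # backprop through tanh
    gW1 = X.T @ dh / len(y); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Evaluate on fresh babbling data the estimator has not seen.
Xt, yt = babble(500)
pred = np.tanh(Xt @ W1 + b1) @ W2 + b2
rmse = float(np.sqrt(np.mean((pred - yt) ** 2)))
print(f"test RMSE: {rmse:.3f}  (baseline std of angle: {np.std(yt):.3f})")
```

In this sketch the estimator never observes the joint angle sensorially at test time; it reconstructs it from actuator and tendon signals alone, which mirrors the non-collocated-sensing idea of the paper at toy scale.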