Journal Article

Keep It Simple: Data-Efficient Learning for Controlling Complex Systems With Simple Models
Document Type
Periodical
Source
IEEE Robotics and Automation Letters, 6(2):1184-1191, Apr. 2021
Subject
Robotics and Control Systems
Computing and Processing
Components, Circuits, Devices and Systems
Data models
Complex systems
Uncertainty
Task analysis
Predictive models
Aerospace electronics
Heuristic algorithms
Machine learning for robot control
motion and path planning
Language
English
ISSN
2377-3766
2377-3774
Abstract
When manipulating a novel object with complex dynamics, a state representation is not always available, for example with deformable objects. Learning both a representation and dynamics from observations requires large amounts of data. We propose Learned Visual Similarity Predictive Control (LVSPC), a novel method for data-efficient learning to control systems with complex dynamics and high-dimensional state spaces from images. LVSPC leverages a given simple model approximation from which image observations can be generated. We use these images to train a perception model that estimates the simple model state from observations of the complex system online. We then use data from the complex system to fit the parameters of the simple model and to learn, also online, where this model is inaccurate. Finally, we use Model Predictive Control and bias the controller away from regions where the simple model is inaccurate and thus where the controller is less reliable. We evaluate LVSPC on two tasks: manipulating a tethered mass and a rope. We find that our method performs comparably to state-of-the-art reinforcement learning methods with an order of magnitude less data. LVSPC also completes the rope manipulation task on a real robot with an 80% success rate after only 10 trials, despite using a perception system trained only on images from simulation.
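
The sketch below illustrates the general idea the abstract describes, a sampling-based model predictive controller that rolls candidate action sequences through a simple model and penalizes trajectories passing through states where that model is believed to be unreliable. It is a minimal illustration under assumed placeholder dynamics; the functions simple_model_step, trust_score, and random_shooting_mpc are hypothetical names and do not reproduce the authors' LVSPC implementation.

    # Minimal sketch: MPC biased away from regions where a simple model is untrusted.
    # All dynamics, scores, and names here are illustrative assumptions, not LVSPC code.
    import numpy as np

    def simple_model_step(state, action):
        # Hypothetical simple dynamics: a 2-D point with velocity control.
        return state + 0.1 * action

    def trust_score(state):
        # Hypothetical learned "model is reliable here" score in [0, 1].
        # In the paper this role is played by a model fit online from data
        # gathered on the complex system.
        return np.exp(-0.5 * np.sum(state ** 2))

    def trajectory_cost(states, goal, distrust_weight=5.0):
        # Task cost (distance to goal) plus a penalty for visiting states
        # where the simple model is not trusted.
        goal_cost = np.sum(np.linalg.norm(states - goal, axis=1))
        distrust = np.sum(1.0 - np.array([trust_score(s) for s in states]))
        return goal_cost + distrust_weight * distrust

    def random_shooting_mpc(state, goal, horizon=10, n_samples=256, rng=None):
        # Sample action sequences, roll them out through the simple model,
        # and return the first action of the lowest-cost sequence.
        rng = rng or np.random.default_rng(0)
        best_cost, best_action = np.inf, None
        for _ in range(n_samples):
            actions = rng.uniform(-1.0, 1.0, size=(horizon, state.shape[0]))
            s, states = state, []
            for a in actions:
                s = simple_model_step(s, a)
                states.append(s)
            cost = trajectory_cost(np.array(states), goal)
            if cost < best_cost:
                best_cost, best_action = cost, actions[0]
        return best_action

    if __name__ == "__main__":
        state, goal = np.zeros(2), np.array([1.0, 1.0])
        for _ in range(20):
            state = simple_model_step(state, random_shooting_mpc(state, goal))
        print("final state:", state)

In this toy version the bias away from unreliable regions enters only as a soft cost term; the weighting and the form of the trust model are design choices assumed here for illustration.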