Journal Article

Partially Observable Markov Decision Processes in Robotics: A Survey
Document Type
Periodical
Source
IEEE Transactions on Robotics, vol. 39, no. 1, pp. 21-40, Feb. 2023
Subject
Robotics and Control Systems
Computing and Processing
Components, Circuits, Devices and Systems
Robots
Robot kinematics
Task analysis
Robot sensing systems
Planning
Markov processes
Uncertainty
AI-based methods
autonomous agents
partially observable Markov decision process (POMDP)
planning under uncertainty
scheduling and coordination
Language
English
ISSN
1552-3098 (Print)
1941-0468 (Electronic)
Abstract
Noisy sensing, imperfect control, and environment changes are defining characteristics of many real-world robot tasks. The partially observable Markov decision process (POMDP) provides a principled mathematical framework for modeling and solving robot decision and control tasks under uncertainty. Over the last decade, it has seen many successful applications, spanning localization and navigation, search and tracking, autonomous driving, multirobot systems, manipulation, and human–robot interaction. This survey aims to bridge the gap between the development of POMDP models and algorithms on one end and their application to diverse robot decision tasks on the other. It analyzes the characteristics of these tasks and connects them with the mathematical and algorithmic properties of the POMDP framework for effective modeling and solution. For practitioners, the survey identifies key task characteristics to consider when deciding whether and how to apply POMDPs to robot tasks successfully. For POMDP algorithm designers, the survey provides new insights into the unique challenges of applying POMDPs to robot systems and points to promising new directions for further research.
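
As background (this formulation is the standard textbook one and is not drawn from the record itself; the symbols below are a common convention, not the paper's notation), a POMDP is typically specified as a tuple $(S, A, O, T, Z, R, \gamma)$ of states, actions, observations, a transition model $T$, an observation model $Z$, a reward function $R$, and a discount factor $\gamma$. The agent maintains a belief $b$, i.e., a probability distribution over states, and updates it after taking action $a$ and receiving observation $o$:

$$
b'(s') \;=\; \eta \, Z(o \mid s', a) \sum_{s \in S} T(s' \mid s, a)\, b(s),
$$

where $\eta$ is a normalizing constant. Planning then seeks a policy over beliefs that maximizes the expected discounted return $\mathbb{E}\!\left[\sum_{t} \gamma^{t} R(s_t, a_t)\right]$.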