TY - BOOK
AU - Zhou, Xuefeng
AU - Wu, Hongmin
AU - Rojas, Juan
AU - Xu, Zhihao
AU - Li, Shuai
ED - SpringerLink (Online service)
TI - Nonparametric Bayesian Learning for Collaborative Robot Multimodal Introspection
SN - 9789811562631
AV - TJ210.2-211.495
U1 - 629.892 23
PY - 2020///
CY - Singapore
PB - Springer Nature Singapore, Imprint: Springer
KW - Robotics
KW - Statistics
KW - Control engineering
KW - Automation
KW - Machine learning
KW - Mathematical models
KW - Robotic Engineering
KW - Bayesian Inference
KW - Control, Robotics, Automation
KW - Machine Learning
KW - Mathematical Modeling and Industrial Mathematics
N1 - Introduction to Robot Introspection -- Nonparametric Bayesian Modeling of Multimodal Time Series -- Incremental Learning Robot Complex Task Representation and Identification -- Nonparametric Bayesian Method for Robot Anomaly Monitoring -- Nonparametric Bayesian Method for Robot Anomaly Diagnose -- Learning Policy for Robot Anomaly Recovery based on Robot Introspection; Open Access
N2 - This open access book focuses on robot introspection, which has a direct impact on physical human–robot interaction and long-term autonomy, and which can benefit from autonomous anomaly monitoring and diagnosis, as well as anomaly recovery strategies. In robotics, a robot's ability to reason about and resolve its own anomalies, and to proactively enrich its own knowledge, is a direct way to improve autonomous behavior. To this end, the authors start by considering the underlying pattern of multimodal observations during robot manipulation, which can effectively be modeled as a parametric hidden Markov model (HMM). They then adopt a nonparametric Bayesian approach, defining a prior on the standard HMM parameters using the hierarchical Dirichlet process (HDP); the result is known as the Hierarchical Dirichlet Process Hidden Markov Model (HDP-HMM). The HDP-HMM can represent an HMM with an unbounded number of possible states, allowing flexibility in the complexity of the learned model and supporting the development of reliable and scalable variational inference methods. This book is a valuable reference resource for researchers and designers in the field of robot learning and multimodal perception, as well as for senior undergraduate and graduate university students.
UR - https://doi.org/10.1007/978-981-15-6263-1
ER -