Project Details
Model predictive motion planning for robot-assisted observation and recording of human activities
Applicant
Professor Dr.-Ing. Torsten Bertram
Subject Area
Automation, Mechatronics, Control Systems, Intelligent Technical Systems, Robotics
Term
since 2022
Project identifier
Deutsche Forschungsgemeinschaft (DFG) - Project number 497071854
Authentic data of human activities are essential in many fields but at the same time difficult and costly to acquire. Such data are used, for example, for documentation purposes in medicine and aerospace, and also in ethology for research on human-robot interaction. In machine learning approaches such as deep learning or learning from observation, the quantity and quality of the data determine the performance of new methods. Robotic observation and recording of human activities is a promising way to economically access rare and exclusive data.

Nowadays, model predictive control is the state of the art for real-time motion planning of robotic manipulators under the aspects of collision avoidance as well as time and path optimality. The research project is therefore dedicated to model predictive motion planning for the seamless recording of human activities, and the associated changes to the environment, by a robot-guided eye-in-hand camera. The goal is to maximize the information gain accumulated over the entire camera motion by recording from changing perspectives in close proximity to the activity without disturbing the human. Despite the successful application of model predictive control in robotics, this objective raises new challenges, both in collision avoidance under the uncertainties of human motion and in the systematic development of task-specific cost functions that evaluate the success of the entire spatio-temporal motion with respect to a higher-level overall goal. The methods will be evaluated both in detail and collectively in the context of an exemplary application in which a robotic manipulator reproduces the observed activity using learning from observation.
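The receding-horizon idea behind model predictive motion planning can be sketched in a minimal, purely illustrative 1-D example: at each step the planner rolls out candidate camera velocities over a short horizon, scores each rollout by accumulated information gain minus a penalty for approaching the human, and applies only the first move before replanning. All functions and parameters here (`info_gain`, `collision_penalty`, `SAFE_DIST`, the toy dynamics) are invented for this sketch and are not the project's actual models.

```python
import math

# Minimal, purely illustrative receding-horizon (model predictive) viewpoint
# planner in 1-D. All models and parameters below are hypothetical.

HORIZON = 5        # prediction horizon (steps)
DT = 0.1           # time step (s)
SAFE_DIST = 0.5    # soft minimum distance to the human (m)
HUMAN_POS = 2.0    # human/activity located at x = 2.0 (m)

def info_gain(x):
    """Toy information gain: peaks at a standoff of 0.8 m from the activity."""
    d = abs(x - HUMAN_POS)
    return math.exp(-(d - 0.8) ** 2)

def collision_penalty(x):
    """Soft penalty that grows steeply once the camera gets too close."""
    d = abs(x - HUMAN_POS)
    return 0.0 if d >= SAFE_DIST else 100.0 * (SAFE_DIST - d)

def rollout_cost(x0, v):
    """Cost of holding velocity v over the horizon: -gain + penalty per step."""
    x, cost = x0, 0.0
    for _ in range(HORIZON):
        x += v * DT
        cost += -info_gain(x) + collision_penalty(x)
    return cost

def mpc_step(x0):
    """Search constant-velocity candidates; apply only the first move."""
    candidates = [round(-1.0 + 0.1 * i, 1) for i in range(21)]  # -1.0 .. 1.0 m/s
    best = min(candidates, key=lambda v: rollout_cost(x0, v))
    return x0 + best * DT

x = 0.0                      # camera starts 2 m away from the activity
for _ in range(40):          # receding horizon: replan at every step
    x = mpc_step(x)
# x settles near the gain-maximizing standoff, outside the safety margin
```

A real implementation would replace the exhaustive velocity search with a nonlinear program over the full joint-space trajectory and couple the gain term to the camera's predicted view of the (uncertain) human motion, which is precisely where the project's research questions lie.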
DFG Programme
Research Grants