Project Details
Robot Learning of Mobile Manipulation for Intelligent Assistance
Applicant
Professor Georgia Chalvatzaki, Ph.D.
Subject Area
Automation, Mechatronics, Control Systems, Intelligent Technical Systems, Robotics
Image and Language Processing, Computer Graphics and Visualisation, Human Computer Interaction, Ubiquitous and Wearable Computing
Term
since 2021
Project identifier
Deutsche Forschungsgemeinschaft (DFG) - Project number 448644653
Robotic household assistants have been a major goal of artificial intelligence and robotics since the fields’ inception. Significant recent progress in perception and manipulation learning for fixed-base robot arms with statically installed cameras indicates that this dream may finally be feasible. However, robot household assistants require intelligent mobile manipulation capabilities to safely enter everyday living environments, interact with humans, and fulfill basic household tasks. Unlike perception or stationary robot manipulation setups, intelligent mobile manipulation cannot rely on brute-force machine learning approaches with large real-world data sets but needs to merge the advantages of classical model-based robotics with modern machine learning methods. Such a hybrid approach is likely to succeed because mobile manipulators are often less constrained in their actions than a stationary robot arm.

The iROSA project therefore aims to investigate a novel robot learning approach to mobile manipulation for intelligent assistance. Three fundamental research problems will be addressed with our idea of augmenting classical robotics with learning methods: (i) robust mobile manipulation, (ii) fluent human-robot interaction, and (iii) adaptive planning for accomplishing household tasks. Each of these core problems will be studied within realistic but minimal testbeds such as fetching & carrying objects, object handover, and combinations thereof.

We expect the following methodological advances: (a) A novel mobile grasp learning methodology based on the hybrid approach for fetch & carry tasks. The resulting methods will extract orientation-invariant features for effective mobile grasping when approaching objects from different viewpoints, while employing an adversarial object generator to remain robust against potential optimization biases. (b) Improved human-robot interaction techniques that enable fluent human-robot object handovers. Learning interaction dynamics models that incorporate human motion intention prediction and the constraints of the human workspace will enable effective interaction. (c) A framework for addressing sequences of the tasks in (a) and (b) by bringing more learning into task and motion planning for mobile manipulators. This framework will acquire dynamic task sequences and adapt the required low-level control policies accordingly.

All three advances will be experimentally evaluated on a physical mobile manipulator robot. The scenario of tidying up a room strewn with toys is addressed by a robot that has to fetch the toys, carry them through the room, and hand them over to a human. The PI has been at the forefront of research on intelligent robotic assistance in a large number of real-world scenarios, ranging from human-robot interaction to robot grasping. Her expertise at the intersection of classical robotics and applicable machine learning, as well as her hands-on experience with the TIAGO++ robot, will make this project possible.
DFG Programme
Independent Junior Research Groups
Major Instrumentation
Mobile Manipulation Robot
Instrumentation Group
2320 Gripping and Lifting Tools, Loading Equipment