Project Details
Online Scene Reconstruction and Understanding
Applicant
Professor Dr. Leif Kobbelt
Subject Area
Image and Language Processing, Computer Graphics and Visualisation, Human Computer Interaction, Ubiquitous and Wearable Computing
Term
from 2018 to 2021
Project identifier
Deutsche Forschungsgemeinschaft (DFG) - Project number 392037563
3D scenes are the result of digitizing real-world environments. In comparison to two-dimensional visual media such as images and videos, 3D scenes carry much richer free-viewpoint information, and they can capture spatial relations between objects even if they cannot be seen from the same vantage point. This makes 3D scene representations useful in a wide range of applications where location- and pose-dependent information needs to be retrieved, e.g. for autonomous vehicles or mobile augmented reality. While there have been considerable advances in 3D measurement technology as well as significant progress in efficient 3D reconstruction algorithms, the precision and quality of 3D scenes captured with today's consumer-level (portable) equipment is still not fully satisfactory, especially in online scenarios where the scene information needs to be continuously updated. Moreover, low-level geometric representations (e.g. point clouds) of an environment are not sufficient in many applications, so segmentation and labeling algorithms are required that are robust against noise, distortion, and incomplete data. Ultimately we want to let agents (humans or robots) interact with their environment, which makes it necessary to analyse and model interaction patterns of the agents with (segmented and labeled) objects in a 3D scene. Our goals are:
- To significantly improve the precision and quality of online 3D reconstructions from streams of multi-sensor raw data by using probabilistic formulations which carefully model all types of uncertainties in the capturing process.
- To perform robust online 3D scene segmentation and labeling by exploiting dynamically changing context information. Here, probabilistic formulations will again be used, complemented by machine learning methods.
- To analyze interaction patterns by developing algorithms for robust hand tracking and gesture classification, and by mining large repositories of 3D scenes and interaction records for data-driven interaction modeling.
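The probabilistic fusion idea behind the first goal can be illustrated with a minimal inverse-variance weighting sketch: each depth observation contributes in proportion to its certainty, so noisier sensors influence the running estimate less. The function name and the simple per-pixel Gaussian model here are illustrative assumptions, not the project's actual formulation.

```python
import numpy as np

def fuse_depth(mu, w, depth_obs, sigma):
    """Fuse a new depth observation into a running per-pixel estimate.

    `mu` is the current mean depth per pixel and `w` the accumulated
    weight (sum of inverse variances). An observation with noise level
    `sigma` is weighted by 1/sigma**2, so less certain measurements
    pull the estimate less.
    """
    w_obs = 1.0 / (sigma ** 2)
    mu_new = (w * mu + w_obs * depth_obs) / (w + w_obs)
    w_new = w + w_obs
    return mu_new, w_new

# Two pixels, initialised from a first observation with sigma = 1.
mu = np.array([2.0, 3.0])   # running depth estimates (metres)
w = np.array([1.0, 1.0])    # accumulated inverse-variance weights

# A second, more precise observation (sigma = 0.5, i.e. weight 4)
mu, w = fuse_depth(mu, w, np.array([2.5, 3.5]), sigma=0.5)
print(mu)  # estimates pulled towards the more precise observation
```

This is essentially the update rule used in weighted volumetric fusion schemes; an online system would apply it incrementally as each new sensor frame arrives.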
DFG Programme
Research Grants
International Connection
China
Partner Organisation
National Natural Science Foundation of China
Cooperation Partner
Professor Shi-Min Hu