Project Details
Gestural interaction paradigms for smart spaces (GrIPSs)
Applicant
Professor Dr. Susanne Boll
Subject Area
Image and Language Processing, Computer Graphics and Visualisation, Human Computer Interaction, Ubiquitous and Wearable Computing
Term
since 2024
Project identifier
Deutsche Forschungsgemeinschaft (DFG) - Project number 548194231
Free-hand gestures provide a powerful and natural way to interact with computers using hand and body movements in a variety of application domains such as gaming, education, smart rooms, and medical applications. Despite extensive research on 3D gesture interaction over the years, much of it has focused on specific devices within specific domains. The absence of a standardized gesture vocabulary that can be applied across diverse devices and domains makes it challenging for interaction designers to determine which gestures to employ for a given application. The GrIPSs project focuses on mid-air gesture interaction in smart environments, investigates a potential universal gesture vocabulary, and addresses the challenges associated with the transferability of gestures across different domains and devices. In the initial phase of the project, we established a consensus gesture set that can be applied across a variety of domains, classified gestures along the dimensions of a taxonomy, evaluated user preferences in smart homes, developed tools for collecting 3D gestures, and collected a large mid-air gesture dataset representing the most common gestures in the literature.

In the extension proposal we will address several further aspects of gestural interaction. First, we will establish a standardized notation system building on Labanotation and GestureML, together with a simple grammar for gesture sequences; this aims to make gesture descriptions reproducible and human-accessible. In addition, we will develop tools for working with our annotated gesture corpus: a gesture notation editor for editing the data corpus, a grammar editor that provides semantics for gesture sequences, and an annotation editor that allows annotation of gestures along different dimensions. These tools, along with the gesture corpus, will be made publicly available so that other researchers can conduct and reproduce our studies.
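To illustrate what a "simple grammar for gesture sequences" might look like, the following is a minimal, hypothetical sketch: the gesture names, their classes, and the sequence rule are invented for illustration and are not the project's actual notation or vocabulary. It maps recognized gestures to classes and validates a sequence against a toy rule (attention, then selection, then one or more manipulations, optionally confirmed).

```python
import re

# Hypothetical mapping from recognized gestures to grammar classes
# (illustrative assumption, not the GrIPSs vocabulary).
GESTURE_CLASSES = {
    "wave": "ATTENTION",
    "point": "SELECT",
    "swipe_left": "MANIPULATE",
    "swipe_right": "MANIPULATE",
    "pinch": "MANIPULATE",
    "thumbs_up": "CONFIRM",
}

# Toy sequence rule: ATTENTION SELECT MANIPULATE+ CONFIRM?
SEQUENCE = re.compile(r"^ATTENTION SELECT( MANIPULATE)+( CONFIRM)?$")


def is_valid_sequence(gestures):
    """Check a list of recognized gestures against the toy grammar."""
    classes = [GESTURE_CLASSES.get(g) for g in gestures]
    if None in classes:  # unknown gesture -> not derivable
        return False
    return bool(SEQUENCE.match(" ".join(classes)))
```

For example, `["wave", "point", "swipe_left", "thumbs_up"]` would be accepted, while `["wave", "pinch"]` (no selection) would be rejected. A real grammar editor would of course operate on a richer formalism than a regular expression; this only sketches the idea of assigning semantics to gesture sequences.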
DFG Programme
Research Grants