Project Details

Gestural Interaction Paradigms for Smart Spaces

Subject Area Image and Language Processing, Computer Graphics and Visualisation, Human Computer Interaction, Ubiquitous and Wearable Computing
Term from 2020 to 2024
Project identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 425868555
 
Freehand gestures have great potential to shape interaction with pervasive computing environments and might, besides touch, constitute the main interaction paradigm of the future. While 3D gestural interaction has been explored for many years, scientific knowledge about the syntactic structure of such gestures and their applicability across different pervasive computing environments remains fragmented and confined to specific domains. This makes it very difficult for interaction designers of pervasive computing environments to decide which gestures should be used in which situations and environments.

To alleviate this problem, this project aims to understand the common characteristics of gestures in pervasive computing environments and to elaborate on the factors behind their “success” across situations and domains. Based on the observation that robust techniques for recognizing gestures on a behavioural level will soon be broadly available (through various sensors, e.g. depth cameras), we aim to develop and evaluate models that cover all syntactic and semantic aspects considered relevant for the successful application of gestures in pervasive computing environments. For this purpose we plan to develop a general vocabulary of gestures and a gesture grammar that generalizes across domains. Both will be based on a large-scale elicitation study in two domains, a smart home and an intelligent retail environment, which will yield a large gesture corpus for further analysis. A novel gesture notation will be provided to help annotate the corpus and to reproduce freehand gestures for further evaluation purposes. The intended gestural model will be based on four dimensions, relating to the social, spatial, physical and cognitive demands associated with freehand gestures. Over the course of the project, corresponding metrics will be developed to allow an informative comparison of gestures; these metrics will also support annotation of the corpus.
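To illustrate what a gesture grammar that generalizes across domains could look like, the following is a minimal sketch in Python. The symbols (POINT, GRAB, SWIPE_LEFT, …) and the production rules are purely illustrative assumptions, not the project's actual vocabulary or grammar:

```python
# Hypothetical sketch: a tiny gesture grammar as nested Python data.
# Nonterminals map to lists of productions; anything not in GRAMMAR
# is treated as a terminal gesture token.
GRAMMAR = {
    "COMMAND": [["SELECT", "ACTION"]],
    "SELECT":  [["POINT"], ["GRAB"]],
    "ACTION":  [["SWIPE_LEFT"], ["SWIPE_RIGHT"], ["RELEASE"]],
}

def derives(symbol, tokens):
    """Return True if `symbol` can derive exactly `tokens`."""
    if symbol not in GRAMMAR:          # terminal gesture token
        return tokens == [symbol]
    return any(matches(p, tokens) for p in GRAMMAR[symbol])

def matches(production, tokens):
    """Try every split of `tokens` across the production's symbols."""
    if not production:
        return not tokens
    head, rest = production[0], production[1:]
    return any(
        derives(head, tokens[:i]) and matches(rest, tokens[i:])
        for i in range(len(tokens) + 1)
    )

print(derives("COMMAND", ["POINT", "SWIPE_LEFT"]))   # → True
print(derives("COMMAND", ["SWIPE_LEFT", "POINT"]))   # → False
```

Such a grammar lets the same gesture sequence be checked for well-formedness in any domain that instantiates the vocabulary, which is the kind of cross-domain generalization the project targets.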
Our findings will be validated empirically in large-scale field studies as well as lab studies. As a further result of the project, we plan to provide a software suite that allows other researchers to run and replicate our studies. During the project we will develop and successively use individual modules of this suite and make our elicited and annotated gesture corpus available. The suite will include a gesture notation editor, used to annotate the corpus; an annotation editor to facilitate annotation of gestures along the four dimensions described above; a grammar editor to assign semantics to gesture sequences; and an evaluation toolkit to support online user studies with a given gesture corpus. The tools will be integrated into a toolchain that will not only be used within the project but also be made publicly available to the community.
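A corpus annotation along the four proposed dimensions could be sketched as a simple record per gesture. The field names, the 0–1 scale, and the unweighted aggregate below are illustrative assumptions, not the project's actual metrics:

```python
# Hypothetical sketch of a corpus annotation record scoring a gesture
# along the four proposed dimensions (all scales and names assumed).
from dataclasses import dataclass

@dataclass
class GestureAnnotation:
    label: str        # gesture name in the (assumed) vocabulary
    social: float     # e.g. social acceptability in public spaces
    spatial: float    # e.g. required interaction volume
    physical: float   # e.g. physical effort / fatigue
    cognitive: float  # e.g. mental load / memorability

    def demand(self) -> float:
        """Unweighted mean demand across the four dimensions."""
        return (self.social + self.spatial + self.physical + self.cognitive) / 4

wave = GestureAnnotation("wave", social=0.2, spatial=0.6, physical=0.5, cognitive=0.1)
pinch = GestureAnnotation("pinch", social=0.1, spatial=0.1, physical=0.2, cognitive=0.3)

# Lower aggregate demand suggests a more broadly applicable gesture.
print(min([wave, pinch], key=GestureAnnotation.demand).label)  # → pinch
```

Per-dimension metrics like these would make gestures in the corpus directly comparable, which is what the annotation editor and evaluation toolkit described above would operate on.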
DFG Programme Priority Programmes
