Project Details

RIME: Rich Interactive Materials for Everyday Objects in the Home

Subject Area Image and Language Processing, Computer Graphics and Visualisation, Human-Computer Interaction, Ubiquitous and Wearable Computing
Term from 2020 to 2024
Project identifier Deutsche Forschungsgemeinschaft (DFG) - Project number 425869111
 
The user experience of the smart home is in a crisis. Smart devices and appliances are rapidly entering the home, leading to a confusing mix of individual user interfaces to learn and remember. Consequently, users make mistakes, become frustrated, and often abandon promising solutions. Attempts to unify these interfaces through voice assistants or mid-air gestures create invisible, hard-to-discover interfaces and awkward social situations, and quickly reach their limits, for example when controlling a continuous value by voice. The smartphone as “universal remote” does not integrate well with natural social practices in the home and, like all touchscreens, limits our hands to tapping and swiping with 1–2 fingers on a featureless glass surface. This does not do justice to the rich interactions our hands allow.

Throughout our evolution, our hands have been how we interact with the inanimate objects and materials around us, while voice and mid-air gestures developed specifically for interacting with other living beings. This may explain the unease users frequently feel when asked to talk or gesture to objects in their smart home today. Over several million years, our hands have evolved to express a rich vocabulary of grasps and tangible manipulation gestures, and to touch, sense, and intimately experience the rich variety of material and object properties that can inform and delight us. The goal of RIME (Rich Interactive Materials for Everyday Objects in the Home) is to unlock this potential and to design and evaluate parametrically scalable, rich touch-based interactions with personal smart spaces at home and beyond.

RIME tackles this challenge through a structured research agenda that combines the principles of fundamental research with the user-centered approach of modern Human-Computer Interaction. At its core lies the idea of equipping everyday objects in the home with interactive surface “skins” and materials that can sense this rich touch input and provide appropriate multimodal feedback. To ground our research, we first elicit current practices involving the touching of everyday home objects in lab and field studies, to understand where to place potential RIME components (WP1). Based on these findings, we classify possible interaction techniques and examine ways to make them discoverable, to disambiguate everyday handling from digital touch input, to map input on a RIME component to its target, and to scale to different users and contexts (WP2). We evaluate ways to create local and ambient feedback (WP3), and we lay the necessary foundations in digital fabrication for scalable, parametric models of stretchable skins and materials that integrate touch, force, and deformation sensors, shape-changing and other haptic actuators, and visual output (WP4). We iteratively validate our results in studies with stakeholders and integrate them into conceptual frameworks and model-based guidelines for future design tools (WP5).
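To make the disambiguation challenge mentioned for WP2 concrete, the following Python sketch shows one simple way a touch-sensing skin could separate incidental everyday handling from deliberate touch input. It is purely illustrative and not part of the project description: the data format, the function and class names (TouchSample, classify_contact), and all thresholds are hypothetical assumptions.

# Hypothetical sketch: heuristic separation of everyday handling from
# intentional touch input on a (fictional) sensor-skin patch.
# All names, fields, and thresholds are illustrative assumptions.

from dataclasses import dataclass
from statistics import pvariance
from typing import List


@dataclass
class TouchSample:
    """One frame of readings from a hypothetical skin sensor patch."""
    timestamp: float      # seconds
    pressure: float       # normalised 0..1
    contact_area: float   # normalised 0..1


def classify_contact(samples: List[TouchSample]) -> str:
    """Rough heuristic: brief or large-area contact looks like everyday
    handling (grabbing, carrying); sustained, steady, small-area contact
    looks like deliberate input such as a press or slide."""
    if not samples:
        return "no_contact"

    duration = samples[-1].timestamp - samples[0].timestamp
    mean_area = sum(s.contact_area for s in samples) / len(samples)
    pressure_var = pvariance([s.pressure for s in samples]) if len(samples) > 1 else 0.0

    # Thresholds are placeholders; a real system would learn them per user and object.
    if duration < 0.15 or mean_area > 0.6:
        return "everyday_handling"
    if pressure_var < 0.01 and duration >= 0.3:
        return "intentional_input"
    return "ambiguous"


if __name__ == "__main__":
    # A steady 0.45-second press with small contact area.
    press = [TouchSample(t * 0.05, 0.4, 0.1) for t in range(10)]
    print(classify_contact(press))  # -> "intentional_input"

In practice such fixed thresholds would be replaced by models calibrated to users, objects, and contexts, which is precisely the kind of question WP2 investigates.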
DFG Programme Priority Programmes
 
 
