Project Details
GRK 2972: CAUSE – Concepts and Algorithms for – and Usage of – Self-Explaining Digitally Controlled Systems
Subject Area
Computer Science
Term
since 2024
Project identifier
Deutsche Forschungsgemeinschaft (DFG) - Project number 513623283
Today, almost every technical system is digitally controlled, which adds to its smartness. Such digitally controlled systems feature many functionally diverse interacting subsystems, system levels, and connections to other systems. Many problems arising during their design and operation can be traced back to a lack of understanding and the consequential mismatches at their interfaces. Avoiding such pitfalls demands mutual explanations of behaviour that are comprehensible to other systems, designers, developers, and operators. Developers cannot be expected to understand all details of all components of a Cyber-Physical System (CPS) and its environment. Software engineers, for example, need comprehensible explanations of the expectations and guarantees of a physical process in the environment that directly or indirectly interfaces with their program. In an ideal model-based design world, everything the engineers would have to know would be encoded in suitable design specifications or contracts. Realistically, however, these are partial and focused on precision, demanding understandable explanations for specific situations as a complement.

The research training group CAUSE addresses these issues by striving to make digitally controlled systems self-explaining to developers, users, and other systems. While the term explanation and its relation to understanding and reasoning have been discussed intensively in many disciplines, we provide a definition that fits our view and needs and will later be related to our formalisms: an explanation is information provided in a timely manner by one (self-explaining) system (the explainer) to another system (the addressee); the explanation empowers the addressee to unveil consequences of decisions, or inconsistencies in knowledge, that would otherwise remain inaccessible; by means of the explanation, the addressee can draw these conclusions without knowing all details of the interaction.

CAUSE focuses on the technical, logical, and algorithmic basis of self-explanation capabilities and their integration into system engineering methodology at all abstraction levels pertinent to digital system design. The research training group is unique in fostering and facilitating the development of complementary theories and methods using a demonstrator that covers all levels, from digital hardware through software and systems to systems of CPSs. It spans three institutions, forming a consortium equally versed in hardware, software, and systems, as well as in theoretical foundations, development methods, and practical applications.

Within CAUSE, doctoral researchers are immersed in state-of-the-art research on novel system design concepts that enable and build upon self-explanation. They develop scientific independence and acquire timely skills: distributed teamwork, working in varying collaborative contexts, adapting to diverse knowledge backgrounds, and intense technical communication.
DFG Programme
Research Training Groups
Applicant Institution
Technische Universität Hamburg
Co-Applicant Institution
Carl von Ossietzky Universität Oldenburg; Universität Bremen
Spokesperson
Professor Dr.-Ing. Görschwin Fey
Participating Researchers
Professor Michael Beetz, Ph.D.; Professor Dr.-Ing. Christian Dietrich; Professor Dr. Rolf Drechsler; Professor Dr. Heiko Falk; Professor Dr. Martin Fränzle; Professor Dr.-Ing. Verena Klös; Professor Dr. Rainer Koschke; Professor Dr. Sebastian Lehnhoff; Professor Dr. Sibylle Schupp; Professor Dr. Heike Wehrheim