Project Details

Probabilistic Model Checking Under Partial Observability With Multiple Objectives

Subject Area: Theoretical Computer Science
Term: since 2023
Project identifier: Deutsche Forschungsgemeinschaft (DFG) - Project number 520530521
 
Probabilistic model checking (PMC) is a branch of computer-aided verification focusing on the automated verification of probabilistic models. Markov decision processes (MDPs) are one such prominent probabilistic model. They have their roots in operations research and stochastic control theory and are frequently used for optimisation problems. PMC checks, for a given MDP M and a property specification f, whether M satisfies f. Under the hood, it uses graph algorithms, value or policy iteration, compact data structures, and so on, in order to achieve a fully automated procedure. Effective abstraction, reduction, and symbolic techniques curb the "curse of dimensionality". Model checkers such as PRISM and Storm can solve (many) MDPs with billions of states in a few minutes. An important advantage is that PMC automatically obtains an optimal policy for f as a by-product of the verification process.

A major assumption in MDPs, though, is that states are fully observable. That is to say, given a finite sequence of states and actions taken so far (the "history"), the current state of the MDP is uniquely determined. In many realistic scenarios, e.g., a robot with sensors that cannot cover the entire environment or an attacker for which the system is a gray box, this perfect-information assumption is too severe. The aim of this project is to investigate the automated verification of probabilistic models under partial observability. We do so by gradually considering models of increasing difficulty: we first consider MDPs in which part of the state is observable and part is not, so-called mixed observable MDPs (MOMDPs). We then consider partially observable MDPs (POMDPs), in which each state is partially observable. POMDPs are a prominent model in planning in AI. Finally, we consider partially observable stochastic games (POSGs), a generalisation of POMDPs with an extra player.

Our primary focus is on specifications that consist of multiple objectives, e.g., target states should be reached with high probability while dedicated "bad" states are certainly avoided. As preparatory investigations, we will start off by considering some open questions concerning multiple objectives on MDPs alone. Depending on the model at hand, we will consider various types of multiple objectives: mixtures of stochastic and non-stochastic (e.g. always/exist) objectives, multiple objectives under a lexicographic ordering, multiple total-reward objectives, objectives on fully as well as partially observable states, and the like. Our investigations will focus on decidability and complexity as well as on developing (approximate) algorithms, implementing those algorithms on top of the Storm model checker, and conducting experimental evaluations.
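
To illustrate the kind of computation that underlies PMC for MDPs, the following minimal Python sketch performs value iteration for the maximum probability of eventually reaching a set of target states (a single reachability objective, i.e., the simplest building block of the multi-objective specifications discussed above). The toy MDP, its state and action names, and the stopping tolerance are made-up assumptions used only for illustration; they are not part of the project or of any particular tool.

    # Illustrative sketch: value iteration for Pr^max(eventually target) in a
    # small MDP. States are 0..3; each action maps to a distribution over
    # successor states, given as (successor, probability) pairs.
    # State 3 is the target and is absorbing. (Toy example, assumed values.)
    TOY_MDP = {
        0: {"a": [(1, 0.5), (2, 0.5)], "b": [(0, 1.0)]},
        1: {"a": [(3, 0.9), (0, 0.1)]},
        2: {"a": [(3, 0.4), (2, 0.6)]},
        3: {"a": [(3, 1.0)]},
    }
    TARGET = {3}

    def max_reach_prob(mdp, target, eps=1e-8):
        """Approximate Pr^max(eventually target) per state by value iteration."""
        values = {s: (1.0 if s in target else 0.0) for s in mdp}
        while True:
            delta = 0.0
            for s in mdp:
                if s in target:
                    continue
                # Bellman update: maximise over actions the expected value
                # of the successor states.
                best = max(
                    sum(p * values[t] for t, p in succ)
                    for succ in mdp[s].values()
                )
                delta = max(delta, abs(best - values[s]))
                values[s] = best
            if delta < eps:  # stop once no state changes by more than eps
                return values

    if __name__ == "__main__":
        print(max_reach_prob(TOY_MDP, TARGET))

Starting from value 0 in all non-target states, the iteration converges towards the least fixed point of the Bellman operator, which equals the maximum reachability probability; production model checkers such as PRISM and Storm combine such numerical iteration with graph-based precomputation and symbolic data structures.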
DFG Programme: Research Grants
 
 
