Department of Information Technology

Stochastic Infinite State Systems

Current members: Parosh Abdulla, Noomene Ben Henda, Sven Sandberg

Stochastic infinite-state systems are natural models for systems that exhibit non-deterministic or randomized behavior. Model checking such systems is challenging: they are infinite-state, so model checking is undecidable in general, and they are stochastic, which gives rise to a wider class of problems. In our work, the semantics of infinite-state stochastic systems is given by 2-1/2 player simple stochastic games. A 2-1/2 player simple stochastic game is an infinite-state transition system whose set of states is partitioned into player 0 states, player 1 states, and random states. Transitions leaving player 0 or player 1 states are non-deterministic, while those leaving random states are subject to predefined probability distributions. The class of 1-1/2 player games (where the set of player 1 states is empty) is exactly that of Markov decision processes, while the class of 1/2 player games (no player 0 or player 1 states) coincides with the class of Markov chains.
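
This hierarchy can be made concrete with a small data-structure sketch. The Python fragment below is purely illustrative (the representation, field names and helper methods are our own choices, not part of any tool): it records which player owns each state and lets one recognize the 1-1/2 and 1/2 player special cases.

    # Illustrative sketch of a 2-1/2 player simple stochastic game: each state
    # is owned by player 0, player 1, or 'R' (random).  Successors of player
    # states are non-deterministic choices; successors of random states carry
    # a probability distribution.
    from dataclasses import dataclass, field

    @dataclass
    class Game:
        owner: dict = field(default_factory=dict)  # state -> 0, 1, or 'R'
        succ: dict = field(default_factory=dict)   # player state -> [states];
                                                   # random state -> [(state, prob)]

        def is_mdp(self):
            # 1-1/2 player game: no player 1 states (a Markov decision process)
            return all(o != 1 for o in self.owner.values())

        def is_markov_chain(self):
            # 1/2 player game: only random states (a Markov chain)
            return all(o == 'R' for o in self.owner.values())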

Contributions

Model checking infinite-state stochastic systems gives rise to two classes of problems:

  • Qualitative Analysis: These are problems where the question is whether a property holds with probability one (almost-surely) or probability zero (almost-never).

We identify the class of decisive Markov chains, for which several qualitative analysis problems are decidable. We show that lossy channel systems (LCS), probabilistic Petri nets (PVASS) and noisy Turing machines (NTM) all induce decisive Markov chains. For LCS and NTM, we prove that qualitative analysis of simple and repeated reachability is decidable, while for PVASS we show that almost-never simple reachability is undecidable and prove the decidability of the almost-sure simple and repeated reachability problems.
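
The essence of the qualitative check can be sketched on a finite chain, where decisiveness holds trivially: the target set F is reached almost surely iff no state from which F is unreachable can itself be reached from the initial state. The Python sketch below only illustrates this idea; the published algorithms work symbolically on the infinite state spaces of LCS, PVASS and NTM, and the dictionary encoding and function names here are our own assumptions.

    # Finite-state sketch of the almost-sure reachability check: compute the
    # states that cannot reach the target F at all, and verify that none of
    # them is reachable from the initial state.
    def reachable(succ, sources):
        # states reachable (with positive probability) from `sources`
        seen, stack = set(sources), list(sources)
        while stack:
            s = stack.pop()
            for t in succ.get(s, []):
                if t not in seen:
                    seen.add(t)
                    stack.append(t)
        return seen

    def almost_sure_reachability(succ, init, F):
        # succ maps every state to the list of its possible successors
        cannot_reach_F = {s for s in succ if not (reachable(succ, {s}) & set(F))}
        # P(reach F) = 1 from init iff no such state is reachable from init
        return not (reachable(succ, {init}) & cannot_reach_F)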

In a recent work, we introduce the model of game lossy channel systems (GLCS), where the control states are partitioned among two players. We show that the 2-1/2 player simple stochastic games induced by GLCS, where the winning condition is almost-sure simple or repeated reachability, are determined and solvable with memoryless strategies.

  • Quantitative Analysis: These are problems where the task is to compute, or approximate up to a given error threshold, quantities such as probabilities, expectations, gains, and steady-state distributions.

We prove that for Markov chains induced by LCS, PVASS and NTM, one cannot express the relevant probabilities in the first-order theory of the real numbers, as one can for pushdown automata. Nevertheless, we propose algorithms that approximate simple and repeated reachability probabilities up to an arbitrary error. The algorithms are based on path exploration schemes, and we show that their termination follows from the decisiveness of the induced Markov chains.
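
The path exploration scheme itself is simple to sketch. The fragment below unfolds the chain from the initial state, accumulates the probability mass of paths that have reached F and of paths that can no longer reach F, and stops once the undecided mass is below the error bound; decisiveness is what guarantees termination on the models above. The encoding (successor dictionaries, a can_reach_F oracle) is an assumption made for this illustration.

    from collections import deque

    # Approximate P(reach F) within eps by unfolding the computation tree.
    # prob_succ[s] is a list of (successor, probability) pairs;
    # can_reach_F(s) decides whether F is reachable from s (assumed given).
    def approximate_reachability(prob_succ, can_reach_F, init, F, eps):
        yes, no = 0.0, 0.0            # decided probability mass
        frontier = deque([(init, 1.0)])
        while frontier and yes + no < 1.0 - eps:
            s, p = frontier.popleft()
            if s in F:
                yes += p              # the path has reached the target
            elif not can_reach_F(s):
                no += p               # the path can never reach the target
            else:
                for t, q in prob_succ[s]:
                    frontier.append((t, p * q))
        # the exact probability lies in the interval [yes, 1 - no]
        return yes, 1.0 - no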

In another work, we identify the class of eager Markov chains, for which we can approximate expectations of random variables. We show that LCS, PVASS and NTM induce such Markov chains. A subclass of eager Markov chains, namely those with an eager attractor, allows further analysis: for this class, we propose algorithms to approximate gains and steady-state distributions.
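
To illustrate how eagerness can be used, the sketch below approximates one particular expectation, the expected hitting time of F, by truncating the unfolding at a horizon whose neglected tail is bounded using an assumed eagerness bound P(F is first reached at step n) <= a * b^n with 0 < b < 1. The constants a and b, and the finite dictionary encoding, are assumptions of this illustration and not part of the published algorithms.

    # Approximate the expected hitting time of F (contribution of paths that do
    # reach F), using the assumed eagerness bound to cut off the tail.
    def approximate_expected_hitting_time(prob_succ, init, F, a, b, eps):
        def tail(N):
            # upper bound on  sum_{n > N} n * a * b**n
            return a * b ** (N + 1) * ((N + 1) - N * b) / (1.0 - b) ** 2

        N = 0
        while tail(N) > eps:          # pick a horizon with tail below eps
            N += 1

        expectation = 0.0
        # mass on states that have not hit F yet (hitting time 0 if init in F)
        dist = {} if init in F else {init: 1.0}
        for n in range(1, N + 1):
            new_dist = {}
            for s, p in dist.items():
                for t, q in prob_succ[s]:
                    if t in F:
                        expectation += n * p * q   # first hit at step n
                    else:
                        new_dist[t] = new_dist.get(t, 0.0) + p * q
            dist = new_dist
        # the exact value lies in [expectation, expectation + tail(N)]
        return expectation, expectation + tail(N)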

Publications

You can find here a list of our published work.

Updated  2007-04-24 01:29:47 by Noomene Ben Henda.