16 December 2016

Abstract:
Dynamical behavior can be seen in many real-life phenomena, typically as a dependence over time. This thesis studies and develops methods and probabilistic models for statistical learning of such dynamical phenomena.
A probabilistic model is a mathematical model expressed using probability theory. Statistical learning amounts to constructing such models, as well as adjusting them to data recorded from real-life phenomena. The resulting models can be used for, e.g., drawing conclusions about the phenomena under study and making predictions.
The methods in this thesis are primarily based on the particle filter and its generalizations, sequential Monte Carlo (SMC) and particle Markov chain Monte Carlo (PMCMC). The model classes considered are nonlinear state-space models and Gaussian processes.
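As an illustration of the particle-filter methodology underlying these contributions, the following is a minimal sketch of a bootstrap particle filter applied to a standard nonlinear state-space benchmark model. The model and all parameter values are chosen here for illustration only and are not taken from the thesis.

```python
import numpy as np

# Illustrative nonlinear state-space model (a common benchmark, assumed here):
#   x_t = 0.5*x_{t-1} + 25*x_{t-1}/(1 + x_{t-1}^2) + 8*cos(1.2*t) + v_t
#   y_t = x_t^2 / 20 + e_t,   v_t ~ N(0, 10), e_t ~ N(0, 1)
rng = np.random.default_rng(0)

def simulate(T):
    """Simulate one trajectory of states x and observations y."""
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = (0.5 * x[t - 1] + 25 * x[t - 1] / (1 + x[t - 1] ** 2)
                + 8 * np.cos(1.2 * t) + rng.normal(0, np.sqrt(10)))
    y = x ** 2 / 20 + rng.normal(0, 1, T)
    return x, y

def bootstrap_pf(y, N=500):
    """Bootstrap particle filter: propagate with the prior dynamics,
    weight by the observation likelihood, then resample."""
    T = len(y)
    particles = rng.normal(0, 1, N)  # initial particle cloud
    means = np.zeros(T)
    for t in range(T):
        if t > 0:
            # Propagate each particle through the state dynamics
            particles = (0.5 * particles
                         + 25 * particles / (1 + particles ** 2)
                         + 8 * np.cos(1.2 * t)
                         + rng.normal(0, np.sqrt(10), N))
        # Importance weights from the Gaussian observation likelihood
        logw = -0.5 * (y[t] - particles ** 2 / 20) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means[t] = np.sum(w * particles)  # filtered mean estimate
        # Multinomial resampling to combat weight degeneracy
        particles = particles[rng.choice(N, size=N, p=w)]
    return means

x, y = simulate(100)
xhat = bootstrap_pf(y)
print("Filtered mean computed for", len(xhat), "time steps")
```

SMC generalizes this recursion beyond filtering, and PMCMC embeds such a filter inside a Markov chain Monte Carlo sampler to learn unknown model parameters.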
The following contributions are included. Starting from a Gaussian-process state-space model, a general, flexible and computationally feasible nonlinear state-space model is derived in Paper I. In Paper II, two alternative state-of-the-art methods, SMC and PMCMC, are benchmarked against each other. Paper III considers PMCMC for solving the state-space smoothing problem, in particular for an indoor positioning application. In Paper IV, SMC is used for marginalizing the hyperparameters in the Gaussian-process state-space model, and Paper V is concerned with learning jump Markov linear state-space models. In addition, the thesis contains an introductory overview covering statistical inference, state-space models, Gaussian processes and some advanced Monte Carlo methods, as well as two appendices summarizing useful technical results.