Particle Filters and Markov Chains for Learning of Dynamical Systems (2013)
Abstract (truncated to 115 words)
Sequential Monte Carlo (SMC) and Markov chain Monte Carlo (MCMC) methods provide computational tools for systematic inference and learning in complex dynamical systems, such as nonlinear and non-Gaussian state-space models. This thesis builds upon several methodological advances within these classes of Monte Carlo methods. Particular emphasis is placed on the combination of SMC and MCMC in so-called particle MCMC algorithms. These algorithms rely on SMC for generating samples from the often highly autocorrelated state trajectory. A specific particle MCMC algorithm, referred to as particle Gibbs with ancestor sampling (PGAS), is suggested. By making use of backward-sampling ideas, albeit implemented in a forward-only fashion, PGAS enjoys good mixing even when using seemingly few particles in ...
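To make the core idea concrete, below is a minimal sketch of one conditional particle filter sweep with ancestor sampling, the building block of PGAS described in the abstract. It is not the thesis's implementation: the toy linear-Gaussian model, the parameter values, and the helper name pgas_sweep are illustrative assumptions chosen only to show how the reference trajectory is kept alive while its ancestor is resampled forward in time.

```python
import numpy as np

# Toy linear-Gaussian state-space model (illustrative assumption, not from the thesis):
#   x_t = 0.8 * x_{t-1} + v_t,  v_t ~ N(0, 1)
#   y_t = x_t + e_t,            e_t ~ N(0, 0.5)
A, Q, R = 0.8, 1.0, 0.5

def log_f(x_t, x_prev):   # transition log-density f(x_t | x_{t-1}), up to a constant
    return -0.5 * (x_t - A * x_prev) ** 2 / Q

def log_g(y_t, x_t):      # observation log-density g(y_t | x_t), up to a constant
    return -0.5 * (y_t - x_t) ** 2 / R

def pgas_sweep(y, x_ref, N, rng):
    """One conditional particle filter sweep with ancestor sampling.

    Returns a new trajectory drawn (approximately) from p(x_{1:T} | y_{1:T}),
    conditioned on the reference trajectory x_ref."""
    T = len(y)
    x = np.zeros((T, N))
    anc = np.zeros((T, N), dtype=int)

    # t = 0: initialise N-1 free particles; particle N is the reference
    x[0, :N-1] = rng.normal(0.0, np.sqrt(Q), N - 1)
    x[0, N-1] = x_ref[0]
    logw = log_g(y[0], x[0])

    for t in range(1, T):
        w = np.exp(logw - logw.max()); w /= w.sum()
        # Resample ancestors for the N-1 free particles and propagate them (bootstrap proposal)
        anc[t, :N-1] = rng.choice(N, size=N - 1, p=w)
        x[t, :N-1] = A * x[t-1, anc[t, :N-1]] + rng.normal(0.0, np.sqrt(Q), N - 1)
        # Ancestor sampling for the reference particle: a backward-sampling idea
        # executed forward in time (the key PGAS step)
        log_as = np.log(w) + log_f(x_ref[t], x[t-1])
        p_as = np.exp(log_as - log_as.max()); p_as /= p_as.sum()
        anc[t, N-1] = rng.choice(N, p=p_as)
        x[t, N-1] = x_ref[t]
        logw = log_g(y[t], x[t])

    # Draw one trajectory by sampling the final index and tracing ancestors backwards
    w = np.exp(logw - logw.max()); w /= w.sum()
    k = rng.choice(N, p=w)
    traj = np.zeros(T)
    for t in reversed(range(T)):
        traj[t] = x[t, k]
        if t > 0:
            k = anc[t, k]
    return traj

# Usage sketch: simulate data, then iterate PGAS sweeps with few particles (here N = 5)
rng = np.random.default_rng(0)
T = 50
x_true = np.zeros(T); y = np.zeros(T)
for t in range(T):
    x_true[t] = A * (x_true[t-1] if t > 0 else 0.0) + rng.normal(0, np.sqrt(Q))
    y[t] = x_true[t] + rng.normal(0, np.sqrt(R))

x_ref = np.zeros(T)   # arbitrary initial reference trajectory
for _ in range(100):
    x_ref = pgas_sweep(y, x_ref, N=5, rng=rng)
```

In a full learning setting, each such sweep would be alternated with a draw of the model parameters given the sampled trajectory; the ancestor-sampling step is what lets the sampler mix well with a small number of particles, as the abstract indicates.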
Keywords: Bayesian learning – system identification – sequential Monte Carlo – Markov chain Monte Carlo – particle MCMC – particle filters – particle smoothers
Information
- Author: Lindsten, Fredrik
- Institution: Linköping University
- Supervisors:
- Publication Year: 2013
- Upload Date: Oct. 31, 2013