
Parameter elimination in particle Gibbs sampling

A. Wigren, R.S. Risuleo, L.M. Murray and F. Lindsten

Bayesian inference in state-space models is challenging due to high-dimensional state trajectories. A viable approach is particle Markov chain Monte Carlo, combining MCMC and sequential Monte Carlo to form "exact approximations" to otherwise intractable MCMC methods. The performance of the approximation is limited to that of the exact method. We focus on particle Gibbs and particle Gibbs with ancestor sampling, improving their performance beyond that of the underlying Gibbs sampler (which they approximate) by marginalizing out one or more parameters. This is possible when the parameter prior is conjugate to the complete data likelihood. Marginalization yields a non-Markovian model for inference, but we show that, in contrast to the general case, this method still scales linearly in time. While marginalization can be cumbersome to implement, recent advances in probabilistic programming have enabled its automation. We demonstrate how the marginalized methods are viable as efficient inference backends in probabilistic programming, with examples in ecology and epidemiology.
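
To sketch the idea behind the marginalization (the notation here is generic, not taken verbatim from the paper): for a state-space model with states x_{1:T}, observations y_{1:T} and parameter \theta, the parameter is integrated out of the complete-data likelihood,

p(x_{1:T}, y_{1:T}) = \int p(\theta) \, p(x_1 \mid \theta) \prod_{t=2}^{T} p(x_t \mid x_{t-1}, \theta) \prod_{t=1}^{T} p(y_t \mid x_t, \theta) \, d\theta,

which is available in closed form when the prior p(\theta) is conjugate to the complete-data likelihood. The marginalized model is non-Markovian, since each x_t now depends on the full history of states and observations, but under conjugacy that history enters only through a fixed-dimensional sufficient statistic that can be updated recursively, which is why the cost of a particle Gibbs sweep remains linear in T.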

A. Wigren, R.S. Risuleo, L.M. Murray and F. Lindsten (2019). Parameter elimination in particle Gibbs sampling. Advances in Neural Information Processing Systems.

A. Wigren, R.S. Risuleo, L.M. Murray and F. Lindsten (2019). <a href="https://indii.org/research/parameter-elimination-in-particle-gibbs-sampling/">Parameter elimination in particle Gibbs sampling</a>. <em>Advances in Neural Information Processing Systems</em>. 

@Article{Wigren2019,
  title = {Parameter elimination in particle Gibbs sampling},
  author = {Anna Wigren and Riccardo Sven Risuleo and Lawrence M. Murray and Fredrik Lindsten},
  journal = {Advances in Neural Information Processing Systems},
  year = {2019},
  url = {https://arxiv.org/abs/1910.14145}
}