MALA-within-Gibbs samplers for high-dimensional distributions with sparse conditional structure
Format: Journal Article
Language: English
Published: 25-08-2019
Summary: Markov chain Monte Carlo (MCMC) samplers are numerical methods for drawing samples from a given target probability distribution. We discuss one particular MCMC sampler, the MALA-within-Gibbs sampler, from the theoretical and practical perspectives. We first show that the acceptance ratio and step size of this sampler are independent of the overall problem dimension when (i) the target distribution has sparse conditional structure, and (ii) this structure is reflected in the partial updating strategy of MALA-within-Gibbs. If, in addition, the target density is block-wise log-concave, then the sampler's convergence rate is independent of dimension. From a practical perspective, we expect that MALA-within-Gibbs is useful for solving high-dimensional Bayesian inference problems whose posterior exhibits sparse conditional structure, at least approximately. In this context, a partitioning of the state that correctly reflects the sparse conditional structure must be found, and we illustrate this process in two numerical examples. We also discuss trade-offs between the block size used for partial updating and the computational requirements, which may increase with the number of blocks.
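The partial-updating scheme the summary describes can be sketched as follows. This is a generic illustration, not the authors' implementation: the tridiagonal-precision Gaussian target (a chain, so each coordinate conditionally depends only on its neighbours), the block size of 2, and the step size `eps` are all assumptions chosen for demonstration. Each sweep applies one MALA proposal per block, using the gradient of the joint log-density restricted to that block, which equals the gradient of the block's full conditional.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target: zero-mean Gaussian with a tridiagonal (chain-structured)
# precision matrix Q, giving sparse conditional structure.
d = 8
Q = 2.0 * np.eye(d)
for i in range(d - 1):
    Q[i, i + 1] = Q[i + 1, i] = -0.9  # positive definite for |off-diag| < 1

def log_density(x):
    return -0.5 * x @ Q @ x

def grad_log_density(x):
    return -Q @ x

def mala_within_gibbs(x0, blocks, eps, n_sweeps):
    """One MALA step per block per sweep (partial updating)."""
    x = x0.copy()
    accepts = 0
    for _ in range(n_sweeps):
        for idx in blocks:
            g = grad_log_density(x)[idx]
            prop = x.copy()
            prop[idx] = x[idx] + 0.5 * eps**2 * g + eps * rng.standard_normal(len(idx))
            g_prop = grad_log_density(prop)[idx]
            # Log proposal densities for the block move (up to a common constant).
            log_q_fwd = -np.sum((prop[idx] - x[idx] - 0.5 * eps**2 * g) ** 2) / (2 * eps**2)
            log_q_bwd = -np.sum((x[idx] - prop[idx] - 0.5 * eps**2 * g_prop) ** 2) / (2 * eps**2)
            log_alpha = log_density(prop) - log_density(x) + log_q_bwd - log_q_fwd
            if np.log(rng.uniform()) < log_alpha:
                x = prop
                accepts += 1
    return x, accepts / (n_sweeps * len(blocks))

# Partition the state into blocks of size 2, matching the chain structure.
blocks = [np.arange(i, i + 2) for i in range(0, d, 2)]
x, acc_rate = mala_within_gibbs(np.zeros(d), blocks, eps=0.8, n_sweeps=500)
```

Because the proposal for a block only needs the gradient entries in that block, sparse conditional structure keeps the per-block cost and a well-tuned `eps` independent of the overall dimension, which is the dimension-independence claim in the summary.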
DOI: 10.48550/arxiv.1908.09429