Gibbs Sampling for a Bayesian Hierarchical General Linear Model
We consider a Bayesian hierarchical version of the normal theory general linear model. This model is practically relevant: it is general enough to have many applications, yet it is not straightforward to sample directly from the corresponding posterior distribution. Thus we study a block Gibbs sampler that has the posterior as its invariant distribution. In particular, we establish that the Gibbs sampler converges at a geometric rate. This allows us to establish conditions for a central limit theorem for the ergodic averages used to estimate features of the posterior. Geometric ergodicity is also a key component for using batch means methods to consistently estimate the variance of the asymptotic normal distribution. Together, our results give practitioners the tools to be as confident in inferences based on the observations from the Gibbs sampler as they would be with inferences based on random samples from the posterior. Our theoretical results are illustrated with an application to data on the cost of health plans issued by health maintenance organizations.
💡 Research Summary
The paper tackles the problem of drawing inference from a Bayesian hierarchical version of the normal‑theory general linear model (GLM). While the hierarchical formulation provides a flexible framework for incorporating prior uncertainty on regression coefficients, error variance, and hyper‑parameters, the resulting posterior distribution is high‑dimensional and does not belong to a standard family, making direct sampling infeasible. To address this, the authors construct a block Gibbs sampler that cycles through the full conditional distributions of the model parameters: the regression vector β, the error variance σ², and the hyper‑variance τ² (and optionally additional hyper‑parameters). Each conditional distribution is either multivariate normal (for β) or inverse‑gamma (for σ² and τ²), allowing exact draws without Metropolis‑Hastings proposals.
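The cycle of exact conditional draws described above can be sketched as follows. This is a minimal illustration for a simplified conjugate hierarchy — y | β, σ² ~ N(Xβ, σ²I), β | τ² ~ N(0, τ²I), with inverse-gamma priors IG(a, b) and IG(c, d) on σ² and τ² — not the paper's exact model; all function and variable names are illustrative.

```python
import numpy as np

def block_gibbs(y, X, a=2.0, b=1.0, c=2.0, d=1.0, n_iter=5000, seed=0):
    """Block Gibbs sampler for a simplified conjugate hierarchy:
    y | beta, sig2 ~ N(X beta, sig2 I), beta | tau2 ~ N(0, tau2 I),
    sig2 ~ IG(a, b), tau2 ~ IG(c, d).  Illustrative only; the paper's
    model allows a more general prior structure."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    XtX, Xty = X.T @ X, X.T @ y
    sig2, tau2 = 1.0, 1.0
    draws = np.empty((n_iter, p + 2))
    for t in range(n_iter):
        # beta | sig2, tau2, y : multivariate normal (exact draw)
        cov = np.linalg.inv(XtX / sig2 + np.eye(p) / tau2)
        beta = rng.multivariate_normal(cov @ (Xty / sig2), cov)
        # sig2 | beta, y : inverse-gamma (draw gamma, invert)
        resid = y - X @ beta
        sig2 = 1.0 / rng.gamma(a + n / 2, 1.0 / (b + resid @ resid / 2))
        # tau2 | beta : inverse-gamma
        tau2 = 1.0 / rng.gamma(c + p / 2, 1.0 / (d + beta @ beta / 2))
        draws[t] = np.r_[beta, sig2, tau2]
    return draws
```

Because every conditional is a standard distribution, each sweep consists of exact draws, with no Metropolis-Hastings accept/reject step and hence no tuning.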
The central theoretical contribution is a rigorous proof that this Gibbs sampler is geometrically ergodic. By establishing a drift condition with a suitable Lyapunov function V(θ)=1+‖β‖²+σ⁻²+τ⁻² and a minorization condition on a compact set where V is bounded, the authors show that the Markov chain contracts toward its stationary distribution at a geometric rate. This property is crucial because it guarantees that the chain forgets its starting point exponentially fast, which in turn validates the use of asymptotic results for ergodic averages.
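In generic notation, a drift-and-minorization argument of this kind has the following shape (the paper's exact constants and small set differ; this is the standard template, not a transcription of their proof):

```latex
% Drift condition: the one-step expected Lyapunov function contracts
\mathbb{E}\left[ V(\theta_{n+1}) \mid \theta_n = \theta \right]
  \le \lambda\, V(\theta) + L,
  \qquad 0 \le \lambda < 1,\; L < \infty .

% Minorization condition on the small set C = \{\theta : V(\theta) \le d\}:
P(\theta, \cdot) \ge \varepsilon\, Q(\cdot)
  \quad \text{for all } \theta \in C,
  \qquad \varepsilon > 0 .

% Together these yield geometric ergodicity: for some M(\theta) < \infty
% and \rho < 1,
\left\| P^{n}(\theta, \cdot) - \pi \right\|_{\mathrm{TV}}
  \le M(\theta)\, \rho^{\,n} .
```

The drift condition controls excursions of the chain toward the "bad" parts of the state space (small variances, large ‖β‖), while the minorization condition forces regeneration-like behavior on the compact set where V is bounded.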
Leveraging geometric ergodicity, the paper derives a central limit theorem (CLT) for ergodic averages of any suitably square‑integrable functional g(θ). Specifically, √n( ḡ_n − E_π[g] ) converges in distribution to N(0, σ_g²), where the asymptotic variance σ_g² accounts for the serial dependence in the chain. Geometric ergodicity is then also the key ingredient that makes batch means estimators of σ_g² consistent, so practitioners can attach valid Monte Carlo standard errors to posterior estimates computed from the sampler output.
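A minimal non-overlapping batch means estimator of the Monte Carlo standard error can be sketched as follows; the batch-size choice and remainder handling here are illustrative simplifications.

```python
import numpy as np

def batch_means_se(x, n_batches=30):
    """Batch means estimate of the Monte Carlo standard error of the
    ergodic average of a 1-D chain x.  Uses non-overlapping batches of
    equal size, discarding any remainder (an illustrative choice)."""
    x = np.asarray(x, dtype=float)
    m = len(x) // n_batches            # batch size
    x = x[: m * n_batches]
    means = x.reshape(n_batches, m).mean(axis=1)
    # batch size times the sample variance of the batch means
    # estimates the asymptotic variance sigma_g^2 in the CLT
    sig2_hat = m * means.var(ddof=1)
    return np.sqrt(sig2_hat / len(x))
```

Consistency of this estimator as the chain length grows is exactly what geometric ergodicity (plus a moment condition) buys: without it, the reported standard error could be meaningless even for an arbitrarily long run.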