Conference Proceedings
Approximating message lengths of hierarchical Bayesian models using posterior sampling
DF Schmidt, E Makalic, JL Hopper
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) | Published: 2016
Abstract
Inference of complex hierarchical models is an increasingly common problem in modern Bayesian data analysis. Unfortunately, there are few computationally efficient and widely applicable methods for selecting between competing hierarchical models. In this paper we adapt ideas from the information-theoretic minimum message length principle and propose a powerful yet simple model selection criterion for general hierarchical Bayesian models called MML-h. Computation of this criterion requires only that a set of samples from the posterior distribution be available. The flexibility of this new algorithm is demonstrated by a novel application to state-of-the-art Bayesian hierarchical regression esti..
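To illustrate the general idea of a model selection criterion computed purely from posterior samples, the sketch below evaluates the well-known Deviance Information Criterion (DIC) on a toy Gaussian model. This is a generic, heavily simplified illustration of sample-based model scoring, not the paper's MML-h criterion, whose formula is not reproduced in this abstract; the data, model, and variable names are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y_i ~ N(mu, 1). With a flat prior on mu, the posterior is
# mu | y ~ N(ybar, 1/n), so we can draw exact posterior samples directly.
y = rng.normal(1.0, 1.0, size=50)
n, ybar = len(y), y.mean()
mu_samples = rng.normal(ybar, 1.0 / np.sqrt(n), size=5000)  # posterior draws

def log_lik(mu):
    # log p(y | mu) under the N(mu, 1) likelihood
    return -0.5 * n * np.log(2 * np.pi) - 0.5 * np.sum((y - mu) ** 2)

# Deviance D(mu) = -2 log p(y | mu), averaged over the posterior sample
D_bar = np.mean([-2.0 * log_lik(m) for m in mu_samples])
D_hat = -2.0 * log_lik(mu_samples.mean())  # deviance at the posterior mean
p_D = D_bar - D_hat                        # effective number of parameters
dic = D_bar + p_D                          # lower DIC = preferred model

print(f"p_D ~ {p_D:.2f}, DIC = {dic:.2f}")
```

Like the MML-h criterion described in the abstract, this score needs nothing beyond a set of posterior samples, so it can be bolted onto the output of any standard MCMC run; for this one-parameter model the effective parameter count `p_D` comes out close to 1, as expected.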