Journal article

The AIC criterion and symmetrizing the Kullback-Leibler divergence

Abd-Krim Seghouane, Shun-Ichi Amari

IEEE Transactions on Neural Networks and Learning Systems | IEEE | Published: 2007

Abstract

The Akaike information criterion (AIC) is a widely used tool for model selection. AIC is derived as an asymptotically unbiased estimator of a function used for ranking candidate models, a variant of the Kullback-Leibler divergence between the true model and the approximating candidate model. Despite the Kullback-Leibler divergence's computational and theoretical advantages, its lack of symmetry can be inconvenient in model selection applications. Simple examples show that reversing the roles of the arguments in the Kullback-Leibler divergence can yield substantially different results. In this paper, three new functions for ranking candidate models are proposed. These functions ..
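The asymmetry the abstract refers to is easy to exhibit numerically. Below is a minimal sketch (not from the paper) that uses the standard closed-form Kullback-Leibler divergence between two univariate Gaussians to show that D(p||q) and D(q||p) differ substantially; the function name kl_gaussian and the chosen parameters are illustrative only.

    import numpy as np

    def kl_gaussian(mu1, sigma1, mu2, sigma2):
        """Closed-form KL divergence D(p || q) between univariate
        Gaussians p = N(mu1, sigma1^2) and q = N(mu2, sigma2^2)."""
        return (np.log(sigma2 / sigma1)
                + (sigma1**2 + (mu1 - mu2)**2) / (2 * sigma2**2)
                - 0.5)

    # Two Gaussians with the same mean but different variances.
    d_pq = kl_gaussian(0.0, 1.0, 0.0, 3.0)  # D(p || q)
    d_qp = kl_gaussian(0.0, 3.0, 0.0, 1.0)  # D(q || p)
    print(f"D(p||q) = {d_pq:.4f}")  # approx. 0.6542
    print(f"D(q||p) = {d_qp:.4f}")  # approx. 2.9014

A common symmetrization, in the spirit of what the abstract describes, is the Jeffreys divergence J(p, q) = D(p||q) + D(q||p), which treats both arguments identically.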

