Conference Proceedings
A criterion for vector autoregressive model selection based on Kullback's symmetric divergence
AK Seghouane
ICASSP IEEE International Conference on Acoustics Speech and Signal Processing Proceedings | IEEE | Published : 2005
Abstract
The Kullback Information Criterion, KIC, and its univariate bias-corrected version, KICc, are two recently developed criteria for model selection. In this paper, a small-sample model selection criterion for vector autoregressive models is developed. The proposed criterion is named KICvc, where the notation "vc" stands for vector correction, and it can be considered an extension of KIC for vector autoregressive models. KICvc is an unbiased estimator of a variant of the Kullback symmetric divergence, assuming that the true model is correctly specified or overfitted. Simulation results show that the proposed criterion estimates the model order more accurately than any other asymptotically ef..
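As context for the abstract, the univariate KIC (Cavanaugh, 1999) on which KICvc builds penalizes the log-likelihood with 3(p + 1) rather than AIC's 2(p + 1). A minimal sketch of KIC-based order selection for a scalar autoregressive model is shown below; this is an illustrative assumption, not the paper's KICvc, and `fit_ar` and `select_order` are hypothetical helper names.

```python
import numpy as np

def fit_ar(y, p):
    """Least-squares fit of an AR(p) model; returns the residual variance."""
    n = len(y)
    # regressor matrix: row t holds [y[t-1], ..., y[t-p]] for t = p..n-1
    X = np.column_stack([y[p - k - 1 : n - k - 1] for k in range(p)])
    Y = y[p:]
    coef, _, _, _ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ coef
    return np.mean(resid ** 2)

def kic(y, p):
    """Univariate KIC up to additive constants: n*log(sigma^2) + 3(p+1)."""
    n = len(y) - p            # number of usable observations
    sigma2 = fit_ar(y, p)
    return n * np.log(sigma2) + 3 * (p + 1)

def select_order(y, p_max=8):
    """Return the AR order minimizing KIC over 1..p_max."""
    return min(range(1, p_max + 1), key=lambda p: kic(y, p))
```

The vector-corrected criterion of the paper replaces the scalar residual variance with the determinant of the estimated innovation covariance matrix and adjusts the penalty for small samples.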