A corrected AIC for the selection of seemingly unrelated regressions models

Notice: This research summary and analysis were generated automatically using AI. For full accuracy, please consult the original arXiv source.

A bias correction to Akaike’s information criterion (AIC) is derived for seemingly unrelated regressions models. The correction is of particular use when the sample size is not much larger than the number of fitted parameters. A small-sample simulation study indicates that the bias-corrected AIC (AICc) provides better model choices than other model selection criteria.


💡 Research Summary

The paper addresses a well‑known limitation of Akaike’s Information Criterion (AIC) when applied to seemingly unrelated regressions (SUR) – namely, the tendency of AIC to favor overly complex models when the sample size (n) is not much larger than the total number of estimated parameters (p). SUR models, which jointly estimate multiple regression equations with correlated error terms, require estimation of both regression coefficients and the error covariance matrix. This extra layer of parameters amplifies the small‑sample bias inherent in the standard AIC formulation (‑2 log‑likelihood + 2p).
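To make the setting concrete, the following sketch simulates a small two-equation SUR system and computes the standard AIC from the Gaussian log-likelihood. This is an illustration under assumed data and model choices, not the paper's experimental setup; it uses equation-by-equation OLS (the first stage of feasible GLS for SUR) rather than the full SUR estimator, and all variable names and dimensions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-equation SUR system: y_i = X_i @ beta_i + e_i,
# with errors correlated across equations (a sketch, not the paper's design).
n = 30                                       # small sample, where AIC's bias matters
X1 = np.column_stack([np.ones(n), rng.normal(size=n)])
X2 = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
Sigma = np.array([[1.0, 0.6], [0.6, 1.0]])   # cross-equation error covariance
E = rng.multivariate_normal([0.0, 0.0], Sigma, size=n)
y1 = X1 @ np.array([1.0, 2.0]) + E[:, 0]
y2 = X2 @ np.array([0.5, -1.0, 0.3]) + E[:, 1]

# Equation-by-equation OLS (the first step of feasible GLS for SUR).
b1, *_ = np.linalg.lstsq(X1, y1, rcond=None)
b2, *_ = np.linalg.lstsq(X2, y2, rcond=None)
R = np.column_stack([y1 - X1 @ b1, y2 - X2 @ b2])   # n x 2 residual matrix

# Concentrated Gaussian log-likelihood at the ML covariance estimate
# S = R'R / n:  log L = -(n/2) * (m*log(2*pi) + log det S + m), m equations.
m = 2
S = R.T @ R / n
logL = -0.5 * n * (m * np.log(2 * np.pi) + np.log(np.linalg.det(S)) + m)

# p counts the regression coefficients plus the free covariance parameters.
p = X1.shape[1] + X2.shape[1] + m * (m + 1) // 2
AIC = -2 * logL + 2 * p
print(f"logL = {logL:.2f}, p = {p}, AIC = {AIC:.2f}")
```

With n = 30 and p = 8 here, the sample is only modestly larger than the parameter count, which is exactly the regime the paper targets.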

The authors derive a bias‑corrected version of AIC, called AICc, specifically for SUR models. Starting from the expected value of the AIC under the true model, they perform a second‑order Taylor expansion of the log‑likelihood around the maximum‑likelihood estimates. By incorporating the Fisher information matrix for both the regression coefficients and the covariance parameters, they obtain an explicit expression for the bias term. The resulting correction is

 AICc = −2 log L̂ + 2p + (bias-correction term; the explicit expression is truncated here and given in full in the original paper)
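To see why such a correction matters in small samples, the sketch below compares the plain AIC penalty 2p with the classical Hurvich–Tsai AICc penalty 2pn/(n − p − 1) for the single-equation linear model. This is an assumed, simpler stand-in for intuition only: the SUR correction derived in the paper has a different (more involved) form, but it behaves similarly, inflating the penalty sharply as p approaches n.

```python
# Single-equation illustration (not the paper's SUR correction):
# the Hurvich-Tsai AICc replaces the AIC penalty 2p with 2p*n/(n - p - 1).
def aic_penalty(p):
    return 2 * p

def aicc_penalty(p, n):
    return 2 * p * n / (n - p - 1)

n = 20  # hypothetical small sample size
for p in (2, 5, 10, 15):
    print(f"p={p:2d}: AIC penalty={aic_penalty(p):5.1f}, "
          f"AICc penalty={aicc_penalty(p, n):6.1f}")
```

The gap between the two penalties is negligible when n is much larger than p but explodes as p nears n (at p = 15, n = 20 the AICc penalty is five times the AIC penalty), which is why the uncorrected AIC over-selects complex models in small samples.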

