Equations of States in Statistical Learning for a Nonparametrizable and Regular Case
Many learning machines with hierarchical structure or hidden variables are now being used in information science, artificial intelligence, and bioinformatics. However, such learning machines are often not regular but singular statistical models, so their generalization performance remains unknown. To overcome this problem, in previous papers we proved new equations in statistical learning by which the Bayes generalization loss can be estimated from the Bayes training loss and the functional variance, on the condition that the true distribution is a singularity contained in the learning machine. In this paper, we prove that the same equations hold even if the true distribution is not contained in the parametric model. We also prove that, in the regular case, the proposed equations are asymptotically equivalent to the Takeuchi information criterion. Therefore, the proposed equations are applicable without any condition on the unknown true distribution.
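As a rough sketch of the two claims above, written in generic notation that may differ from the paper's (L_n is the Bayes training loss, G_n the Bayes generalization loss, V_n the functional variance, β the inverse temperature of the posterior, and I, J the usual score-covariance and negative-Hessian matrices), the equation of states and the Takeuchi information criterion take the following forms; the regular-case equivalence reported in the abstract amounts to the correction term (β/n)E[V_n] asymptotically matching the TIC penalty tr(I J⁻¹)/n:

```latex
% Equation of states (Bayes case): the generalization loss is estimated from
% the training loss and the functional variance; E_w denotes expectation over
% the posterior distribution with inverse temperature beta.
\[
  \mathbb{E}[G_n] \;=\; \mathbb{E}[L_n] \;+\; \frac{\beta}{n}\,\mathbb{E}[V_n] \;+\; o\!\left(\frac{1}{n}\right),
  \qquad
  V_n \;=\; \sum_{i=1}^{n}\Big\{\mathbb{E}_w\big[(\log p(X_i\mid w))^2\big]
            \;-\;\big(\mathbb{E}_w[\log p(X_i\mid w)]\big)^2\Big\}.
\]
% Takeuchi information criterion, written here in per-sample loss form
% (scaling conventions vary), with \hat{\theta} the maximum-likelihood estimator:
\[
  \mathrm{TIC} \;=\; -\frac{1}{n}\sum_{i=1}^{n}\log p(X_i\mid\hat\theta)
     \;+\; \frac{1}{n}\,\mathrm{tr}\!\left(I(\hat\theta)\,J(\hat\theta)^{-1}\right),
  \qquad
  I(\theta) = \mathbb{E}\big[\nabla_\theta \log p\,(\nabla_\theta \log p)^{\top}\big],
  \quad
  J(\theta) = -\,\mathbb{E}\big[\nabla_\theta^{2} \log p\big].
\]
```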
💡 Research Summary
The paper addresses a fundamental gap in statistical learning theory concerning the estimation of generalization performance for complex learning machines that are either singular (due to hierarchical structures or hidden variables) or do not contain the true data‑generating distribution within their parametric family. In earlier work the authors introduced the “equations of states” (EOS) for Bayesian learning, a set of asymptotic relations linking the Bayes training loss (L_n), the Bayes generalization loss (G_n), and the functional variance (V_n): in expectation over the training data, E[G_n] = E[L_n] + (β/n)·E[V_n] + o(1/n), where β > 0 is the inverse temperature of the posterior and V_n is the sum, over the training points, of the posterior variance of the log-likelihood.
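Below is a minimal numerical sketch of how these quantities can be computed from posterior samples, assuming a toy conjugate Gaussian model with β = 1; the model, variable names, and Monte Carlo setup are illustrative assumptions, not the paper's own experiments.

```python
# A minimal sketch of the equation-of-states estimate, assuming a simple
# realizable setting: data from N(0, 1), model p(x | m) = N(x | m, 1),
# prior m ~ N(0, 10^2), so the posterior over m is Gaussian and easy to sample.
import numpy as np

rng = np.random.default_rng(0)

n = 200                       # sample size
x = rng.normal(0.0, 1.0, n)   # training data from the true distribution N(0, 1)

# Conjugate Gaussian posterior for the mean m (inverse temperature beta = 1).
prior_var = 10.0 ** 2
post_var = 1.0 / (n + 1.0 / prior_var)
post_mean = post_var * x.sum()
m_draws = rng.normal(post_mean, np.sqrt(post_var), size=4000)  # posterior samples

# Pointwise log-likelihoods log p(x_i | m), shape (num_draws, n).
log_p = -0.5 * np.log(2 * np.pi) - 0.5 * (x[None, :] - m_draws[:, None]) ** 2

# Bayes training loss L_n = -(1/n) sum_i log E_w[p(x_i | w)],
# computed with a log-sum-exp for numerical stability.
log_pred = np.logaddexp.reduce(log_p, axis=0) - np.log(log_p.shape[0])
L_n = -log_pred.mean()

# Functional variance V_n = sum_i Var_w[log p(x_i | w)] over the posterior.
V_n = log_p.var(axis=0, ddof=0).sum()

# Equation-of-states estimate of the Bayes generalization loss (beta = 1):
# E[G_n] ~ E[L_n] + (1/n) E[V_n].
G_hat = L_n + V_n / n

print(f"L_n = {L_n:.4f}, V_n/n = {V_n / n:.4f}, estimated G_n = {G_hat:.4f}")
```

In this regular, realizable toy model V_n should come out close to the parameter dimension (here 1), so the correction term is roughly 1/n; in singular or non-parametrizable settings the same computation applies, which is the point of the equations of states.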