Neural Thermodynamics: Entropic Forces in Deep and Universal Representation Learning

With the rapid discovery of emergent phenomena in deep learning and large language models, understanding their cause has become an urgent need. Here, we propose a rigorous entropic-force theory for understanding the learning dynamics of neural networks trained with stochastic gradient descent (SGD) and its variants. Building on the theory of parameter symmetries and an entropic loss landscape, we show that representation learning is crucially governed by emergent entropic forces arising from stochasticity and discrete-time updates. These forces systematically break continuous parameter symmetries and preserve discrete ones, leading to a series of gradient balance phenomena that resemble the equipartition property of thermal systems. These phenomena, in turn, (a) explain the universal alignment of neural representations between AI models and lead to a proof of the Platonic Representation Hypothesis, and (b) reconcile the seemingly contradictory observations of sharpness- and flatness-seeking behavior of deep learning optimization. Our theory and experiments demonstrate that a combination of entropic forces and symmetry breaking is key to understanding emergent phenomena in deep learning.
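To make "continuous parameter symmetry" concrete, consider a standard example (illustrative, and not necessarily the specific symmetry the paper emphasizes): a single ReLU unit f(x) = a · ReLU(w·x) computes the same function under the rescaling (w, a) → (λw, a/λ) for any λ > 0. Gradient flow preserves the conserved quantity a² − ‖w‖², while stochastic, discrete-time updates break the invariance, consistent with the abstract's claim that entropic forces break continuous symmetries and produce gradient-balance effects. A minimal numerical check of the symmetry itself:

```python
import numpy as np

# Illustrative check (not from the paper): a ReLU unit is invariant under
# the continuous rescaling symmetry (w, a) -> (lam * w, a / lam), lam > 0.
def relu(z):
    return np.maximum(z, 0.0)

rng = np.random.default_rng(0)
w = rng.normal(size=4)   # input weights (toy values)
a = 0.7                  # output weight
x = rng.normal(size=4)   # an arbitrary input
lam = 3.0

out = a * relu(w @ x)
out_rescaled = (a / lam) * relu((lam * w) @ x)  # same function, rescaled parameters
print(np.allclose(out, out_rescaled))  # True: the loss cannot tell the two apart
```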


💡 Research Summary

The paper “Neural Thermodynamics: Entropic Forces in Deep and Universal Representation Learning” proposes a rigorous theoretical framework that links the stochastic dynamics of stochastic gradient descent (SGD) and its variants to concepts from statistical physics, in particular entropic forces and symmetry breaking. The authors start by defining the usual empirical risk L(θ) = E_{(x,y)}[ℓ(f_θ(x), y)], i.e., the loss of the model f_θ averaged over the training examples (x, y).
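To fix notation, here is a minimal sketch of that setup: SGD minimizing an empirical risk with minibatch gradients, whose sampling noise is the stochasticity the theory builds on. The model, data, and hyperparameters below are illustrative toys, not the paper's experiments.

```python
import numpy as np

# Minimal sketch: SGD on an empirical risk L(theta) = mean_i loss(theta; x_i, y_i).
# Toy linear regression; all names and data are illustrative, not from the paper.
rng = np.random.default_rng(0)

# Synthetic dataset: y = <w_true, x> + noise
N, d = 256, 8
w_true = rng.normal(size=d)
X = rng.normal(size=(N, d))
y = X @ w_true + 0.1 * rng.normal(size=N)

def grad_minibatch(w, idx):
    """Gradient of the squared loss averaged over a minibatch of indices."""
    Xb, yb = X[idx], y[idx]
    return 2.0 * Xb.T @ (Xb @ w - yb) / len(idx)

w = np.zeros(d)
eta, batch_size, steps = 0.05, 16, 2000
for t in range(steps):
    idx = rng.choice(N, size=batch_size, replace=False)
    # Discrete-time, stochastic update: the minibatch gradient equals the
    # full-batch gradient plus zero-mean noise -- the two ingredients
    # (stochasticity and discreteness) from which the entropic forces arise.
    w -= eta * grad_minibatch(w, idx)

risk = np.mean((X @ w - y) ** 2)
print(f"final empirical risk: {risk:.4f}")
```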

