Entropy-Reservoir Bregman Projection: An Information-Geometric Unification of Model Collapse

Reading time: 1 minute

📝 Original Info

  • Title: Entropy-Reservoir Bregman Projection: An Information-Geometric Unification of Model Collapse
  • ArXiv ID: 2512.14879
  • Date: 2025-12-16
  • Authors: Jingwei Chen

📝 Abstract

Self-referential learning (training a model on data it generated itself) promises boundless scalability but chronically suffers from model collapse: language models degenerate into repetitive text, GANs drop modes, and reinforcement-learning policies over-exploit. Although practitioners employ ad hoc fixes such as real-data mixing, entropy bonuses, knowledge distillation, or retrieval-augmented generation, a single principle that explains both the failure mode and the success of these fixes has remained elusive. We present Entropy-Reservoir Bregman Projection (ERBP), an information-geometric framework that unifies these phenomena. We model the closed loop as a stochastic Bregman projection sequence in distribution space. Without external coupling, finite-sample noise forces the system to project onto an ever-shrinking empirical support, causing exponential entropy decay and eventual collapse. Introducing an Entropy Reservoir (a high-entropy distribution mixed into each projection) injects a controllable entropy flux that provably stabilises the dynamics. Our theory yields (i) a necessary condition for collapse, (ii) a sufficient condition that guarantees a non-trivial entropy floor, and (iii) closed-form rates that depend only on sample size and the st...
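
To make the dynamics in the abstract concrete, below is a minimal, self-contained simulation of the entropy-decay and entropy-floor behaviour it describes. This is a toy sketch, not the paper's formalism: the alphabet size `K`, sample size `n`, reservoir weight `lam`, and the choice of a uniform distribution as the reservoir are all illustrative assumptions, and the maximum-likelihood refit of an empirical categorical distribution stands in for the KL/Bregman projection step.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in nats, ignoring zero-probability symbols."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def self_training_loop(K=50, n=200, steps=100, lam=0.0, seed=0):
    """Iterate the self-referential loop on a categorical distribution.

    Each generation: draw n samples from the current model, refit by
    maximum likelihood (the empirical distribution, i.e. the KL/Bregman
    projection onto the sampled support), then mix in a high-entropy
    reservoir (here: uniform) with weight lam. Returns the entropy
    trajectory across generations.
    """
    rng = np.random.default_rng(seed)
    reservoir = np.full(K, 1.0 / K)      # high-entropy reservoir distribution
    p = np.full(K, 1.0 / K)              # start from a maximum-entropy model
    entropies = [shannon_entropy(p)]
    for _ in range(steps):
        counts = rng.multinomial(n, p)             # finite-sample generation
        p_hat = counts / n                         # empirical refit (projection)
        p = (1.0 - lam) * p_hat + lam * reservoir  # entropy-reservoir mixing
        entropies.append(shannon_entropy(p))
    return np.array(entropies)

if __name__ == "__main__":
    closed = self_training_loop(lam=0.0)   # no reservoir: entropy decays
    mixed = self_training_loop(lam=0.1)    # reservoir: entropy is floored
    print(f"closed loop, final entropy: {closed[-1]:.3f} nats")
    print(f"with reservoir (lam=0.1), final entropy: {mixed[-1]:.3f} nats")
```

Under these toy settings, the closed loop's entropy drifts steadily downward as support atoms are lost to sampling noise, whereas any `lam > 0` keeps every symbol's probability at least `lam / K`, giving a non-trivial entropy floor of the kind the theory's sufficient condition guarantees.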

📄 Full Content

...(The body text has been omitted for length. Please see the full article on the site.)
