GLASS Flows: Transition Sampling for Alignment of Flow and Diffusion Models

The performance of flow matching and diffusion models can be greatly improved at inference time using reward alignment algorithms, yet efficiency remains a major limitation. While several algorithms have been proposed, we demonstrate that a common bottleneck is the sampling method these algorithms rely on: many of them require sampling Markov transitions via SDE sampling, which is significantly less efficient and often less performant than ODE sampling. To remove this bottleneck, we introduce GLASS Flows, a new sampling paradigm that simulates a “flow matching model within a flow matching model” to sample Markov transitions. As we show in this work, this “inner” flow matching model can be retrieved from a pre-trained model without any re-training, combining the efficiency of ODEs with the stochastic evolution of SDEs. On large-scale text-to-image models, we show that GLASS Flows eliminate the trade-off between stochastic evolution and efficiency. Combined with Feynman-Kac Steering, GLASS Flows improve state-of-the-art performance in text-to-image generation, making it a simple, drop-in solution for inference-time scaling of flow and diffusion models.


💡 Research Summary

The paper introduces GLASS Flows, a novel sampling paradigm that enables efficient reward‑aligned inference for flow‑matching and diffusion models without resorting to stochastic differential equation (SDE) sampling. Existing reward‑alignment methods (e.g., Sequential Monte Carlo, search‑based guidance) rely on drawing samples from the Markov transition kernel pₜ′|ₜ, which is naturally provided by the time‑reversed SDE. While SDE sampling supplies the necessary stochasticity for branching and exploration, it is considerably slower and incurs higher discretization error than ordinary differential equation (ODE) sampling, making it a bottleneck for large‑scale models.
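The SDE-versus-ODE trade-off described above can be made concrete with a toy sketch. This is not the paper's method; it illustrates, under assumed `velocity`, `score`, and `sigma` functions, why only the stochastic (Euler–Maruyama) step can draw multiple distinct samples from the transition kernel p_{t′|t}, while the ODE step is deterministic:

```python
import numpy as np

def ode_step(x, t, dt, velocity):
    """Deterministic probability-flow (ODE) Euler step:
    x_{t+dt} = x_t + v(x_t, t) * dt. Same input -> same output."""
    return x + velocity(x, t) * dt

def sde_step(x, t, dt, velocity, score, sigma, rng):
    """Stochastic Euler-Maruyama step: a score-based drift correction plus
    Gaussian noise. Repeated calls from the same x_t yield different samples,
    i.e. draws from the Markov transition kernel p_{t'|t} rather than one path."""
    drift = velocity(x, t) + 0.5 * sigma(t) ** 2 * score(x, t)
    noise = sigma(t) * np.sqrt(dt) * rng.standard_normal(x.shape)
    return x + drift * dt + noise
```

Branching-based alignment algorithms (e.g. Sequential Monte Carlo) need the second kind of step, which is why they inherit the SDE's cost and discretization error.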

GLASS Flows resolves this dilemma by constructing an “inner flow-matching model” inside a pre-trained flow-matching network. Using the concept of a sufficient statistic, the authors derive a deterministic ODE in an auxiliary time variable s; simulating this inner ODE yields samples from the Markov transition kernel, combining the efficiency of ODE integration with the stochastic evolution of SDEs, and without any re-training of the base model.
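The summary does not reproduce the paper's exact construction, but the general shape of an "ODE within an ODE" transition sampler can be sketched. Everything here is hypothetical: the `inner_velocity` field and its signature are assumptions standing in for whatever the paper derives from the pre-trained model; the point is that stochasticity enters only through a fresh Gaussian initial condition, after which the auxiliary-time evolution in s is purely deterministic:

```python
import numpy as np

def sample_transition(x_t, t, t_next, inner_velocity, n_inner_steps=8, rng=None):
    """Hypothetical sketch of transition sampling via an inner ODE.

    Draws x_{t_next} ~ p_{t_next | t}(. | x_t) by (1) sampling a Gaussian
    initial condition and (2) integrating a deterministic velocity field over
    an auxiliary time s in [0, 1] with Euler steps. Each transition costs
    n_inner_steps cheap ODE evaluations instead of a full SDE simulation.
    """
    rng = np.random.default_rng() if rng is None else rng
    z = rng.standard_normal(x_t.shape)  # fresh noise -> stochastic transition
    ds = 1.0 / n_inner_steps
    for i in range(n_inner_steps):
        s = i * ds
        z = z + inner_velocity(z, s, x_t, t, t_next) * ds  # deterministic step
    return z
```

Calling `sample_transition` twice from the same `x_t` with different noise produces different draws, giving branching algorithms such as Feynman-Kac Steering the stochasticity they need at ODE-like cost.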
