Random-Bridges as Stochastic Transports for Generative Models
This paper motivates the use of random bridges – stochastic processes conditioned to have prescribed distributions at fixed timepoints – in generative modelling. Random bridges act as stochastic transports between two probability distributions when appropriately initialized, and can exhibit Markovian or non-Markovian, and continuous, discontinuous, or hybrid dynamics depending on the driving process. We show how one can start from general probabilistic statements and branch out into specific representations for learning and simulation algorithms, viewed through the lens of information processing. Our empirical results, built on Gaussian random bridges, produce high-quality samples in significantly fewer steps than traditional approaches while achieving competitive Fréchet inception distance scores. Our analysis provides evidence that the proposed framework is computationally cheap and well suited to high-speed generation tasks.
💡 Research Summary
The paper introduces a novel generative modeling framework based on random bridges, stochastic processes that are conditioned to hit prescribed target distributions at fixed times. Unlike conventional diffusion models such as Denoising Diffusion Probabilistic Models (DDPM), which rely on a bidirectional noising‑denoising schedule and assume a Gaussian prior, the proposed approach builds a single‑directional stochastic transport from a source distribution Φ to a target distribution Ψ.
Core theoretical contribution
The authors define a (Φ, Ψ)‑bridge as an adapted process ξₜ satisfying two conditions: (i) the joint law of its endpoints (ξ₀, ξ_T) matches a coupling Γ(Φ, Ψ); (ii) for any finite collection of intermediate times, the conditional law of ξ at those times given the endpoints coincides with that of an arbitrary driving process Zₜ (a càdlàg process) conditioned on the same endpoints. This definition is deliberately model‑agnostic; it does not presuppose Markovianity, continuity, or a specific noise family. Proposition 2.4 shows that ξ and Z share the same finite‑dimensional distributions iff (Z₀, Z_T)∼Γ(Φ, Ψ). Moreover, Proposition 2.6 establishes that ξ inherits the Markov property of Z, enabling efficient computation when Z is Markovian.
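Condition (i) can be illustrated numerically. The sketch below (an illustrative assumption, not the paper's exact construction) uses a Brownian motion as the driving process Z and the independent coupling Γ(Φ, Ψ) = Φ ⊗ Ψ with toy Gaussian choices of Φ and Ψ: conditional on its endpoints, a Brownian motion at an intermediate time t is Gaussian with mean (1 − t/T)·ξ₀ + (t/T)·ξ_T and variance t(T − t)/T, which is exactly the conditional law condition (ii) requires.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000
T = 1.0

# Independent coupling Gamma(Phi, Psi) with toy marginals
# Phi = N(-2, 0.5^2) and Psi = N(+2, 0.5^2) (illustrative choices)
x0 = rng.normal(-2.0, 0.5, size=n)   # xi_0 ~ Phi
xT = rng.normal(+2.0, 0.5, size=n)   # xi_T ~ Psi

# Conditional law of a Brownian driver at time t given the endpoints:
# Gaussian with mean (1 - t/T) xi_0 + (t/T) xi_T and variance t (T - t) / T
t = 0.3
mean = (1 - t / T) * x0 + (t / T) * xT
xi_t = mean + rng.normal(0.0, np.sqrt(t * (T - t) / T), size=n)

# Endpoint marginals match Phi and Psi by construction; the intermediate
# marginal interpolates between them (mean roughly -0.8 at t = 0.3)
print(x0.mean(), xi_t.mean(), xT.mean())
```

Because the endpoints are drawn directly from the coupling and the intermediate value from the driver's conditional law, the simulated process satisfies both defining conditions of a (Φ, Ψ)-bridge at this single intermediate time.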
Gaussian bridge specialization
For practical implementation the authors focus on Gaussian driving processes. With mean μₜ and covariance Σ_{s,t}, they derive an anticipative representation:
ξₜ = Σ*_{t,T} Y + (Zₜ − Σ*_{t,T} Z_T), where Σ*_{t,T} = Σ_{t,T} Σ_{T,T}^{-1} and Y ∼ Ψ. This expression can be interpreted as a noisy information channel: Y is the clean signal, while the term (Zₜ − Σ*_{t,T} Z_T) acts as time‑dependent noise that vanishes at t = T. The representation is “anticipative” because it involves the future endpoint Z_T, mirroring the conditioning inherent in bridge processes.
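As a minimal sketch of the anticipative representation, take the driving process Zₜ = x + Wₜ with W a standard Brownian motion started from a source sample x (an illustrative choice of driver, not the paper's only option). For the Brownian part, Σ_{t,T} = t for t ≤ T, so Σ*_{t,T} = t/T, and the formula reduces to the familiar Brownian bridge pinned at Y at time T:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 1.0
n_steps = 100
ts = np.linspace(0.0, T, n_steps + 1)

# Toy 1-D source and target samples (illustrative assumptions)
x = rng.normal(-2.0, 0.5)   # x ~ Phi
y = rng.normal(+2.0, 0.5)   # y ~ Psi, playing the role of Y

# Driving process Z_t = x + W_t, with W a standard Brownian motion
dW = rng.normal(0.0, np.sqrt(np.diff(ts)))
W = np.concatenate([[0.0], np.cumsum(dW)])
Z = x + W

# For Brownian motion, Sigma_{t,T} = t (t <= T), hence Sigma*_{t,T} = t / T
sigma_star = ts / T

# Anticipative representation: xi_t = Sigma*_{t,T} Y + (Z_t - Sigma*_{t,T} Z_T)
xi = sigma_star * y + (Z - sigma_star * Z[-1])

assert np.isclose(xi[0], x)    # bridge starts at the source sample
assert np.isclose(xi[-1], y)   # noise term vanishes at t = T: pinned at Y
```

The asserts confirm the channel interpretation: at t = 0 the path sits at the source sample, and at t = T the noise term (Zₜ − Σ*_{t,T} Z_T) cancels exactly, leaving only the clean signal Y.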
Learning and sampling algorithm
Training proceeds by (1) sampling x ∼ Φ, (2) simulating the bridge ξₜ started from x using the Gaussian formula, (3) feeding ξₜ into a neural network f_θ that predicts the conditional expectation E[Y | ξₜ], and (4) minimising an L2 regression loss against Y, whose minimiser is precisely this conditional expectation.
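The training loop above can be sketched as follows. The network architecture, optimiser, toy distributions, and Brownian driving process are all illustrative assumptions; only the bridge simulation and the L2 regression onto Y reflect the described procedure.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
T = 1.0

# Small MLP f_theta(xi_t, t) -> prediction of E[Y | xi_t]
# (architecture and hyperparameters are illustrative, not the paper's setup)
f_theta = nn.Sequential(
    nn.Linear(2, 64), nn.SiLU(),
    nn.Linear(64, 64), nn.SiLU(),
    nn.Linear(64, 1),
)
opt = torch.optim.Adam(f_theta.parameters(), lr=1e-3)

for step in range(200):
    n = 256
    x = torch.randn(n, 1) - 2.0   # (1) x ~ Phi (toy source)
    y = torch.randn(n, 1) + 2.0   # Y ~ Psi (toy target)
    t = torch.rand(n, 1) * T      # random training times in (0, T)

    # (2) Gaussian (Brownian) bridge at time t, pinned at x (t=0) and y (t=T)
    mean = (1 - t / T) * x + (t / T) * y
    std = torch.sqrt(t * (T - t) / T)
    xi_t = mean + std * torch.randn(n, 1)

    # (3)-(4) regress onto Y: the L2 minimiser is E[Y | xi_t, t]
    pred = f_theta(torch.cat([xi_t, t], dim=1))
    loss = ((pred - y) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because the L2-optimal predictor is the conditional mean, the trained f_θ approximates E[Y | ξₜ], which is what sampling then uses to steer the bridge toward Ψ.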