Squeeze-and-Breathe Evolutionary Monte Carlo Optimisation with Local Search Acceleration and its application to parameter fitting


Motivation: Estimating parameters from data is a key stage of the modelling process, particularly in biological systems, where many parameters must be estimated from sparse and noisy data sets. Over the years, a variety of heuristics have been proposed for this hard optimisation problem, with good results in some cases but with limitations in the biological setting. Results: In this work, we develop an algorithm for model parameter fitting that combines ideas from evolutionary algorithms, sequential Monte Carlo and direct search optimisation. Our method performs well even when the order of magnitude and/or the range of the parameters is unknown. The method iteratively refines a sequence of parameter distributions through local optimisation combined with partial resampling from a historical prior defined over the support of all previous iterations. We exemplify our method on biological models using both simulated and real experimental data, and estimate the parameters efficiently even in the absence of a priori knowledge about the parameters.


💡 Research Summary

The paper addresses the challenging problem of estimating parameters for biological models, where data are often sparse, noisy, and the underlying dynamical systems are nonlinear ordinary differential equations (ODEs). Classical least‑squares or linearisation techniques fail in such settings, while standard evolutionary algorithms (EA), simulated annealing (SA), or sequential Monte Carlo (SMC) methods either converge slowly or require a prior that already contains the true parameter region. To overcome these limitations, the authors propose a novel optimisation framework called “Squeeze‑and‑Breathe” (SB) that synergistically combines three ideas: (1) a global evolutionary‑style sampling of parameter vectors, (2) a local direct‑search optimisation (e.g., Nelder‑Mead simplex) applied to each sampled point, and (3) a Bayesian‑inspired mixture of the newly obtained posterior with a “historical prior” that aggregates the support of all previous posteriors.

The algorithm proceeds iteratively. At iteration k, a population of J parameter vectors is drawn from the current prior π_{k‑1}. Each vector is locally minimised using a deterministic local optimiser L(·), producing a set of local minima. These minima are ranked by the error function E_D (typically a sum‑of‑squares between model predictions and experimental observations) and the B best are retained; they constitute a sample from the posterior distribution Γ_k (the “squeeze” step). The posterior is then blended with the historical prior ζ_{k‑1} using a mixing weight p_m (0 < p_m ≤ 1) to form the new prior π_k (the “breathe” step). The historical prior is updated by taking the union of the supports of the current posterior and the previous historical prior and redefining a uniform distribution over that union, which guarantees that the algorithm can explore parameter values that lie outside the original prior bounds.
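The squeeze-and-breathe loop can be sketched in plain Python. This is a minimal toy illustration, not the authors' implementation: the two-parameter error function, the box-shaped supports, and the crude stochastic `local_search` (standing in for the deterministic Nelder–Mead optimiser L(·)) are all assumptions made for the sake of a runnable example.

```python
import random

random.seed(0)

# Hypothetical toy problem: E_D is the sum-of-squares distance to a
# "true" parameter vector, standing in for a model-vs-data error.
TRUE = [2.0, -1.0]

def error(theta):
    return sum((t - s) ** 2 for t, s in zip(theta, TRUE))

def local_search(theta, step=0.5, iters=50):
    """Crude stochastic descent as a stand-in for the deterministic
    local optimiser L(.) (e.g., Nelder-Mead) used in the paper."""
    best, best_e = list(theta), error(theta)
    for _ in range(iters):
        cand = [x + random.uniform(-step, step) for x in best]
        e = error(cand)
        if e < best_e:
            best, best_e = cand, e
    return best, best_e

def sb_iteration(support, hist_support, J=20, B=5, p_m=0.5):
    """One Squeeze-and-Breathe step over per-parameter boxes [(lo, hi), ...]."""
    # Breathe: draw each vector from the current prior, or with
    # probability p_m from the wider historical prior.
    population = [
        [random.uniform(lo, hi) for lo, hi in
         (hist_support if random.random() < p_m else support)]
        for _ in range(J)
    ]
    # Squeeze: local optimisation, rank by error E_D, keep the B best.
    minima = sorted((local_search(th) for th in population),
                    key=lambda m: m[1])[:B]
    params = [m[0] for m in minima]
    new_support = [(min(p[d] for p in params), max(p[d] for p in params))
                   for d in range(len(support))]
    # Historical prior: hull (union) of all supports seen so far.
    new_hist = [(min(a[0], b[0]), max(a[1], b[1]))
                for a, b in zip(hist_support, new_support)]
    return new_support, new_hist, minima[0]

support = hist = [(-10.0, 10.0), (-10.0, 10.0)]
best = None
for k in range(5):
    support, hist, best = sb_iteration(support, hist)
```

Note how the support of the posterior shrinks (“squeeze”) while the historical prior only ever grows, so later iterations can still reach outside the current box (“breathe”).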

Convergence is declared when two criteria are simultaneously satisfied: (i) the change in mean error between successive posteriors falls below a user‑specified tolerance (φ_k < Tol), and (ii) a non‑parametric Mann‑Whitney test indicates that samples from consecutive posteriors are statistically indistinguishable. This dual criterion prevents premature termination and ensures that the posterior distribution has stabilised.
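The dual stopping rule can be sketched in stdlib Python. The helper names `mann_whitney_p` and `converged`, the sample error values, and the normal approximation to the exact Mann–Whitney U distribution are illustrative assumptions, not the paper's code.

```python
import math

def mann_whitney_p(xs, ys):
    """Two-sided Mann-Whitney p-value via the normal approximation
    (a stdlib stand-in; assumes no tied values)."""
    n1, n2 = len(xs), len(ys)
    pooled = xs + ys
    order = sorted(range(n1 + n2), key=lambda i: pooled[i])
    ranks = [0.0] * (n1 + n2)
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    u1 = sum(ranks[:n1]) - n1 * (n1 + 1) / 2
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u1 - mu) / sigma
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def converged(errs_prev, errs_curr, tol=1e-3, alpha=0.05):
    """Dual stopping rule: (i) change in mean error phi_k falls below
    Tol, AND (ii) the Mann-Whitney test cannot distinguish samples
    from the two consecutive posteriors."""
    phi = abs(sum(errs_curr) / len(errs_curr)
              - sum(errs_prev) / len(errs_prev))
    return phi < tol and mann_whitney_p(errs_prev, errs_curr) > alpha

# Illustrative error samples from two consecutive posteriors:
# errs_a/errs_b are statistically indistinguishable, errs_c is not.
errs_a = [0.100, 0.110, 0.120, 0.090, 0.105, 0.115, 0.095, 0.108]
errs_b = [0.101, 0.109, 0.119, 0.091, 0.104, 0.116, 0.094, 0.107]
errs_c = [1.00, 1.10, 0.90, 1.05, 0.95, 1.02, 0.98, 1.08]
```

Requiring both conditions guards against the two failure modes separately: the tolerance check alone can fire when two broad posteriors happen to share a mean, while the rank test alone can fire before the error has stopped improving.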

The authors demonstrate the method on several case studies. A simple two‑parameter “BPM” model illustrates the full workflow: starting from a uniform prior on

