Uncertainty quantification in complex systems using approximate solvers
This paper proposes a novel uncertainty quantification framework for computationally demanding systems characterized by a large vector of non-Gaussian uncertainties. It combines state-of-the-art techniques in advanced Monte Carlo sampling with Bayesian formulations. The key departure from existing works is the use of inexpensive, approximate computational models in a rigorous manner. Such models can readily be derived by coarsening the discretization size in the solution of the governing PDEs, increasing the time step when integration of ODEs is performed, using fewer iterations if a nonlinear solver is employed, or making use of lower-order models. It is shown that even in cases where the inexact models provide very poor approximations of the exact response, statistics of the latter can be quantified accurately with significant reductions in the computational effort. Multiple approximate models can be used, and rigorous confidence bounds on the estimates produced are provided at all stages.
💡 Research Summary
The paper introduces a novel uncertainty quantification (UQ) framework designed for computationally intensive systems that involve a large vector of non‑Gaussian uncertainties. Traditional UQ methods rely on repeated high‑fidelity simulations, which become prohibitive when the dimensionality of the input space is high and the governing equations are costly to solve. The authors turn this limitation into an advantage by deliberately employing inexpensive, approximate solvers—obtained by coarsening spatial discretizations, enlarging time steps, reducing the number of nonlinear iterations, or using lower‑order models—and integrating them within a rigorous Bayesian‑Monte‑Carlo hierarchy.
The core of the methodology is a multi‑level structure. Each “level” corresponds to a particular approximation fidelity, ranging from very coarse models (e.g., a finite‑element mesh with one‑eighth the resolution of the reference) to the exact high‑fidelity solver. For every level the authors generate a large number of samples using advanced Monte‑Carlo techniques such as adaptive importance sampling and control variates. The difference between successive levels (the “level‑difference”) is estimated directly, allowing the overall expectation of any quantity of interest (QoI) to be expressed as a telescoping sum of level‑differences. By allocating more samples to cheap, low‑fidelity levels and only a few to expensive, high‑fidelity levels, the variance of the estimator is minimized under a fixed computational budget.
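The telescoping decomposition can be sketched in a few lines of code. The snippet below is a minimal illustration, not the paper's actual implementation: `model` is a hypothetical stand-in for a solver hierarchy (its bias shrinks as the level increases), and the sample counts are illustrative. The key mechanics are real, though: each level-difference is estimated on the *same* input draws for both fidelities, and cheap levels receive far more samples than expensive ones.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical fidelity hierarchy: model(level, x) returns the QoI at a
# given level (0 = coarsest). Here a toy function mimics a solver whose
# discretization bias halves with each refinement level.
def model(level, x):
    bias = 2.0 ** (-level)           # coarser levels are more biased
    return np.sin(x) + bias * np.cos(3 * x)

def mlmc_estimate(n_samples):
    """Telescoping-sum estimator:
        E[Q_L] = E[Q_0] + sum_{l=1..L} E[Q_l - Q_{l-1}].
    n_samples[l] is the sample count allocated to level l."""
    est = 0.0
    for level, n in enumerate(n_samples):
        x = rng.standard_normal(n)   # draws from the input distribution
        if level == 0:
            est += model(0, x).mean()
        else:
            # the level-difference pairs BOTH fidelities on the same inputs,
            # so the difference has small variance and needs few samples
            est += (model(level, x) - model(level - 1, x)).mean()
    return est

# Many cheap samples on the coarse level, progressively fewer on finer ones
print(mlmc_estimate([100_000, 10_000, 1_000]))
```

Because successive levels are strongly correlated, the variance of each difference term decays with level, which is what lets the fine-level sample counts stay small under a fixed budget.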
A Bayesian formulation provides the statistical glue that binds the hierarchy. The authors assign a prior distribution over the uncertain parameters that is capable of representing non‑Gaussian, high‑dimensional dependencies (e.g., a mixture of copulas or a transformed Gaussian process). Observations from the low‑fidelity models are treated as noisy measurements of the true QoI, and a limited set of high‑fidelity evaluations is used to update the posterior via Markov‑chain Monte‑Carlo or variational inference. This posterior not only yields unbiased estimates of the QoI statistics but also delivers rigorous credible intervals at every stage. The combination of Bayesian updating with multi‑level Monte‑Carlo ensures that even when the approximate models are very inaccurate, the bias can be systematically corrected.
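A drastically simplified version of this idea can be written down with a conjugate Gaussian update. The sketch below is an assumption-laden toy, not the paper's formulation: `q_hi` and `q_lo` are hypothetical models, the discrepancy is modeled as a single Gaussian bias with known noise scale, and the closed-form normal-normal posterior replaces MCMC or variational inference. It shows the mechanism only: many cheap low-fidelity runs pin down the surrogate's mean, a handful of paired high-fidelity runs inform the bias, and the posterior yields both a corrected estimate and a credible interval.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical models: the low-fidelity surrogate carries an unknown
# systematic bias (here 0.3) plus small evaluation noise.
def q_hi(x): return np.sin(x)
def q_lo(x): return np.sin(x) + 0.3 + 0.05 * rng.standard_normal(x.shape)

# Many cheap low-fidelity samples estimate E[Q_lo] accurately...
x_cheap = rng.standard_normal(50_000)
mean_lo = q_lo(x_cheap).mean()

# ...while a few paired high-fidelity runs observe the discrepancy
# delta_i = Q_hi(x_i) - Q_lo(x_i), treated as noisy data on the true bias.
x_pair = rng.standard_normal(20)
delta = q_hi(x_pair) - q_lo(x_pair)

# Conjugate normal-normal update: prior bias ~ N(0, tau0^2), noise sd sigma
# assumed known (both values are illustrative choices).
tau0, sigma = 1.0, 0.05
post_var = 1.0 / (1.0 / tau0**2 + len(delta) / sigma**2)
post_mean = post_var * delta.sum() / sigma**2

corrected = mean_lo + post_mean      # bias-corrected estimate of E[Q_hi]
ci = (corrected - 1.96 * np.sqrt(post_var),
      corrected + 1.96 * np.sqrt(post_var))
print(corrected, ci)
```

Even with a badly biased surrogate, twenty high-fidelity evaluations suffice here to correct the estimate, which is the qualitative point the summary makes about inaccurate approximate models.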
Theoretical analysis shows that, for a target mean‑square error ε, the total computational cost scales as O(ε⁻² log ε), a dramatic improvement over the O(ε⁻³) (or worse) scaling of naïve single‑level Monte‑Carlo. The authors also derive explicit formulas for confidence bounds on the estimator, based on bootstrap resampling of the level‑difference samples.
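The bootstrap-based confidence bounds can be illustrated for a single level-difference term. The sample array below is synthetic stand-in data, not output from the paper's benchmarks; the resampling logic itself is the standard percentile bootstrap applied to the mean of the level-difference samples.

```python
import numpy as np

rng = np.random.default_rng(2)

def bootstrap_ci(samples, n_boot=2000, alpha=0.05):
    """Percentile-bootstrap confidence interval for the mean of one
    set of level-difference samples Q_l - Q_{l-1}."""
    n = len(samples)
    idx = rng.integers(0, n, size=(n_boot, n))   # resample with replacement
    means = samples[idx].mean(axis=1)
    return np.quantile(means, [alpha / 2, 1 - alpha / 2])

# Synthetic level-difference samples with true mean 0.1 (toy data)
diff = 0.1 + 0.5 * rng.standard_normal(1000)
lo, hi = bootstrap_ci(diff)
print(lo, hi)
```

In a full multi-level estimator, an interval like this would be computed per level and the per-level uncertainties combined into an overall bound on the telescoping sum.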
To validate the approach, two benchmark problems are presented. The first is a three‑dimensional fluid‑structure interaction model with roughly 10⁴ uncertain parameters. Four fidelity levels are constructed by progressively refining the mesh. Using the proposed framework, the mean, variance, and 95 % confidence interval of structural displacement are recovered with less than 2 % error compared to a full‑fidelity Monte‑Carlo reference, while the total wall‑clock time is reduced by a factor of about 45. The second case involves a high‑dimensional chemical reaction network (≈10⁶ parameters) where the ODE integrator’s time step is coarsened to create three fidelity levels. Again, the statistical moments of reaction rates are accurately estimated, and the computational effort is cut by more than two orders of magnitude.
An important extension discussed is the simultaneous use of multiple, heterogeneous approximate models (e.g., combining mesh coarsening with reduced‑order models). The framework naturally accommodates such diversity, allowing each model’s cost‑bias trade‑off to be exploited for maximal efficiency. Moreover, the provision of rigorous confidence intervals at every stage makes the methodology suitable for risk‑aware engineering design, safety certification, and decision‑making under uncertainty.
In summary, the paper establishes a robust, theoretically grounded pathway to perform uncertainty quantification on complex, high‑dimensional systems without the prohibitive expense of exhaustive high‑fidelity simulations. By marrying inexpensive approximate solvers with Bayesian correction and multi‑level sampling, it delivers accurate statistical estimates, quantifiable error bounds, and substantial computational savings—features that are directly applicable to fields such as aerospace, climate modeling, power‑grid reliability, and biomedical simulation.