Monotonic warpings for additive and deep Gaussian processes

Gaussian processes (GPs) are canonical as surrogates for computer experiments because they enjoy a degree of analytic tractability. But that breaks when the response surface is constrained, say to be monotonic. Here, we provide a mono-GP construction for a single input that is highly efficient even though the calculations are non-analytic. Key ingredients include transformation of a reference process and elliptical slice sampling. We then show how mono-GP may be deployed effectively in two ways. One is additive, extending monotonicity to more inputs; the other is as a prior on injective latent warping variables in a deep Gaussian process for (non-monotonic, multi-input) non-stationary surrogate modeling. We provide illustrative and benchmarking examples throughout, showing that our methods yield improved performance over the state-of-the-art on examples from those two classes of problems.


💡 Research Summary

This paper tackles the problem of imposing monotonicity on Gaussian‑process (GP) surrogates, a requirement that frequently arises in computer‑experiment modeling (e.g., material‑property curves, dose‑response relationships). Classical GP theory offers closed‑form posterior updates, but those analytic conveniences disappear once a monotonicity constraint is enforced. The authors therefore propose a novel “mono‑GP” construction that guarantees monotonicity by design while still allowing efficient Bayesian inference via elliptical slice sampling (ESS).
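
The "monotone by design" idea can be made concrete with a short numerical sketch. The snippet below (a sketch, not the authors' code) uses one common construction: draw a reference GP on a dense grid, push it through a positivity link, and integrate, so the result is increasing no matter what the latent draw looks like. The squared-exponential kernel, exponential link, and grid resolution here are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np

def rbf_kernel(x, y, lengthscale=0.2, variance=1.0):
    """Squared-exponential covariance between grid points x and y."""
    d = x[:, None] - y[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

# Dense grid on [0, 1] for the single input.
grid = np.linspace(0.0, 1.0, 200)
K = rbf_kernel(grid, grid) + 1e-8 * np.eye(grid.size)  # jitter for stability
L = np.linalg.cholesky(K)

rng = np.random.default_rng(0)
ell = L @ rng.standard_normal(grid.size)  # unconstrained reference GP draw

# Monotone warping: integrate a positive transform of the reference process,
# f(x) = \int_0^x exp(ell(u)) du, which is increasing by construction.
slopes = np.exp(ell)
increments = 0.5 * (slopes[1:] + slopes[:-1]) * np.diff(grid)  # trapezoid rule
f = np.concatenate(([0.0], np.cumsum(increments)))
assert np.all(np.diff(f) > 0)  # monotone for every latent draw
```

Because monotonicity holds for every draw of the reference process, the constraint never needs to be checked or enforced during inference; the price is that the likelihood of the data under the transformed process is no longer Gaussian, which is where ESS comes in.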

The core idea begins with a single-dimensional input domain: a reference GP is pushed through a monotone transformation, yielding a process that is increasing by construction, and posterior inference over the latent reference process is carried out with ESS. Monotonicity in several inputs then follows additively, and the same construction doubles as a prior on injective latent warpings inside a deep GP, enabling non-stationary surrogate modeling of multi-input responses that need not themselves be monotonic.
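
Since the transformed process breaks conjugacy, posterior updates of the latent reference vector rely on ESS. Below is a compact, generic implementation of a single ESS update following Murray, Adams & MacKay (2010); the `log_lik` callable, which in this setting would score the monotone transform of the latent vector against the observed data, is a hypothetical stand-in rather than the paper's implementation.

```python
import numpy as np

def elliptical_slice(f_cur, L_chol, log_lik, rng):
    """One elliptical slice sampling update for a latent vector with a
    N(0, K) prior, where K = L_chol @ L_chol.T (Murray et al., 2010)."""
    nu = L_chol @ rng.standard_normal(f_cur.size)   # auxiliary prior draw
    log_y = log_lik(f_cur) + np.log(rng.uniform())  # slice threshold
    theta = rng.uniform(0.0, 2.0 * np.pi)           # initial angle
    lo, hi = theta - 2.0 * np.pi, theta             # shrinking bracket
    while True:
        f_prop = f_cur * np.cos(theta) + nu * np.sin(theta)
        if log_lik(f_prop) > log_y:
            return f_prop  # proposal lies on the slice: accept
        # Shrink the bracket toward the current state and try again.
        if theta < 0.0:
            lo = theta
        else:
            hi = theta
        theta = rng.uniform(lo, hi)
```

Each update leaves the target posterior invariant and involves no step-size or proposal-scale tuning, which is much of what makes ESS attractive for latent-GP models like this one.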

