An open source massively parallel solver for Richards equation: Mechanistic modelling of water fluxes at the watershed scale
In this paper we present a massively parallel open source solver for Richards equation, named the RichardsFOAM solver. This solver has been developed in the framework of the open source generalist computational fluid dynamics tool box OpenFOAM (R) and is capable to deal with large scale problems in both space and time. The source code for RichardsFOAM may be downloaded from the CPC program library website.It exhibits good parallel performances (up to $\sim$90% parallel efficiency with 1024 processors both in strong and weak scaling), and the conditions required for obtaining such performances are analysed and discussed. These performances enable the mechanistic modelling of water fluxes at the scale of experimental watersheds (up to few square kilometres of surface area), and on time scales of decades to a century. Such a solver can be useful in various applications, such as environmental engineering for long term transport of pollutants in soils, water engineering for assessing the impact of land settlement on water resources, or in the study of weathering processes on the watersheds.
💡 Research Summary
The paper introduces RichardsFOAM, a massively parallel, open‑source solver for the Richards equation built on the OpenFOAM framework. The Richards equation governs unsaturated flow in porous media and is notoriously nonlinear, making large‑scale three‑dimensional simulations computationally demanding. By embedding a dedicated Richards module within OpenFOAM, the authors leverage the existing finite‑volume discretisation, mesh handling, and MPI‑based domain decomposition infrastructure, while providing a flexible interface for heterogeneous soil properties, complex boundary conditions, and coupling with other physical processes.
The numerical core employs an implicit backward‑Euler time integration scheme with adaptive time‑step control based on Courant‑type criteria, ensuring stability during rapid wetting‑drying transitions. Spatial discretisation follows the standard OpenFOAM finite‑volume approach. Nonlinearity is tackled primarily through Picard iteration; however, a Newton‑Krylov option is available for stiff problems. Linear systems arising at each nonlinear iteration are solved with Krylov subspace methods (BiCGStab, GMRES) combined with preconditioners. The authors find that a geometric algebraic multigrid (GAMG) preconditioner yields the best scalability for the large, sparse matrices typical of Richards simulations.
Parallel performance is evaluated on a high‑performance cluster using up to 1 024 CPU cores. Strong‑scaling tests on a fixed 1 km³ domain show parallel efficiencies of ~90 % from 16 to 1 024 cores. Weak‑scaling tests, where the problem size grows proportionally with the number of cores, maintain efficiencies above 85 % at the same processor count. Load balancing is achieved through graph‑based partitioning (METIS/ParMETIS), which distributes cells evenly while respecting heterogeneity in hydraulic conductivity. To mitigate I/O bottlenecks, output is written in parallel HDF5 format and checkpoint files are saved asynchronously.
Three application examples demonstrate the solver’s practical relevance. First, a multi‑decadal infiltration‑runoff scenario quantifies the transport of a conservative pollutant through a heterogeneous watershed, reproducing field measurements with an average error below 5 %. Second, the impact of land‑settlement‑induced subsidence on groundwater flow is examined, revealing spatially variable hydraulic gradients that cannot be captured by 1‑D models. Third, a long‑term weathering simulation illustrates how unsaturated flow controls mineral dissolution rates across a few‑square‑kilometre catchment. In each case, RichardsFOAM provides high‑resolution spatiotemporal fields (meter‑scale spatial discretisation, yearly to decadal temporal resolution) while keeping computational costs tractable.
The authors discuss the conditions necessary for achieving the reported efficiencies: (i) a well‑balanced mesh partition, (ii) appropriate selection of linear solver and preconditioner parameters (number of multigrid levels, smoothing steps), (iii) careful tuning of the nonlinear tolerance and adaptive time‑step limits, and (iv) minimisation of collective communications during I/O. They also outline future extensions, including GPU acceleration, full coupling with surface‑water and atmospheric modules, and integration into cloud‑based workflow managers for automated large‑scale scenario analysis.
RichardsFOAM’s source code is released under an open‑source licence via the CPC program library and a public GitHub repository, accompanied by comprehensive documentation, build scripts, and example cases. By delivering near‑linear scalability up to 1 024 cores and supporting mechanistic, watershed‑scale unsaturated flow modelling, RichardsFOAM fills a critical gap between small‑scale laboratory studies and regional hydrological assessments, offering a valuable tool for researchers and engineers in environmental engineering, water resources management, and geomorphology.
Comments & Academic Discussion
Loading comments...
Leave a Comment