Abstract

Bayesian methods have been widely used in the last two decades to infer statistical properties of spatially variable coefficients in partial differential equations from measurements of the solutions of these equations. Yet, in many cases the number of variables used to parameterize these coefficients is large, and obtaining meaningful statistics of their probability distributions is difficult using simple sampling methods such as the basic Metropolis--Hastings algorithm---in particular, if the inverse problem is ill-conditioned or ill-posed. As a consequence, many advanced sampling methods have been described in the literature that converge faster than Metropolis--Hastings, for example, by exploiting hierarchies of statistical models or hierarchies of discretizations of the underlying differential equation.
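The basic Metropolis--Hastings algorithm mentioned above can be sketched in a few lines. The following is a minimal, illustrative random-walk implementation applied to a toy two-dimensional Gaussian target; the function names, step size, and target are chosen for illustration and are not the paper's benchmark posterior.

```python
import numpy as np

def metropolis_hastings(log_posterior, theta0, n_samples, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings with an isotropic Gaussian proposal."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    lp = log_posterior(theta)
    chain, accepted = [theta.copy()], 0
    for _ in range(n_samples):
        proposal = theta + step * rng.standard_normal(theta.shape)
        lp_prop = log_posterior(proposal)
        # Accept with probability min(1, pi(proposal)/pi(theta)),
        # evaluated in log space for numerical stability.
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = proposal, lp_prop
            accepted += 1
        chain.append(theta.copy())
    return np.array(chain), accepted / n_samples

# Toy target: standard 2D Gaussian (stand-in for a PDE-based posterior).
log_target = lambda x: -0.5 * np.dot(x, x)
chain, acc_rate = metropolis_hastings(log_target, np.zeros(2), 20000)
```

For posteriors over 64 coefficient values, as in the benchmark described below, the same loop applies unchanged, but each `log_posterior` evaluation requires a PDE solve, which is what makes long chains expensive.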
At the same time, it remains difficult for the reader of the literature to quantify the advantages of these algorithms because there is no commonly used benchmark. This paper presents a benchmark Bayesian inverse problem---namely, the determination of a spatially variable coefficient, discretized by 64 values, in a Poisson equation, based on point measurements of the solution---that fills the gap between widely used simple test cases (such as superpositions of Gaussians) and real applications that are difficult to replicate for developers of sampling algorithms. We give a complete description of the test case and provide an open-source implementation that can serve as the basis for further experiments. We have also computed $2\times 10^{11}$ samples, at a cost of some 30 CPU years, of the posterior probability distribution from which we have generated detailed and accurate statistics against which other sampling algorithms can be tested.
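Benchmarks of this kind typically combine a data-misfit term over point measurements with a prior over the discretized coefficient. The sketch below shows that generic structure only, under illustrative assumptions: a Gaussian likelihood and an i.i.d. Gaussian prior, and a stand-in linear forward map in place of the paper's Poisson solve. The measurement count, noise level, and prior width are all placeholders, not the paper's specification.

```python
import numpy as np

def make_log_posterior(forward, z_obs, sigma=0.05, mu0=0.0, sigma0=2.0):
    """Unnormalized log-posterior: Gaussian misfit on point measurements
    plus an i.i.d. Gaussian prior on the 64 coefficient values.
    All hyperparameters here are illustrative."""
    def log_posterior(theta):
        misfit = forward(theta) - z_obs
        log_like = -0.5 * np.dot(misfit, misfit) / sigma**2
        log_prior = -0.5 * np.sum((theta - mu0) ** 2) / sigma0**2
        return log_like + log_prior
    return log_posterior

# Stand-in forward map: a random linear operator producing 169 synthetic
# "measurements" from 64 parameters (in the real benchmark this would be
# a finite element solve of the Poisson equation plus point evaluation).
rng = np.random.default_rng(1)
A = rng.standard_normal((169, 64)) / 8.0
forward = lambda theta: A @ theta
theta_true = rng.standard_normal(64)
z_obs = forward(theta_true)
log_post = make_log_posterior(forward, z_obs)
```

A sampler such as Metropolis--Hastings then only ever touches `log_post`; swapping the stand-in `forward` for an actual PDE solver changes the cost per sample but not the sampling code.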

Keywords

  1. Bayesian inference
  2. inverse problems
  3. partial differential equations
  4. Markov chain Monte Carlo
  5. benchmarking

MSC codes

  1. 65N21
  2. 35R30
  3. 74G75



Published In

SIAM Review
Pages: 1074 - 1105
ISSN (online): 1095-7200

History

Submitted: 18 February 2021
Accepted: 9 January 2023
Published online: 7 November 2023

Funding Information

National Science Foundation (https://doi.org/10.13039/100000001): OAC-1835673, DMS-1821210, EAR-1925595, EAR-1550901
National Science Foundation (https://doi.org/10.13039/100000001): DMS-1818726, DMS-2111277
