Repository: Freie Universität Berlin, Math Department

Scaling limits in computational Bayesian inversion

Schillings, Claudia and Schwab, Christoph (2016) Scaling limits in computational Bayesian inversion. ESAIM: Mathematical Modelling and Numerical Analysis, 50 (6). pp. 1825-1856. ISSN 2822-7840; eISSN 2804-7214

Full text not available from this repository.

Official URL: https://doi.org/10.3929/ethz-a-010386179

Abstract

Computational Bayesian inversion of operator equations with distributed uncertain input parameters is based on an infinite-dimensional version of Bayes’ formula established in [31] and its numerical realization in [27, 28]. Based on the sparsity of the posterior density shown in [29], dimension-adaptive Smolyak quadratures afford higher convergence rates than MCMC in terms of the number M of solutions of the forward (parametric operator) equation [27, 28]. The error bounds and convergence rates obtained in [27, 28] are independent of the parameter dimension (in particular, free from the curse of dimensionality) but depend on the (co)variance G > 0 of the additive Gaussian observation noise as exp(bG^{-1}) for some constant b > 0. It is proved that the Bayesian estimates admit asymptotic expansions as G ↓ 0. Sufficient (nondegeneracy) conditions for the existence of finite limits as G ↓ 0 are presented. For Gaussian priors, these limits are related to MAP estimators obtained from Tikhonov-regularized least-squares functionals. Non-intrusive identification of concentration points and of curvature information of the posterior density at these points by quasi-Newton (QN) minimization of the Bayesian potential with SR1 updates from [7, 14] is proposed. Two Bayesian estimation algorithms whose performance is robust with respect to G are developed: first, the dimension-adaptive Smolyak quadrature from [27, 28] combined with a novel, curvature-based reparametrization of the parametric Bayesian posterior density near the (assumed unique) global maximum of the posterior density and, second, extrapolation to the limit of vanishing observation noise variance. For either approach, we prove convergence with rates independent of the number of parameters as well as of the observation noise variance G. The generalized Richardson extrapolation to the limit G ↓ 0 due to A. Sidi [30] is justified by establishing asymptotic expansions with respect to G ↓ 0 of the Bayesian estimates. Numerical experiments are presented which indicate that the performance of the curvature-rescaled, adaptive Smolyak algorithm is independent of G.
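To make the QN step concrete, the sketch below illustrates non-intrusive identification of the concentration point (MAP estimator) and curvature by minimizing a Bayesian potential with SR1 updates, using SciPy's SR1 Hessian-update strategy. The forward map A, data y, noise variance G, and dimension d are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch: MAP estimation by quasi-Newton minimization with SR1 updates.
# All concrete data (A, y, G, d) are assumed for illustration.
import numpy as np
from scipy.optimize import minimize, SR1

d = 4                                  # number of uncertain parameters (assumed)
rng = np.random.default_rng(0)
A = rng.standard_normal((3, d))        # stand-in linear forward map F(u) = A u
y = np.array([1.0, -0.5, 0.25])        # observed data (assumed)
G = 1e-3                               # observation-noise variance, G > 0

def potential(u):
    """Bayesian potential: data misfit scaled by 1/(2G) plus a Tikhonov
    (Gaussian-prior) regularization term."""
    r = A @ u - y
    return 0.5 / G * (r @ r) + 0.5 * (u @ u)

def grad(u):
    return (A.T @ (A @ u - y)) / G + u

# SR1 quasi-Newton run: returns the concentration point (MAP estimate);
# the curvature H at that point feeds the reparametrization below.
res = minimize(potential, np.zeros(d), jac=grad, hess=SR1(),
               method='trust-constr')
u_map = res.x
H = A.T @ A / G + np.eye(d)            # exact Hessian here, since F is linear
print("MAP estimate:", u_map)
```

For a nonlinear forward map the analytic Hessian above would be unavailable; the SR1 approximation accumulated by the optimizer would then serve as the curvature information, which is what makes the procedure non-intrusive.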
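The curvature-based reparametrization can then be combined with quadrature. In the sketch below, a full-tensor Gauss–Hermite rule stands in for the paper's dimension-adaptive Smolyak quadrature, and the posterior is rescaled around the MAP point using the Cholesky factor of the curvature; it reuses u_map, H, potential, and d from the previous sketch, and the quantity of interest is an assumed example.

```python
# Minimal sketch: curvature-rescaled quadrature for a posterior expectation
# E[q(u) | y]. Reuses u_map, H, potential, d from the MAP sketch above.
import itertools
import numpy as np

L = np.linalg.cholesky(H)              # H = L L^T, curvature at the MAP point
nodes, weights = np.polynomial.hermite_e.hermegauss(5)  # 1D probabilists' rule

def quantity_of_interest(u):
    return u[0]                        # assumed QoI: first parameter component

num = den = 0.0
for idx in itertools.product(range(len(nodes)), repeat=d):
    xi = nodes[list(idx)]
    w = np.prod(weights[list(idx)])
    u = u_map + np.linalg.solve(L.T, xi)   # rescale: u = u* + L^{-T} xi
    # Ratio of the true posterior to its Gaussian (Laplace) surrogate at u;
    # normalization constants and the constant Jacobian cancel in num/den.
    ratio = np.exp(-(potential(u) - potential(u_map)) + 0.5 * (xi @ xi))
    num += w * ratio * quantity_of_interest(u)
    den += w * ratio
print("posterior mean of QoI:", num / den)
```

After the rescaling, the integrand is close to a standard Gaussian for small G, which is the mechanism behind the G-robust performance reported for the curvature-rescaled adaptive Smolyak algorithm.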
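The second algorithm, extrapolation to vanishing noise, rests on the asymptotic expansion of the Bayesian estimates in G. The sketch below uses a plain Neville-type polynomial extrapolation in G as a simplified stand-in for Sidi's generalized Richardson extrapolation [30]; the model estimate phi(G) and the sequence of variances are assumed for illustration.

```python
# Minimal sketch: extrapolation of a Bayesian estimate phi(G) to the limit
# G -> 0, justified when phi(G) = phi(0) + c1*G + c2*G^2 + ... as G -> 0.
import numpy as np

def extrapolate_to_zero(Gs, phis):
    """Neville tableau evaluated at G = 0 for samples phi(G_k)."""
    Gs, T = np.asarray(Gs, float), list(map(float, phis))
    for k in range(1, len(T)):
        for i in range(len(T) - k):
            T[i] = T[i] + (T[i] - T[i + 1]) * Gs[i] / (Gs[i + k] - Gs[i])
    return T[0]

phi = lambda G: 1.0 + 0.7 * G + 0.3 * G**2     # assumed smooth expansion in G
Gs = [0.08, 0.04, 0.02, 0.01]                  # decreasing noise variances
print(extrapolate_to_zero(Gs, [phi(G) for G in Gs]))   # ~1.0, the G -> 0 limit
```

Each entry phi(G_k) would in practice be computed by the Smolyak quadrature at noise level G_k, so the extrapolation trades a single hard (small-G) problem for a few well-conditioned ones.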

Item Type: Article
Subjects: Mathematical and Computer Sciences > Mathematics > Applied Mathematics
Divisions: Department of Mathematics and Computer Science > Institute of Mathematics > Deterministic and Stochastic PDEs Group
ID Code: 2998
Deposited By: Ulrike Eickers
Deposited On: 06 Jun 2023 15:41
Last Modified: 06 Jun 2023 15:41
