Repository: Freie Universität Berlin, Math Department

Sparsity in Bayesian inversion of parametric operator equations

Schillings, Claudia and Schwab, Christoph (2014) Sparsity in Bayesian inversion of parametric operator equations. Inverse Problems, 30 (6).

Full text not available from this repository.

Abstract

We establish posterior sparsity in Bayesian inversion for systems governed by operator equations with distributed parameter uncertainty subject to noisy observation data δ. We generalize the results and algorithms introduced in C Schillings and C Schwab (2013 Inverse Problems 29 065011) for the particular case of scalar diffusion problems with random coefficients to broad classes of forward problems, including general elliptic and parabolic operators with uncertain coefficients, and in random domains. For countably parametric, deterministic representations of uncertain parameters in the forward problem, which belong to a specified sparsity class, we quantify analytic regularity of the likewise countably parametric, deterministic Bayesian posterior density with respect to a uniform prior on the uncertain parameter sequences, and we prove that the parametric, deterministic density of the Bayesian posterior belongs to the same sparsity class. Generalizing C Schillings and C Schwab (2013 Inverse Problems 29 065011) and C Schwab and A M Stuart (2012 Inverse Problems 28 045003), the forward problems are converted to countably parametric, deterministic operator equations. Computational Bayesian inversion then amounts to numerically evaluating expectations of quantities of interest (QoIs) under the Bayesian posterior, conditional on noisy observation data. Our results imply, on the one hand, sparsity of Legendre (generalized) polynomial chaos expansions of the density of the Bayesian posterior with respect to the uniform prior and, on the other hand, convergence rates for data-adaptive Smolyak integration algorithms for computational Bayesian estimation which are independent of the dimension of the parameter space. We prove, mathematically and computationally, that for uncertain inputs with sufficient sparsity these convergence rates are superior to those of Markov chain Monte Carlo sampling of the posterior, in terms of the number N of instances of the parametric forward problem to be solved.
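To make the computational setting of the abstract concrete, the following is a minimal illustrative sketch (not the authors' data-adaptive algorithm) of Smolyak sparse-grid quadrature for a posterior expectation under a uniform prior: the QoI expectation is the ratio Z'/Z of two prior integrals weighted by exp(-Φ(y; δ)), where Φ is the Bayesian potential. The forward map `forward`, potential `potential`, QoI `qoi`, truncation dimension `d`, coefficient decay `decay`, and noise variance are toy assumptions, and the Smolyak level is fixed isotropically rather than chosen adaptively as in the paper.

# Minimal sketch of Smolyak (combination-technique) quadrature for
# computational Bayesian estimation under a uniform prior on [-1,1]^d.
# Posterior expectation of a QoI phi:  E[phi | delta] = Z'/Z, with
#   Z' = \int phi(y) exp(-Phi(y; delta)) dmu0(y),
#   Z  = \int        exp(-Phi(y; delta)) dmu0(y).
# All model ingredients below are hypothetical placeholders.

import itertools
from math import comb, exp
import numpy as np

d = 4                                                        # truncated parameter dimension (illustrative)
decay = np.array([1.0 / (j + 1) ** 2 for j in range(d)])     # hypothetical sparse (decaying) input coefficients

def forward(y):
    """Toy scalar forward map standing in for the parametric operator equation."""
    return 1.0 / (2.0 + decay @ y)

def potential(y, delta, noise_var=1e-2):
    """Bayesian potential Phi(y; delta) for scalar Gaussian observation noise."""
    return 0.5 * (forward(y) - delta) ** 2 / noise_var

def qoi(y):
    """Quantity of interest phi(y); here simply the forward solution itself."""
    return forward(y)

def gauss_legendre_rule(level):
    """1D Gauss-Legendre rule with `level` nodes, normalized to the
    uniform probability measure on [-1, 1] (weights sum to 1)."""
    nodes, weights = np.polynomial.legendre.leggauss(level)
    return nodes, weights / 2.0

def smolyak_expectations(q, delta):
    """Isotropic Smolyak combination formula of level q applied to Z and Z'."""
    Z, Zp = 0.0, 0.0
    # multi-indices i >= 1 with |i| <= q; comb() zeroes out indices with |i| < q-d+1
    for i in itertools.product(range(1, q - d + 2), repeat=d):
        norm = sum(i)
        if norm > q:
            continue
        coeff = (-1) ** (q - norm) * comb(d - 1, q - norm)
        if coeff == 0:
            continue
        rules = [gauss_legendre_rule(level) for level in i]
        # tensorize the d one-dimensional rules
        for nodes_weights in itertools.product(*[zip(*r) for r in rules]):
            y = np.array([nw[0] for nw in nodes_weights])
            w = np.prod([nw[1] for nw in nodes_weights])
            density = exp(-potential(y, delta))
            Z += coeff * w * density
            Zp += coeff * w * qoi(y) * density
    return Z, Zp

if __name__ == "__main__":
    y_true = np.full(d, 0.3)
    delta = forward(y_true) + 0.01       # synthetic noisy observation
    for q in range(d, d + 5):
        Z, Zp = smolyak_expectations(q, delta)
        print(f"level q={q}: posterior mean of QoI ~ {Zp / Z:.6f}")

In this sketch the sparsity of the posterior density enters only implicitly through the decaying coefficients; the paper's algorithms instead build the Smolyak index set adaptively from the data, which is what yields the dimension-independent convergence rates stated in the abstract.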

Item Type: Article
Subjects: Mathematical and Computer Sciences > Mathematics > Applied Mathematics
Divisions: Department of Mathematics and Computer Science > Institute of Mathematics > Deterministic and Stochastic PDEs Group
ID Code: 3004
Deposited By: Ulrike Eickers
Deposited On: 08 Jun 2023 09:28
Last Modified: 08 Jun 2023 09:28