Scam Alert

Please be vigilant about phishing and scam attempts from external companies.
All conferences and research programs at IML are free of charge.
We will not ask you for any payments regarding your accommodation or travel arrangements.

Andrea Barth: Generalized Bayesian inversion and its application to S(P)DEs

Date: 2025-06-05

Time: 17:30 - 18:00

Speaker
Andrea Barth, Univ. of Stuttgart

Abstract
In numerous applications that require the estimation of a quantity of interest, direct observation is not possible. Instead, an alternative quantity is observed that is associated with the quantity of interest. The problem of inferring the quantity of interest from this observation gives rise to the field of inverse problems. In a more general sense, inverse problems can be understood as data-driven model-fitting problems and thus arise in numerous applications in the sciences, engineering, and finance. Inverse problems are formally defined as the inversion of well-posed problems, often referred to as forward problems. However, inverting a well-posed problem frequently results in an ill-posed problem. Given this ill-posedness, an exact reconstruction of the quantity of interest cannot be expected, and methods for quantifying the uncertainty associated with the reconstruction are required. Bayesian methods address this issue by constructing a probability distribution that allows for both the reconstruction of the quantity of interest and the quantification of the associated uncertainty.

In this talk, we consider a generalization of the Bayesian inverse problem that allows for infinite-dimensional input and output spaces. Additionally, the forward map is considered stochastic, lifting the standard assumption of additive, independent noise. This formulation thus covers problems based on stochastic processes and random fields as well as stochastic and random (partial) differential equations. The essential tool is the definition of a consistent finite-dimensional approximation, achieved by a projection associated with a filtration that is interpreted as the information gain. Convergence of the approximation in the number of observation points is proved, and the existence of moments can be guaranteed. We close the talk with examples encompassing the reconstruction of coefficient functions of a stochastic (partial) differential equation from path observations and the estimation of the parameters of a covariance kernel of a Gaussian random field from a realization.
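
For orientation, a minimal sketch of the classical Bayesian inverse problem that this generalized framework extends (standard notation, not taken from the abstract: unknown u with prior measure \mu_0, forward map \mathcal{G}, data y, Gaussian noise \eta with covariance \Gamma):

\[
y = \mathcal{G}(u) + \eta, \qquad \eta \sim \mathcal{N}(0, \Gamma),
\]
\[
\frac{\mathrm{d}\mu^{y}}{\mathrm{d}\mu_{0}}(u) \;\propto\; \exp\bigl(-\Phi(u; y)\bigr),
\qquad
\Phi(u; y) = \tfrac{1}{2}\,\bigl\|\Gamma^{-1/2}\bigl(y - \mathcal{G}(u)\bigr)\bigr\|^{2}.
\]

Roughly speaking, lifting the additive-noise assumption means that the data model y = \mathcal{G}(u) + \eta is replaced by a genuinely stochastic forward map, so a misfit potential of this simple least-squares form is no longer directly available.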