Repository: Freie Universität Berlin, Math Department

Learning linear operators: Infinite-dimensional regression as a well-behaved non-compact inverse problem

Mollenhauer, Mattes and Mücke, Nicole and Sullivan, T.J. (2022) Learning linear operators: Infinite-dimensional regression as a well-behaved non-compact inverse problem. arXiv:2211.08875. (Submitted)

Full text not available from this repository.

Official URL: https://doi.org/10.48550/arXiv.2211.08875

Abstract

We consider the problem of learning a linear operator θ between two Hilbert spaces from empirical observations, which we interpret as least squares regression in infinite dimensions. We show that this goal can be reformulated as an inverse problem for θ with the undesirable feature that its forward operator is generally non-compact (even if θ is assumed to be compact or of p-Schatten class). However, we prove that, in terms of spectral properties and regularisation theory, this inverse problem is equivalent to the known compact inverse problem associated with scalar response regression. Our framework allows for the elegant derivation of dimension-free rates for generic learning algorithms under Hölder-type source conditions. The proofs rely on the combination of techniques from kernel regression with recent results on concentration of measure for sub-exponential Hilbertian random variables. The obtained rates hold for a variety of practically relevant scenarios in functional regression as well as nonlinear regression with operator-valued kernels and match those of classical kernel regression with scalar response.
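For orientation, a prototypical member of the class of regularised learning algorithms to which such rates apply is the Tikhonov (ridge) estimator built from empirical covariance operators. The following display is a sketch under assumed notation, not taken from the abstract: (x_i, y_i), i = 1, ..., n, denotes the sample in the two Hilbert spaces and λ > 0 a regularisation parameter.

\hat{\theta}_\lambda \;=\; \widehat{C}_{YX}\,\bigl(\widehat{C}_{XX} + \lambda\,\mathrm{Id}\bigr)^{-1},
\qquad
\widehat{C}_{XX} \;=\; \frac{1}{n}\sum_{i=1}^{n} x_i \otimes x_i,
\qquad
\widehat{C}_{YX} \;=\; \frac{1}{n}\sum_{i=1}^{n} y_i \otimes x_i,

where x_i \otimes x_i denotes the rank-one operator h \mapsto \langle x_i, h \rangle\, x_i. Hölder-type source conditions on θ then govern how λ should be chosen as a function of n to obtain dimension-free convergence rates.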

Item Type: Article
Divisions: Department of Mathematics and Computer Science > Institute of Mathematics > Deterministic and Stochastic PDEs Group
ID Code: 3095
Deposited By: Ulrike Eickers
Deposited On: 19 Feb 2024 13:55
Last Modified: 19 Feb 2024 13:55
