Li, Z., Meunier, D., Mollenhauer, M. and Gretton, A. (2023) Optimal Rates for Regularized Conditional Mean Embedding Learning. Advances in Neural Information Processing Systems (NeurIPS), 36. (Submitted)
Full text not available from this repository.
Official URL: https://doi.org/10.48550/arXiv.2208.01711
Abstract
We address the consistency of a kernel ridge regression estimate of the conditional mean embedding (CME), which is an embedding of the conditional distribution of Y given X into a target reproducing kernel Hilbert space H_Y. The CME allows us to take conditional expectations of target RKHS functions, and has been employed in nonparametric causal and Bayesian inference. We work in the misspecified setting, where the target CME lies in the space of Hilbert-Schmidt operators acting from an input interpolation space between H_X and L_2, to H_Y. This space of operators is shown to be isomorphic to a newly defined vector-valued interpolation space. Using this isomorphism, we derive a novel and adaptive statistical learning rate for the empirical CME estimator under the misspecified setting. Our analysis reveals that our rates match the optimal O(log n / n) rates without assuming H_Y to be finite dimensional. We further establish a lower bound on the learning rate, which shows that the obtained upper bound is optimal.
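The estimator studied in the abstract is the standard kernel ridge regression form of the empirical CME: conditional expectations of RKHS functions are computed as E[f(Y) | X = x] ≈ f(Y)ᵀ (K_X + nλI)⁻¹ k_X(x), where K_X is the input Gram matrix and λ the ridge regularizer. The following is a minimal sketch of that estimator, not the authors' code; the Gaussian kernel, the bandwidth `sigma`, and the regularization value are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between rows of A and B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def cme_conditional_expectation(X, Y, f, x_query, lam=1e-3, sigma=1.0):
    """Estimate E[f(Y) | X = x] at each query point via the regularized
    empirical CME: f(Y)^T (K_X + n*lam*I)^{-1} k_X(x)."""
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)        # K_X: n x n Gram matrix on inputs
    k = gaussian_kernel(X, x_query, sigma)  # k_X(x): n x m cross-kernel
    # Ridge weights beta(x) = (K_X + n*lam*I)^{-1} k_X(x)
    beta = np.linalg.solve(K + n * lam * np.eye(n), k)
    # Conditional expectation estimate: sum_i f(y_i) * beta_i(x)
    return f(Y) @ beta

# Illustrative usage on a noiseless toy problem where Y = X, so
# E[Y | X = x] = x and the estimate at x = 0.5 should be close to 0.5.
X = np.linspace(0.0, 1.0, 200)[:, None]
Y = X[:, 0]
est = cme_conditional_expectation(X, Y, lambda y: y,
                                  np.array([[0.5]]), lam=1e-4, sigma=0.2)
```

The choice of λ is exactly the regularization whose decay schedule the paper's learning rates govern; in practice it is tuned, while the analysis characterizes the rate at which it should shrink with n.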
Item Type: Article
Subjects: Mathematical and Computer Sciences > Mathematics > Applied Mathematics
Divisions: Department of Mathematics and Computer Science > Institute of Mathematics > Deterministic and Stochastic PDEs Group
ID Code: 3098
Deposited By: Ulrike Eickers
Deposited On: 19 Feb 2024 14:13
Last Modified: 19 Feb 2024 14:13