Richter, Lorenz and Sallandt, Leon and Nüsken, Nikolas (2023) From continuous-time formulations to discretization schemes: tensor trains and robust regression for BSDEs and parabolic PDEs. Preprint . (Unpublished)
Full text not available from this repository.
Official URL: https://doi.org/10.48550/arXiv.2307.15496
Abstract
The numerical approximation of partial differential equations (PDEs) poses formidable challenges in high dimensions, since classical grid-based methods suffer from the so-called curse of dimensionality. Recent attempts rely on a combination of Monte Carlo methods and variational formulations, using neural networks for function approximation. Extending previous work (Richter et al., 2021), we argue that tensor trains provide an appealing framework for parabolic PDEs: the combination of reformulations in terms of backward stochastic differential equations and regression-type methods holds the promise of leveraging latent low-rank structures, enabling both compression and efficient computation. Emphasizing a continuous-time viewpoint, we develop iterative schemes, which differ in terms of computational efficiency and robustness. We demonstrate both theoretically and numerically that our methods can achieve a favorable trade-off between accuracy and computational efficiency. While previous methods have been either accurate or fast, we have identified a novel numerical strategy that can often combine both of these aspects.
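To make the compression idea behind the abstract concrete, the following is a minimal illustrative sketch (not the authors' code) of the tensor-train (TT) format: a d-dimensional array is stored as a chain of 3-way cores, so that reading a single entry costs only a short product of small matrices rather than touching exponentially many values. All function names and the uniform-rank choice here are assumptions for illustration.

```python
import numpy as np

# A TT decomposition stores a tensor T of shape (n_1, ..., n_d) as cores
# G_k of shape (r_{k-1}, n_k, r_k) with boundary ranks r_0 = r_d = 1.
# Storage is O(d * n * r^2) instead of O(n^d), which is the low-rank
# compression the abstract refers to.

def tt_random(dims, rank, seed=0):
    """Random TT cores for mode sizes `dims` with a uniform internal rank."""
    rng = np.random.default_rng(seed)
    ranks = [1] + [rank] * (len(dims) - 1) + [1]
    return [rng.standard_normal((ranks[k], n, ranks[k + 1]))
            for k, n in enumerate(dims)]

def tt_entry(cores, index):
    """Entry T[i_1, ..., i_d] as a product of small rank x rank matrices."""
    v = np.ones((1, 1))
    for core, i in zip(cores, index):
        v = v @ core[:, i, :]
    return v[0, 0]

def tt_full(cores):
    """Densify the TT tensor (feasible only for tiny examples, for checking)."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.squeeze(axis=(0, -1))

cores = tt_random([3, 4, 5, 2], rank=2)
dense = tt_full(cores)
assert np.isclose(tt_entry(cores, (1, 2, 3, 0)), dense[1, 2, 3, 0])
```

In the regression-type schemes discussed in the paper, the cores play the role of trainable parameters of the value-function ansatz; this sketch only shows the storage and evaluation mechanics.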
| Item Type: | Article |
|---|---|
| Subjects: | Mathematical and Computer Sciences; Mathematical and Computer Sciences > Mathematics; Mathematical and Computer Sciences > Mathematics > Applied Mathematics |
| Divisions: | Department of Mathematics and Computer Science > Institute of Mathematics |
| ID Code: | 3046 |
| Deposited By: | Jana Jerosch |
| Deposited On: | 18 Jan 2024 11:44 |
| Last Modified: | 18 Jan 2024 11:44 |