Abstract.

For artificial deep neural networks, we prove expression rates for analytic functions \(f:\mathbb{R}^d\to\mathbb{R}\) in the norm of \(L^2(\mathbb{R}^d,\gamma_d)\), where \(d\in\mathbb{N}\cup\{\infty\}\) and \(\gamma_d\) denotes the Gaussian product probability measure on \(\mathbb{R}^d\). We consider in particular \(\mathrm{ReLU}\) and \(\mathrm{ReLU}^k\) activations for integer \(k\geq 2\). For \(d\in\mathbb{N}\), we show exponential convergence rates in \(L^2(\mathbb{R}^d,\gamma_d)\). In the case \(d=\infty\), under suitable smoothness and sparsity assumptions on \(f:\mathbb{R}^{\mathbb{N}}\to\mathbb{R}\), with \(\gamma_\infty\) denoting an infinite (Gaussian) product measure on \((\mathbb{R}^{\mathbb{N}},\mathcal{B}(\mathbb{R}^{\mathbb{N}}))\), we prove dimension-independent expression rate bounds in the norm of \(L^2(\mathbb{R}^{\mathbb{N}},\gamma_\infty)\). The rates depend only on the quantified holomorphy of (an analytic continuation of) the map \(f\) to a product of strips in \(\mathbb{C}^d\) (in \(\mathbb{C}^{\mathbb{N}}\) for \(d=\infty\), respectively). As an application, we prove expression rate bounds of deep \(\mathrm{ReLU}\)-NNs for response surfaces of elliptic PDEs with log-Gaussian random field inputs.
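The exponential rates for \(d\in\mathbb{N}\) are driven by the rapid decay of the Wiener–Hermite (polynomial chaos) coefficients of analytic \(f\), combined with the fact that \(\mathrm{ReLU}^k\) networks represent polynomials exactly (and deep \(\mathrm{ReLU}\) networks emulate them to exponential accuracy). The sketch below is a minimal numerical illustration of this coefficient decay for \(d=1\), not the paper's construction; the target \(f(x)=e^x\), the node count, and the truncation degrees are illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions, not the paper's construction):
# measure the L^2(R, gamma_1) error of truncated Wiener-Hermite expansions
# of an analytic target, as a proxy for the polynomial approximants that
# ReLU^k networks represent exactly. Target f(x) = exp(x) is illustrative.
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

f = np.exp

# Gauss-Hermite nodes/weights for the weight exp(-x^2/2); dividing the
# weights by sqrt(2*pi) turns quadrature sums into expectations under
# the standard Gaussian measure gamma_1.
x, w = hermegauss(200)
w = w / np.sqrt(2.0 * np.pi)

def coeff(n):
    """n-th coefficient of f against the orthonormal probabilists' Hermite
    polynomial He_n / sqrt(n!), i.e. c_n = E[f(X) He_n(X)] / sqrt(n!)."""
    e = np.zeros(n + 1)
    e[n] = 1.0
    return float(np.sum(w * f(x) * hermeval(x, e))) / math.sqrt(math.factorial(n))

# By Parseval, the squared L^2(gamma_1) truncation error at degree N is the
# sum of the squared coefficients beyond N (tail cut at degree 30 here,
# which is negligible for this entire target).
c = np.array([coeff(n) for n in range(31)])
for N in (2, 4, 8, 16):
    print(N, np.sqrt(np.sum(c[N + 1:] ** 2)))
# The printed errors decay faster than exponentially in N for this entire
# target, mirroring (for d = 1) the exponential rates stated above.
```

For \(f(x)=e^x\) the coefficients admit the closed form \(c_n=e^{1/2}/\sqrt{n!}\), so the printed values can be checked against \(\bigl(e\sum_{n>N}1/n!\bigr)^{1/2}\); for functions holomorphic only on a strip, the decay is exponential rather than superexponential.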

Keywords

  1. neural networks
  2. analytic functions
  3. expression rates
  4. Gaussian measures

MSC codes

  1. 41A25
  2. 41A46
  3. 68Q32
  4. 33C45

Acknowledgment.

This work was performed in part in the program “Mathematics of Deep Learning” (MDL) at the Isaac Newton Institute, Cambridge, UK, from July to December 2021. Fertile exchanges and stimulating workshops are warmly acknowledged.

Information & Authors

Published In

SIAM/ASA Journal on Uncertainty Quantification
Pages: 199–234
ISSN (online): 2166-2525

History

Submitted: 1 December 2021
Accepted: 6 September 2022
Published online: 1 March 2023

Authors

Affiliations

Christoph Schwab
Seminar for Applied Mathematics, ETH Zürich, 8092 Zürich, Switzerland.

Jakob Zech
Interdisziplinäres Zentrum für wissenschaftliches Rechnen, Universität Heidelberg, 69120 Heidelberg, Germany.
