Abstract

As an emerging paradigm in scientific machine learning, neural operators aim to learn, via neural networks, operators that map between infinite-dimensional function spaces. Several neural operators have been developed recently. However, all existing neural operators are designed to learn operators defined on a single Banach space; i.e., the input of the operator is a single function. Here, for the first time, we study operator regression via neural networks for multiple-input operators defined on the product of Banach spaces. We first prove a universal approximation theorem for continuous multiple-input operators. We also provide a detailed theoretical analysis, including the approximation error, which guides the design of the network architecture. Based on our theory and a low-rank approximation, we propose a novel neural operator, MIONet, for learning multiple-input operators. MIONet consists of several branch nets, which encode the input functions, and a trunk net, which encodes the domain of the output function. We demonstrate that MIONet can learn solution operators of systems governed by ordinary and partial differential equations. In our computational examples, we also show that MIONet can be endowed with prior knowledge of the underlying system, such as linearity and periodicity, to further improve accuracy.
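The architecture described in the abstract (several branch nets combined with a trunk net via a low-rank product) can be sketched in a few lines of NumPy. The sketch below is illustrative only: the two input functions, the sensor count, the layer widths, and the random untrained weights are assumptions for demonstration, not the configuration used in the paper. Each branch net maps a discretized input function to a p-dimensional embedding, the trunk net maps an output-domain coordinate y to a p-dimensional embedding, and the prediction is the sum over the latent dimension of the elementwise product of all embeddings, plus a bias.

```python
import numpy as np

def mlp(widths, rng):
    """Random (untrained) weights for a small fully connected net; illustrative only."""
    return [(rng.standard_normal((m, n)) / np.sqrt(m), np.zeros(n))
            for m, n in zip(widths[:-1], widths[1:])]

def forward(params, x):
    """Tanh MLP forward pass; the last layer is linear."""
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:
            x = np.tanh(x)
    return x

def mionet(branches, trunk, bias, us, y):
    """MIONet-style forward pass (sketch): take the Hadamard product of all
    branch embeddings, multiply by the trunk embedding, and sum over the
    latent (rank) dimension."""
    p = trunk[-1][0].shape[1]              # latent dimension
    prod = np.ones(p)
    for params, u in zip(branches, us):
        prod = prod * forward(params, u)   # elementwise product of branch outputs
    return np.sum(prod * forward(trunk, y)) + bias

rng = np.random.default_rng(0)
p, m = 16, 32                              # latent dimension; sensors per input function
branches = [mlp([m, 64, p], rng) for _ in range(2)]  # two input functions (assumed)
trunk = mlp([1, 64, p], rng)               # y is a scalar coordinate here
u1, u2 = rng.standard_normal(m), rng.standard_normal(m)
value = mionet(branches, trunk, 0.0, [u1, u2], np.array([0.5]))
```

In practice the weights would be trained by regression on operator data; the sketch only shows how the branch and trunk outputs are combined.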

Keywords

  1. operator regression
  2. multiple-input operators
  3. tensor product
  4. universal approximation theorem
  5. neural networks
  6. MIONet
  7. scientific machine learning

MSC codes

  1. 47-08
  2. 47H99
  3. 65D15
  4. 68Q32
  5. 68T07



Information & Authors

Published In

SIAM Journal on Scientific Computing
Pages: A3490 - A3514
ISSN (online): 1095-7197

History

Submitted: 14 February 2022
Accepted: 13 July 2022
Published online: 7 November 2022


Funding Information

Postdoctoral Research Foundation of China https://doi.org/10.13039/501100010031 : 2022M710005
U.S. Department of Energy https://doi.org/10.13039/100000015 : DE-SC0022953
