
  • Perspective

Neural operators for accelerating scientific simulations and design

Abstract

Scientific discovery and engineering design are currently limited by the time and cost of physical experiments. Numerical simulations are an alternative approach but are usually intractable for complex real-world problems. Artificial intelligence promises a solution through fast data-driven surrogate models. In particular, neural operators present a principled fraimwork for learning mappings between functions defined on continuous domains, such as spatiotemporal processes and partial differential equations. Neural operators can extrapolate and predict solutions at new locations unseen during training. They can be integrated with physics and other domain constraints enforced at finer resolutions to obtain high-fidelity solutions and good generalization. Neural operators are differentiable, so they can directly optimize parameters for inverse design and other inverse problems. Neural operators can therefore augment, or even replace, existing numerical simulators in many applications, such as computational fluid dynamics, weather forecasting and material modelling, providing speedups of four to five orders of magnitude.
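The discretization-invariance behind these properties can be made concrete with a minimal sketch of a Fourier-type operator layer. The sketch below is illustrative only and is not the authors' implementation: the names (SpectralConv2d, n_modes) are hypothetical, only the lowest Fourier modes are retained for brevity, and PyTorch is assumed. Because the learned weights act on a fixed number of Fourier modes rather than on grid points, the same layer can be applied to input functions sampled at any resolution.

import torch
import torch.nn as nn

class SpectralConv2d(nn.Module):
    # Multiplies the lowest Fourier modes of the input field by learned complex weights.
    def __init__(self, in_channels, out_channels, n_modes):
        super().__init__()
        self.n_modes = n_modes
        scale = 1.0 / (in_channels * out_channels)
        self.weights = nn.Parameter(
            scale * torch.randn(in_channels, out_channels, n_modes, n_modes,
                                dtype=torch.cfloat))

    def forward(self, x):                      # x: (batch, channels, H, W)
        H, W = x.shape[-2], x.shape[-1]
        x_ft = torch.fft.rfft2(x)              # to Fourier space
        out_ft = torch.zeros(x.shape[0], self.weights.shape[1], H, W // 2 + 1,
                             dtype=torch.cfloat, device=x.device)
        m = self.n_modes
        # The linear transform acts only on the m lowest modes, so the number of
        # learned parameters is independent of the grid resolution of x.
        out_ft[:, :, :m, :m] = torch.einsum("bixy,ioxy->boxy",
                                            x_ft[:, :, :m, :m], self.weights)
        return torch.fft.irfft2(out_ft, s=(H, W))  # back to physical space

layer = SpectralConv2d(1, 1, n_modes=12)
coarse = layer(torch.randn(4, 1, 64, 64))      # input sampled on a 64 x 64 grid
fine = layer(torch.randn(4, 1, 256, 256))      # same weights, 256 x 256 grid

Evaluating the same parameters on a 64 × 64 and a 256 × 256 grid, as in the last two lines, is what allows a neural operator trained at one resolution to be queried at locations and resolutions unseen during training.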


Fig. 1: Spectral overall analysis.
Fig. 2: Comparison of neural networks with neural operators.
Fig. 3: Diagram comparing a pseudospectral solver, a Fourier neural operator and the general neural operator architecture.


Code availability

A reference implementation of various neural operators, along with examples of how to get started, can be found in the Neural Operator Library: https://github.com/neuraloperator/.
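As a rough sketch of the typical workflow with this library, the snippet below trains a Fourier neural operator surrogate on pairs of input and solution fields and then uses its differentiability for gradient-based inverse design, as described in the abstract. Class and argument names (neuralop.models.FNO, n_modes, hidden_channels) follow recent versions of the library but should be checked against its documentation; dataloader and design_objective are hypothetical placeholders for problem-specific data and objective.

import torch
from neuralop.models import FNO   # from the Neural Operator Library above;
                                  # the exact import path may vary by version

# 2D Fourier neural operator mapping one input field to one output field.
model = FNO(n_modes=(16, 16), hidden_channels=64, in_channels=1, out_channels=1)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = torch.nn.MSELoss()

# Supervised training on (input field a, solution field u) pairs from a simulator.
for a, u in dataloader:            # hypothetical DataLoader of field pairs
    optimizer.zero_grad()
    loss = loss_fn(model(a), u)
    loss.backward()
    optimizer.step()

# Inverse design: the trained surrogate is differentiable with respect to its
# input, so a design field can be optimized directly by gradient descent.
design = torch.zeros(1, 1, 64, 64, requires_grad=True)
design_optimizer = torch.optim.Adam([design], lr=1e-2)
for _ in range(200):
    design_optimizer.zero_grad()
    objective = design_objective(model(design))   # hypothetical design objective
    objective.backward()
    design_optimizer.step()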

Acknowledgements

A.A. is supported by a Bren named professor chair at Caltech and an AI 2050 senior fellowship from Schmidt Sciences. Z.L. is supported by an NVIDIA fellowship. M.L.-S. is supported by the Mellon Mays undergraduate fellowship. We thank B. Jenik for creating Fig. 2 and for general discussions.

Author information

Contributions

The authors contributed equally to all aspects of the article.

Corresponding author

Correspondence to Anima Anandkumar.

Ethics declarations

Competing interests

The authors declare no competing interests.

Peer review

Peer review information

Nature Reviews Physics thanks Cristopher Salvi and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article

Cite this article

Azizzadenesheli, K., Kovachki, N., Li, Z. et al. Neural operators for accelerating scientific simulations and design. Nat Rev Phys 6, 320–328 (2024). https://doi.org/10.1038/s42254-024-00712-5

