SEPARABLE PHYSICS-INFORMED NEURAL NETWORKS FOR SOLVING ELASTICITY PROBLEMS
- Authors: V.A. Es'kin (1,2), D.V. Davydov (3,2), J.V. Gur'eva (2), A.O. Malkhanov (2), M.E. Smorkalov (2,4)
Affiliations:
- (1) University of Nizhny Novgorod
- (2) Huawei Nizhny Novgorod Research Center
- (3) Mechanical Engineering Research Institute of Russian Academy of Sciences
- (4) Skolkovo Institute of Science and Technology
- Issue: Vol 65, No 9 (2025)
- Pages: 1581-1596
- Section: Computer science
- URL: https://genescells.com/0044-4669/article/view/695400
- DOI: https://doi.org/10.31857/S0044466925090107
- ID: 695400
Abstract
A method for solving elasticity problems based on separable physics-informed neural networks (SPINN) combined with the deep energy method (DEM) is presented. Numerical experiments on a range of problems show that this method converges significantly faster and attains higher accuracy than vanilla physics-informed neural networks (PINN), and even than SPINN trained on the system of partial differential equations (PDEs). In addition, using SPINN within the DEM framework makes it possible to solve problems of the linear theory of elasticity on complex geometries, which is unachievable with PDE-based PINNs. The problems considered are close to industrial problems in terms of geometry, loading, and material parameters. Bibl. 61. Fig. 6. Tabl. 8.
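Two ideas from the abstract lend themselves to a compact illustration: the separable factorization, in which each spatial axis gets its own small network and the solution on an N_x × N_y grid is assembled from only N_x + N_y forward passes via an outer product, and the DEM objective, which replaces the PDE residual with the potential energy of the body. The numpy sketch below is an illustrative reconstruction, not the authors' implementation: the rank r, layer sizes, the single finite-difference strain component, and the material parameters are all assumptions made here for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

def init(sizes):
    """Random tanh-MLP parameters for one coordinate axis."""
    return [(rng.normal(0.0, 1.0 / np.sqrt(m), (m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def mlp(params, t):
    """t: (n, 1) points on one axis -> (n, r) feature functions."""
    h = t
    for W, b in params[:-1]:
        h = np.tanh(h @ W + b)
    W, b = params[-1]
    return h @ W + b

r = 16                       # rank of the separable representation (assumed)
net_x = init([1, 32, r])
net_y = init([1, 32, r])

# An N_x * N_y grid of collocation points costs only N_x + N_y forward passes.
x = np.linspace(0.0, 1.0, 64)[:, None]
y = np.linspace(0.0, 1.0, 64)[:, None]
fx = mlp(net_x, x)           # (64, r)
fy = mlp(net_y, y)           # (64, r)
u = fx @ fy.T                # (64, 64): u(x_i, y_j) = sum_k fx[i,k] * fy[j,k]

# DEM-style objective: instead of a PDE residual, accumulate strain energy
# by quadrature on the grid (one finite-difference strain kept for brevity).
E, nu = 1.0, 0.3             # illustrative material parameters
mu, lam = E / (2 * (1 + nu)), E * nu / ((1 + nu) * (1 - 2 * nu))
hx = x[1, 0] - x[0, 0]
eps_xx = np.gradient(u, hx, axis=0)
energy = 0.5 * (lam + 2 * mu) * np.mean(eps_xx ** 2)
```

Training would then minimize this energy, minus the work of the applied tractions, with respect to the per-axis network weights; the separable structure is what keeps the evaluation cost linear, rather than quadratic, in the per-axis resolution.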
About the authors
V.A. Es'kin
University of Nizhny Novgorod; Huawei Nizhny Novgorod Research Center
Author for correspondence.
Email: vasiliy.eskin@gmail.com
Nizhny Novgorod, Russia; Nizhny Novgorod, Russia
D.V. Davydov
Mechanical Engineering Research Institute of Russian Academy of Sciences; Huawei Nizhny Novgorod Research Center
Email: davidovdan274@yandex.ru
Nizhny Novgorod, Russia; Nizhny Novgorod, Russia
J.V. Gur'eva
Huawei Nizhny Novgorod Research Center
Email: gureva-yulya@list.ru
Nizhny Novgorod, Russia
A.O. Malkhanov
Huawei Nizhny Novgorod Research Center
Email: alexey.malkhanov@gmail.com
Nizhny Novgorod, Russia
M.E. Smorkalov
Huawei Nizhny Novgorod Research Center; Skolkovo Institute of Science and Technology
Email: smorkalovne@gmail.com
Nizhny Novgorod, Russia; Moscow, Russia
References
- L. Alzubaidi, J. Zhang, A.J. Humaidi, A.A. Dujaili, Y. Duan, O.A. Shamma, J. Santamaría, M.A. Fadhel, M.A. Amidie, and L. Farhan, Review of deep learning: concepts, CNN architectures, challenges, applications, future directions. Springer Inter. Publ., 2021. https://doi.org/10.1186/s40537-021-00444-8
- Y. LeCun, Y. Bengio, and G. Hinton, “Deep learning,” Nature 521(7553), 436–444 (2015). http://www.nature.com/articles/nature14539
- P. Linardatos, V. Papastefanopoulos, and S. Kotsiantis, “Explainable AI: A Review of Machine Learning Interpretability Methods,” Entropy 23(1) (2021). https://www.mdpi.com/1099-4300/23/1/18
- S. Khan, M. Naseer, M. Hayat, S. W. Zamir, F.S. Khan, and M. Shah, “Transformers in Vision: A Survey,” ACM Comput. Surv. 54(10) (2022). https://doi.org/10.1145/3505244
- A. Vaswani, S. Bengio, E. Brevdo, F. Chollet, A. N. Gomez, S. Gouws, L. Jones, L. Kaiser, N. Kalchbrenner, N. Parmar, R. Sepassi, N. Shazeer, and J. Uszkoreit, “Tensor2Tensor for Neural Machine Translation,” 2018. https://arxiv.org/abs/1803.07416
- Q. Wang, B. Li, T. Xiao, J. Zhu, C. Li, D. F. Wong, and L. S. Chao, “Learning Deep Transformer Models for Machine Translation,” 2019. https://arxiv.org/abs/1906.01787
- S. Yao and X. Wan, “Multimodal transformer for multimodal machine translation,” in Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Online: Association for Computational Linguistics, Jul. 2020, pp. 4346–4350. https://aclanthology.org/2020.acl-main.400
- A. Ramesh, M. Pavlov, G. Goh, S. Gray, C. Voss, A. Radford, M. Chen, and I. Sutskever, “Zero-Shot Text-to-Image Generation,” 2021. https://arxiv.org/abs/2102.12092
- S. Frolov, T. Hinz, F. Raue, J. Hees, and A. Dengel, “Adversarial text-to-image synthesis: A review,” Neural Networks 144, 187–209 (2021). https://www.sciencedirect.com/science/article/pii/S0893608021002823
- D. Silver, T. Hubert, J. Schrittwieser, I. Antonoglou, M. Lai, A. Guez, M. Lanctot, L. Sifre, D. Kumaran, T. Graepel, T. Lillicrap, K. Simonyan, and D. Hassabis, “A general reinforcement learning algorithm that masters chess, shogi, and Go through self-play,” Science 362(6419), 1140–1144 (2018). https://www.science.org/doi/abs/10.1126/science.aar6404
- J. Schrittwieser, I. Antonoglou, T. Hubert, K. Simonyan, L. Sifre, S. Schmitt, A. Guez, E. Lockhart, D. Hassabis, T. Graepel, T. Lillicrap, and D. Silver, “Mastering Atari, Go, chess and shogi by planning with a learned model,” Nature 588(7839), 604–609 (2020). http://www.nature.com/articles/s41586-020-03051-4
- L. Ouyang, J. Wu, X. Jiang, D. Almeida, C. L. Wainwright, P. Mishkin, C. Zhang, S. Agarwal, K. Slama, A. Ray, J. Schulman, J. Hilton, F. Kelton, L. Miller, M. Simens, A. Askell, P. Welinder, P. Christiano, J. Leike, and R. Lowe, “Training language models to follow instructions with human feedback,” 2022. https://arxiv.org/abs/2203.02155
- M. Raissi, P. Perdikaris, and G. E. Karniadakis, “Physics-informed neural networks: A deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations,” J. Comput. Phys. 378, 686–707 (2019). https://linkinghub.elsevier.com/retrieve/pii/S0021999118307125
- S. Wang, S. Sankaran, and P. Perdikaris, “Respecting causality is all you need for training physics-informed neural networks,” 2022, arXiv:2203.07404. http://arxiv.org/abs/2203.07404
- H. Wang, X. Qian, Y. Sun, and S. Song, “A Modified Physics Informed Neural Networks for Solving the Partial Differential Equation with Conservation Laws,” https://ssrn.com/abstract=4274376
- V.A. Es’kin, D.V. Davydov, E.D. Egorova, A.O. Malkhanov, M.A. Akhukov, and M.E. Smorkalov, “About Modifications of the Loss Function for the Causal Training of Physics-Informed Neural Networks,” Doklady Mathematics 110(S1), S172–S192 (2024). https://link.springer.com/10.1134/S106456242460194X
- S. Wang, S. Sankaran, H. Wang, and P. Perdikaris, “An expert’s guide to training physics-informed neural networks,” 2023, arXiv:2308.08468.
- L. Lu, P. Jin, G. Pang, Z. Zhang, and G.E. Karniadakis, “Learning nonlinear operators via DeepONet based on the universal approximation theorem of operators,” Nature Machine Intelligence 3(3), 218–229 (2021). https://doi.org/10.1038/s42256-021-00302-5
- Z. Li, D. Z. Huang, B. Liu, and A. Anandkumar, “Fourier Neural Operator with Learned Deformations for PDEs on General Geometries,” 2022, arXiv:2207.05209. https://arxiv.org/abs/2207.05209
- M.A. Krinitskiy, V.M. Stepanenko, A.O. Malkhanov, and M.E. Smorkalov, “A General Neural-Networks-Based Method for Identification of Partial Differential Equations, Implemented on a Novel AI Accelerator,” Supercomputing Frontiers and Innovations 9(3) (2022). https://superfri.org/index.php/superfri/article/view/439
- V. Fanaskov and I. Oseledets, “Spectral Neural Operators,” 2022, arXiv:2205.10573. https://arxiv.org/abs/2205.10573
- O. Ovadia, A. Kahana, P. Stinis, E. Turkel, and G.E. Karniadakis, “ViTO: Vision Transformer-Operator,” Mar. 2023, arXiv:2303.08891. http://arxiv.org/abs/2303.08891
- H. Jin, E. Zhang, B. Zhang, S. Krishnaswamy, G.E. Karniadakis, and H.D. Espinosa, “Mechanical characterization and inverse design of stochastic architected metamaterials using neural operators,” 2023, arXiv:2311.13812.
- M. Raissi, P. Perdikaris, and G.E. Karniadakis, “Physics Informed Deep Learning (Part I): Data-driven Solutions of Nonlinear Partial Differential Equations,” Part I, 1–22, arXiv:1711.10561v1. https://arxiv.org/abs/1711.10561
- M. Raissi, P. Perdikaris, and G.E. Karniadakis, “Physics Informed Deep Learning (Part II): Data-driven Discovery of Nonlinear Partial Differential Equations,” Part II, 1–19, arXiv:1711.10566v1. https://arxiv.org/abs/1711.10566
- S. Cai, Z. Wang, F. Fuest, Y.J. Jeon, C. Gray, and G.E. Karniadakis, “Flow over an espresso cup: inferring 3-d velocity and pressure fields from tomographic background oriented schlieren via physics-informed neural networks,” J. Fluid Mech. 915 (2021). http://dx.doi.org/10.1017/jfm.2021.135
- D.C. Psichogios and L.H. Ungar, “A hybrid neural network-first principles approach to process modeling,” AIChE Journal 38, 1499–1511 (1992).
- I. Lagaris, A. Likas, and D. Fotiadis, “Artificial neural networks for solving ordinary and partial differential equations,” IEEE Transactions on Neural Networks 9(5), 987–1000 (1998). https://doi.org/10.1109/72.712178
- C. Rackauckas, Y. Ma, J. Martensen, C. Warner, K. Zubov, R. Supekar, D. Skinner, Ramadhan, and A. Edelman, “Universal Differential Equations for Scientific Machine Learning,” 1–55, Jan. 2020, arXiv:2001.04385. http://arxiv.org/abs/2001.04385
- L. Yuan, Y.-Q. Ni, X.-Y. Deng, and S. Hao, “A-PINN: Auxiliary physics informed neural networks for forward and inverse problems of nonlinear integro-differential equations,” J. Comput. Phys. 462, 111260 (2022). https://doi.org/10.1016/j.jcp.2022.111260
- X. Jin, S. Cai, H. Li, and G.E. Karniadakis, “NSFnets (Navier-Stokes flow nets): Physics-informed neural networks for the incompressible Navier-Stokes equations,” J. Comput. Phys. 426, 109951 (2021). https://www.sciencedirect.com/science/article/pii/S0021999120307257
- L. Zhao, Z. Li, Z. Wang, B. Caswell, J. Ouyang, and G.E. Karniadakis, “Active- and transfer-learning applied to microscale-macroscale coupling to simulate viscoelastic flows,” J. Comput. Phys. 427, 110069 (2021), arXiv:2005.04382. https://doi.org/10.1016/j.jcp.2020.110069
- E. Kharazmi, Z. Zhang, and G.E. Karniadakis, “hp-VPINNs: Variational physics-informed neural networks with domain decomposition,” Comput. Meth. in Appl. Mech. and Engineer. 374, 113547 (2021), arXiv:2003.05385. https://doi.org/10.1016/j.cma.2020.113547
- L. Yang, X. Meng, and G.E. Karniadakis, “B-PINNs: Bayesian physics-informed neural networks for forward and inverse PDE problems with noisy data,” J. Comput. Phys. 425, 109913 (2021), arXiv:2003.06097. https://doi.org/10.1016/j.jcp.2020.109913
- S. Cuomo, V.S. di Cola, F. Giampaolo, G. Rozza, M. Raissi, and F. Piccialli, “Scientific Machine Learning through Physics-Informed Neural Networks: Where we are and What’s next,” Jan. 2022, arXiv:2201.05624. http://arxiv.org/abs/2201.05624
- G. Pang, M. D’Elia, M. Parks, and G.E. Karniadakis, “nPINNs: Nonlocal physics-informed neural networks for a parametrized nonlocal universal Laplacian operator. Algorithms and applications,” J. Comput. Phys. 422, 109760 (2020), arXiv:2004.04276. https://doi.org/10.1016/j.jcp.2020.109760
- R.G. Patel, I. Manickam, N.A. Trask, M.A. Wood, M. Lee, I. Tomas, and E.C. Cyr, “Thermodynamically consistent physics-informed neural networks for hyperbolic systems,” J. Comput. Phys. 449, 110754 (2022), arXiv:2012.05343. https://doi.org/10.1016/j.jcp.2021.110754 https://linkinghub.elsevier.com/retrieve/pii/S0021999121006495
- B. Liu, Y. Wang, T. Rabczuk, T. Olofsson, and W. Lu, “Multi-scale modeling in thermal conductivity of polyurethane incorporated with phase change materials using physics-informed neural networks,” Renewable Energy 220, 119565 (2024). https://www.sciencedirect.com/science/article/pii/S0960148123014805
- S. Cai, Z. Mao, Z. Wang, M. Yin, and G.E. Karniadakis, “Physics-informed neural networks (PINNs) for fluid mechanics: a review,” Acta Mechanica Sinica 37(12), 1727–1738 (2021). https://link.springer.com/10.1007/s10409-021-01148-1
- G. Lin, P. Hu, F. Chen, X. Chen, J. Chen, J. Wang, and Z. Shi, “BINet: Learning to Solve Partial Differential Equations with Boundary Integral Networks,” 1–27, Oct. 2021, arXiv:2110.00352. http://arxiv.org/abs/2110.00352
- Q. He, D. Barajas-Solano, G. Tartakovsky, and A.M. Tartakovsky, “Physics-informed neural networks for multiphysics data assimilation with application to subsurface transport,” Advances in Water Resources 141, 103610 (2020). https://linkinghub.elsevier.com/retrieve/pii/S0309170819311649
- E. Haghighat, M. Raissi, A. Moure, H. Gomez, and R. Juanes, “A physics-informed deep learning framework for inversion and surrogate modeling in solid mechanics,” Comput. Meth. in Appl. Mech. and Engineer. 379, 113741 (2021). https://www.sciencedirect.com/science/article/pii/S0045782521000773
- Z. Meng, Q. Qian, M. Xu, B. Yu, A.R. Yildiz, and S. Mirjalili, “PINN-FORM: A new physics-informed neural network for reliability analysis with partial differential equation,” Comput. Meth. in Appl. Mech. and Engineer. 414, 116172 (2023). https://www.sciencedirect.com/science/article/pii/S0045782523002967
- E. Samaniego, C. Anitescu, S. Goswami, V. Nguyen-Thanh, H. Guo, K. Hamdia, X. Zhuang, and T. Rabczuk, “An energy approach to the solution of partial differential equations in computational mechanics via machine learning: Concepts, implementation and applications,” Comput. Meth. in Appl. Mech. and Engineer. 362, 112790 (2020). https://www.sciencedirect.com/science/article/pii/S0045782519306826
- L. Ning, Z. Cai, H. Dong, Y. Liu, and W. Wang, “A peridynamic-informed neural network for continuum elastic displacement characterization,” Comput. Meth. in Appl. Mech. and Engineer. 407, 115909 (2023). https://www.sciencedirect.com/science/article/pii/S0045782523000324
- W. Hao, L. Tan, X. Yang, D. Shi, M. Wang, G. Miao, and Y. Fan, “A physics-informed machine learning approach for notch fatigue evaluation of alloys used in aerospace,” International Journal of Fatigue 170, 107536 (2023). https://www.sciencedirect.com/science/article/pii/S0142112323000373
- B. Moseley, A. Markham, and T. Nissen-Meyer, “Solving the wave equation with physics-informed deep learning,” Jun. 2020, arXiv:2006.11894. http://arxiv.org/abs/2006.11894
- L. Ning, Z. Cai, H. Dong, Y. Liu, and W. Wang, “Physics-informed neural network frameworks for crack simulation based on minimized peridynamic potential energy,” Comput. Meth. in Appl. Mech. and Engineer. 417, 116430 (2023). https://www.sciencedirect.com/science/article/pii/S0045782523005546
- A. Harandi, A. Moeineddin, M. Kaliske, S. Reese, and S. Rezaei, “Mixed formulation of physics-informed neural networks for thermo-mechanically coupled systems and heterogeneous domains,” Inter. J. Numeric. Meth. Engineer., Nov. 2023. http://dx.doi.org/10.1002/nme.7388
- J. Cho, S. Nam, H. Yang, S.-B. Yun, Y. Hong, and E. Park, “Separable physics-informed neural networks,” 2023, arXiv:2306.15969.
- L.D. Landau and E.M. Lifshitz, Theory of Elasticity. Volume 7 of Course of Theoretical Physics, 3rd ed. Pergamon Press, 1986.
- K. Hornik, M. Stinchcombe, and H. White, “Multilayer feedforward networks are universal approximators,” Neural Networks 2(5), 359–366 (1989). https://www.sciencedirect.com/science/article/pii/0893608089900208
- A. Griewank and A. Walther, Evaluating Derivatives, 2nd ed. Society for Industrial and Applied Mathematics, 2008. https://epubs.siam.org/doi/abs/10.1137/1.9780898717761
- S. Wang, Y. Teng, and P. Perdikaris, “Understanding and Mitigating Gradient Flow Pathologies in Physics-Informed Neural Networks,” SIAM J. Sci. Comput. 43(5), A3055–A3081 (2021). https://doi.org/10.1137/20M1318043
- S. Wang, X. Yu, and P. Perdikaris, “When and why PINNs fail to train: A neural tangent kernel perspective,” J. Comput. Phys. 449, 110768 (2022). https://www.sciencedirect.com/science/article/pii/S002199912100663X
- A.L. Caterini and D.E. Chang, Generic Representation of Neural Networks. Cham: Springer Inter. Publ., 2018, pp. 23–34. https://doi.org/10.1007/978-3-319-75304-1_3
- S. Mishra and R. Molinaro, “Estimates on the generalization error of physics-informed neural networks for approximating a class of inverse problems for PDEs,” IMA Journal of Numerical Analysis 42(2), 981–1022 (2021). https://doi.org/10.1093/imanum/drab032
- J. Cho, S. Nam, H. Yang, S.-B. Yun, Y. Hong, and E. Park, “Separable physics-informed neural networks,” Advances in Neural Information Processing Systems, 2023.
- A. Paszke, S. Gross, F. Massa, A. Lerer, J. Bradbury, G. Chanan, T. Killeen, Z. Lin, N. Gimelshein, L. Antiga, A. Desmaison, A. Köpf, E. Yang, Z. DeVito, M. Raison, A. Tejani, S. Chilamkurthy, B. Steiner, L. Fang, J. Bai, and S. Chintala, PyTorch: An Imperative Style, High-Performance Deep Learning Library. Red Hook, NY, USA: Curran Associates Inc., 2019.
- “NVIDIA Modulus v22.09 linear elasticity,” https://docs.nvidia.com/deeplearning/modulus/modulus-v2209/user_guide/foundational/linear_elasticity.html, accessed: 2023-11-21.
- D.P. Kingma and J. Ba, “Adam: A Method for Stochastic Optimization,” 2014. https://arxiv.org/abs/1412.6980