Academic Article

Polynomial differentiation decreases the training time complexity of physics-informed neural networks and strengthens their approximation power
Document Type
article
Source
Machine Learning: Science and Technology, Vol 4, Iss 4, p 045005 (2023)
Subject
SC-PINNs
physics informed neural networks
gradient flow
Sobolev losses
Gauss–Legendre quadratures
polynomial differentiation
Computer engineering. Computer hardware
TK7885-7895
Electronic computers. Computer science
QA75.5-76.95
Language
English
ISSN
2632-2153
Abstract
We present novel approximations of variational losses applicable to the training of physics-informed neural networks (PINNs). The formulations reflect classic Sobolev space theory for partial differential equations (PDEs) and their weak formulations. The loss approximations rest on polynomial differentiation realised by an extension of classic Gauss–Legendre cubatures, which we term Sobolev cubatures, and serve as a replacement for automatic differentiation. We prove that the training time complexity of the resulting Sobolev-PINNs with polynomial differentiation is lower than that of PINNs relying on automatic differentiation. In addition to a one-to-two-order-of-magnitude speed-up, the Sobolev-PINNs are demonstrated to achieve closer solution approximations than established PINNs for prominent forward and inverse, linear and non-linear PDE problems.
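For intuition only, the minimal Python sketch below (not the authors' implementation; the function names gauss_legendre_diff_matrix and sobolev_h1_loss are illustrative, and a one-dimensional H^1-type loss on [-1, 1] is assumed) shows how polynomial differentiation on Gauss–Legendre nodes can stand in for automatic differentiation when assembling a Sobolev-type loss from residual values.

# Illustrative sketch: polynomial differentiation on Gauss–Legendre nodes
# used to assemble a discrete H^1 (Sobolev-type) loss of a residual r(x).
import numpy as np

def gauss_legendre_diff_matrix(n):
    """Nodes, weights, and Lagrange differentiation matrix on [-1, 1]."""
    x, w = np.polynomial.legendre.leggauss(n)   # Gauss–Legendre nodes and weights
    diff = x[:, None] - x[None, :]
    np.fill_diagonal(diff, 1.0)                 # avoid division by zero below
    c = 1.0 / diff.prod(axis=1)                 # barycentric weights of the nodes
    D = (c[None, :] / c[:, None]) / diff        # off-diagonal entries D_ij
    np.fill_diagonal(D, 0.0)
    np.fill_diagonal(D, -D.sum(axis=1))         # rows sum to zero (constants -> 0)
    return x, w, D

def sobolev_h1_loss(r_nodes, w, D):
    """Quadrature approximation of ||r||_{L2}^2 + ||r'||_{L2}^2 on [-1, 1]."""
    dr = D @ r_nodes                            # derivative of the interpolating polynomial
    return float(w @ (r_nodes**2) + w @ (dr**2))

if __name__ == "__main__":
    x, w, D = gauss_legendre_diff_matrix(16)
    r = np.sin(np.pi * x)                       # example residual values at the nodes
    print(sobolev_h1_loss(r, w, D))             # ≈ 1 + pi^2 ≈ 10.87 for this residual

The differentiation matrix is exact for polynomials up to the interpolation degree, so derivatives entering the loss are obtained by a single matrix-vector product rather than by backpropagation through the network with respect to its inputs; this is the kind of trade-off the abstract's complexity claim refers to.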