Time: 2022-12-21, 14:45–15:30
Neural network-based numerical methods for differential equations have been widely developed in recent years. In this work, we study deep neural networks (DNNs) for solving high-dimensional evolution equations with oscillatory solutions. Unlike existing methods (e.g., PINNs) that treat the time and space variables simultaneously, we propose a deep adaptive basis Galerkin (DABG) method, which adopts a spectral-Galerkin method with a tensor-product basis in the time variable, suited to oscillatory solutions, and a network-based approximation in the high-dimensional space variables. The proposed method leads to a linear system of differential equations whose unknowns are DNNs, which can be trained via a loss function. We establish estimates of the solution error, showing that if the true solution is a Barron-type function, the error bound converges to zero as M = O(N^p) tends to infinity, where M is the width of the networks used and p is a positive constant. Numerical examples, including high-dimensional linear parabolic and hyperbolic equations and a nonlinear Allen–Cahn equation, demonstrate that the proposed DABG method outperforms the existing methods it is compared with.
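To make the tensor-product structure described above concrete, the following is a minimal PyTorch sketch (not the authors' implementation) of the underlying ansatz: the solution is written as u(t, x) ≈ Σ_n φ_n(t) c_n(x), where the φ_n are Legendre polynomials in time and the coefficients c_n are the outputs of a single spatial DNN. The heat equation, domain, initial condition, collocation-type residual loss, and all names (e.g., SpatialCoefficientNet) are illustrative assumptions; the DABG method itself uses a spectral-Galerkin formulation in time leading to a linear system, which this sketch simplifies.

```python
# Illustrative sketch only: tensor-product ansatz u(t, x) = sum_n phi_n(t) * c_n(x),
# with Legendre polynomials phi_n in time and a DNN producing the spatial
# coefficients c_n(x). Equation, loss, and hyperparameters are assumptions.
import math
import torch
import torch.nn as nn

d, N, M = 4, 8, 64  # space dimension, number of temporal basis functions, network width


def legendre_basis(t, N):
    """Evaluate Legendre polynomials P_0..P_{N-1} at t in [-1, 1] via the recurrence."""
    P = [torch.ones_like(t), t]
    for n in range(1, N - 1):
        P.append(((2 * n + 1) * t * P[-1] - n * P[-2]) / (n + 1))
    return torch.cat(P[:N], dim=-1)  # shape (batch, N)


class SpatialCoefficientNet(nn.Module):
    """DNN mapping x in R^d to the N temporal-basis coefficients c_0(x), ..., c_{N-1}(x)."""
    def __init__(self, d, N, M):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(d, M), nn.Tanh(),
            nn.Linear(M, M), nn.Tanh(),
            nn.Linear(M, N),
        )

    def forward(self, x):
        return self.net(x)  # shape (batch, N)


def u(model, t, x):
    """Tensor-product ansatz: u(t, x) = sum_n phi_n(t) c_n(x)."""
    return (legendre_basis(t, N) * model(x)).sum(dim=-1, keepdim=True)


model = SpatialCoefficientNet(d, N, M)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(1000):
    # Random collocation points in (t, x); a Galerkin treatment of time would instead
    # integrate the residual against the basis functions with quadrature.
    x = torch.rand(256, d, requires_grad=True)
    t = (2 * torch.rand(256, 1) - 1).requires_grad_(True)

    uval = u(model, t, x)
    u_t = torch.autograd.grad(uval.sum(), t, create_graph=True)[0]
    grad_x = torch.autograd.grad(uval.sum(), x, create_graph=True)[0]
    lap = sum(
        torch.autograd.grad(grad_x[:, i].sum(), x, create_graph=True)[0][:, i:i + 1]
        for i in range(d)
    )

    # Assumed model problem: heat equation u_t = Laplacian(u) on [0, 1]^d, t in [-1, 1],
    # with an illustrative initial condition u(-1, x) = prod_i sin(pi x_i).
    residual = (u_t - lap).pow(2).mean()
    u0 = u(model, -torch.ones_like(t), x)
    init = (u0 - torch.sin(math.pi * x).prod(dim=1, keepdim=True)).pow(2).mean()
    loss = residual + init

    opt.zero_grad()
    loss.backward()
    opt.step()
```

In this separable form the time dependence is carried entirely by the fixed spectral basis, so only the spatial coefficient network needs to be trained, which is the feature the abstract contrasts with methods such as PINNs that approximate time and space jointly.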
About the speaker: The speaker graduated from the School of Mathematical Sciences at Zhejiang University in 2012, and then pursued master's and doctoral degrees at the University of Washington and Purdue University in the United States, under the supervision of Prof. Jie Shen (沈捷). After a postdoctoral position at the National University of Singapore, the speaker is currently a postdoctoral fellow at the University of Hong Kong, with research interests in numerical methods for differential equations and deep learning methods in applied mathematics.