
Neural Spectral Methods: Self-supervised learning in the spectral domain


Scientific Achievement

We develop Neural Spectral Methods (NSM), a neural network (NN) technique for solving PDEs that is grounded in classical spectral methods. On multiple problems, NSM outperforms previous ML methods by one to two orders of magnitude in both speed and accuracy.

Significance and Impact

Partial differential equations (PDEs) are crucial for describing complex phenomena in climate dynamics and numerous other energy-related areas. NNs can approximate solutions to such systems much faster than numerical solvers, but current approaches suffer from accuracy and efficiency issues. With our method, computational cost stays constant regardless of mesh discretization, while accuracy is retained.

Technical Approach

  • We use orthogonal bases to learn PDE solutions as mappings between spectral coefficients, instantiating a spectral-based neural operator.
  • We introduce a spectral loss using Parseval’s identity, which substantially decreases training complexity.
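The second bullet can be illustrated with a minimal NumPy sketch (not the paper's implementation; variable names are illustrative). Parseval's identity says that, in an orthonormal basis, the L2 norm of a function equals the sum of its squared spectral coefficients, so a PDE residual loss can be evaluated directly on coefficients without any quadrature over a physical grid:

```python
import numpy as np

def spectral_loss(residual_coeffs):
    # Parseval: ||r||^2_{L2} equals the sum of squared coefficients
    # in an orthonormal basis, so the loss is computed entirely in
    # spectral space, with no physical-space quadrature grid.
    return np.sum(np.abs(residual_coeffs) ** 2)

# Sanity check against the physical-space discrete L2 norm,
# using an orthonormal (unitary) Fourier basis.
n = 64
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
r = np.sin(3 * x) + 0.5 * np.cos(5 * x)   # an example residual field
coeffs = np.fft.fft(r) / np.sqrt(n)       # unitary DFT scaling
loss_spectral = spectral_loss(coeffs)
loss_physical = np.sum(r ** 2)            # discrete L2 norm on the grid
assert np.allclose(loss_spectral, loss_physical)
```

The same identity holds for other orthonormal bases (e.g. normalized Chebyshev or Legendre expansions), which is what lets training proceed on spectral coefficients at a cost independent of the grid resolution.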

Schematic of our method, which trains a NN entirely in spectral space, providing greater expressivity and accuracy. This is in contrast to current ML approaches, which train only in physical space.

Example results on a reaction-diffusion problem, plotting accuracy against speed for different grid sizes: we compare a numerical solver, FNO (a previous ML method), and our method (NSM). NSM maintains constant speed and accuracy regardless of discretization.

PI(s)/Facility Lead(s): Lenny Oliker (LBL)

Collaborating Institutions: UC Berkeley, LBNL

ASCR Program: SciDAC RAPIDS2

ASCR PM: Kalyan Perumalla (SciDAC RAPIDS2)

Publication: Y. Du, N. Chalapathi, A. S. Krishnapriyan, "Neural Spectral Methods: Self-supervised learning in the spectral domain," International Conference on Learning Representations (2024)