Scaling physics-informed hard constraints with mixture-of-experts
Scientific Achievement
We scale PDE-constrained optimization using mixture-of-experts to enforce differential equation constraints in neural networks (NNs) to a high degree of accuracy. Our scaled approach is significantly faster than prior hard-constraint methods and can solve much more challenging problems.
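As a minimal illustration of hard constraint enforcement (not the paper's exact formulation), a differentiable layer can correct an NN's output so that a linear, discretized PDE constraint A u = b holds exactly, via a least-squares projection onto the constraint set. All names and shapes below are hypothetical stand-ins:

```python
import numpy as np

def project_onto_constraint(u, A, b):
    """Minimally correct u so that A @ u == b holds exactly
    (least-squares projection onto the affine constraint set)."""
    residual = A @ u - b
    correction = A.T @ np.linalg.solve(A @ A.T, residual)
    return u - correction

rng = np.random.default_rng(0)
A = rng.normal(size=(3, 8))   # hypothetical discretized linear PDE operator
b = rng.normal(size=3)        # constraint right-hand side
u = rng.normal(size=8)        # stand-in for an unconstrained NN prediction
u_hard = project_onto_constraint(u, A, b)
print(np.max(np.abs(A @ u_hard - b)))  # ≈ 0: constraint satisfied exactly
```

Because every step is differentiable, gradients can flow through the correction during training; this is what distinguishes hard constraints from approximate (penalty-based) physics-informed losses.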
Significance and Impact
Partial differential equations (PDEs) are crucial for describing complex phenomena in climate dynamics and numerous other energy-related areas. NNs can approximate solutions to such systems much faster than numerical methods, but current approaches enforce physical constraints only approximately and are less accurate. We address this key problem with our method and scale it to handle larger, much more complex systems.
Technical Approach
Schematic of our method: a scaled differentiable layer that can be added on top of any NN architecture. We scale the differentiable layer using mixture-of-experts, enabling larger and finer mesh discretizations for solving spatiotemporal problems.
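The scaling idea can be sketched as follows: rather than one large constraint solve over the full mesh, each "expert" enforces its local discretized constraint on its own mesh partition, so the solves are small and independent. This is a hedged toy sketch (hypothetical partitioning and operators), not the paper's implementation:

```python
import numpy as np

def project(u, A, b):
    """Exactly satisfy A @ u == b via a least-squares correction."""
    return u - A.T @ np.linalg.solve(A @ A.T, A @ u - b)

def moe_hard_constraints(u, partitions):
    """Each expert enforces its local constraint on its own mesh
    partition; many small independent solves replace one large one."""
    out = u.copy()
    for idx, A, b in partitions:   # idx: this expert's mesh-point indices
        out[idx] = project(u[idx], A, b)
    return out

rng = np.random.default_rng(1)
u = rng.normal(size=16)            # stand-in NN output over a 16-point mesh
partitions = []
for lo in (0, 8):                  # two experts, 8 mesh points each
    idx = np.arange(lo, lo + 8)
    A = rng.normal(size=(3, 8))    # hypothetical local constraint operator
    partitions.append((idx, A, A @ rng.normal(size=8)))
u_hard = moe_hard_constraints(u, partitions)
for idx, A, b in partitions:
    print(np.max(np.abs(A @ u_hard[idx] - b)))  # each ≈ 0
```

Since each expert's solve scales with its partition size rather than the full mesh, the per-step cost drops and the partitions can be processed in parallel, which is what enables finer discretizations.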
PI(s)/Facility Lead(s): Lenny Oliker (LBL)
Collaborating Institutions: UC Berkeley, LBNL
ASCR Program: SciDAC RAPIDS2
ASCR PM: Kalyan Perumalla (SciDAC RAPIDS2)
Publication: N. Chalapathi, Y. Du, A. S. Krishnapriyan, "Scaling physics-informed hard constraints with mixture-of-experts," International Conference on Learning Representations (ICLR), 2024.