1 of 27

MoriNet: A Machine Learning-based Mori-Zwanzig Perspective on Weakly Compressible SPH

Rene Winchenbach

Technical University of Munich

2 of 27

Machine Learning and you

Take your simulations

Turn them into linear steps

Feed it all the data

→ Solve Navier-Stokes!

… right?


[Diagram: a neural network (matrix multiplications) is “trained” into a general solution for everything, everywhere; training data simulated with diffSPH (our differentiable solver)]

3 of 27

The Problem

The classic Machine Learning Task:

  • Learn an emulator that replaces a solver…
  • …a difficult task that does not always succeed
  • Just feed it more information for better results…
  • …but why, and when, does this work?

Start with some random flow data for training

The specific neural network architecture is not important here


4 of 27

Weak Compressibility in SPH
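For reference, the commonly used weakly compressible closure is the Tait/Cole equation of state (given here as a generic textbook form, not necessarily the exact one on this slide): pressure is computed directly from density, with an artificial speed of sound c_0 chosen large enough that density deviations stay around 1%.

p_i \;=\; \frac{\rho_0\, c_0^2}{\gamma} \left[ \left( \frac{\rho_i}{\rho_0} \right)^{\gamma} - 1 \right], \qquad \gamma \approx 7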

 


5 of 27

SPH Density: The summation approach
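The summation (classic) density estimate: the density of particle i is a kernel-weighted sum over the masses of its neighbors, and is therefore a pure function of the current particle positions.

\rho_i \;=\; \sum_j m_j\, W\!\left(x_i - x_j, h\right)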

 


6 of 27

SPH Density: The modern approach
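The "modern" approach instead evolves density as its own state variable by discretizing the continuity equation and integrating it in time (a common SPH form; the slide may use a slightly different discretization):

\frac{\mathrm{d}\rho_i}{\mathrm{d}t} \;=\; \sum_j m_j \left( v_i - v_j \right) \cdot \nabla_i W_{ij}

Density now depends on the history of the flow, not only on the current positions, which is exactly what makes it "hidden information" in the following slides.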

 


7 of 27

Density as Hidden Information

“Classic” SPH: positions define the density

“Modern” SPH: density is an independent state variable

Two options:

  1. Provide density as an input feature …
    • … and keep track of it
    • … and learn to integrate it over time

  2. Ignore density as an input feature

The second approach is the Machine Learning way

But can this work?


8 of 27

Density as Hidden Information 2

We can set up a simple experiment:

  • Input: Particle Positions at time t
  • Learning Task: Learn the density field

Result:

  • Exact solution! (but only for the first frame)
  • In general, the network only learns to output the rest density
  • The rest density is the output that minimizes the RMSE of the density field over our simulations
  • For “modern” SPH the network fails to learn the task
  • It can sometimes overfit to single steps, but does not generalize (a sketch of this experiment follows below)
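A minimal sketch of this experiment (hypothetical names and stand-in data, assuming a PyTorch-style setup rather than the exact diffSPH training code): a small network predicts per-particle density from relative neighbor positions only; with no density history in the input, its best average answer collapses toward the rest density.

import torch
import torch.nn as nn

# Hypothetical stand-in data: relative neighbor positions and target densities.
# In the real experiment these would come from the WCSPH training frames.
num_particles, max_neighbors = 1024, 32
rel_pos = torch.randn(num_particles, max_neighbors, 2)        # positions only, no density
rho_target = 1000.0 + 10.0 * torch.randn(num_particles)       # fluctuations around a rest density

model = nn.Sequential(
    nn.Flatten(),                                             # [N, max_neighbors * 2]
    nn.Linear(2 * max_neighbors, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(1000):
    rho_pred = model(rel_pos).squeeze(-1)
    loss = torch.sqrt(((rho_pred - rho_target) ** 2).mean())  # RMSE of the density field
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# If the targets depend on flow history that is absent from the inputs (the "modern"
# continuity-based density), the RMSE-optimal constant prediction is simply the mean
# target, i.e. approximately the rest density.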


9 of 27

End of Presentation


(not really)

10 of 27

Mori-Zwanzig Formalism (a very basic overview)

The Mori-Zwanzig formalism comes from statistical mechanics

Given a complex, high-dimensional system

The full physical system is first-order Markovian (the next state depends only on the current state)

We use a Reduced Order Model (ROM)

The ROM can come in many shapes

Projecting the dynamics onto the reduced variables splits the evolution into three contributions (written out below):


  • Markovian term
  • Memory term
  • Noise term
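Written out, this is the standard generalized Langevin equation of the Mori-Zwanzig formalism (the slide's own notation may differ slightly):

\frac{d}{dt} A(t) \;=\; \underbrace{\Omega\, A(t)}_{\text{Markovian term}} \;+\; \underbrace{\int_0^t K(t-s)\, A(s)\, \mathrm{d}s}_{\text{memory term}} \;+\; \underbrace{F(t)}_{\text{noise term}}

Here A(t) are the resolved (reduced) variables, K is the memory kernel acting on their history, and F(t) collects the unresolved (orthogonal) dynamics.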

11 of 27

Mori-Zwanzig Simplism

A pendulum has well-defined physics

Given the full state (position and velocity), we know the next state

This is first-order Markovian

ROM: take a picture (only the position is observed)

The next state is no longer determined

MZ: given a history of positions, we can infer the velocity

The next state can be estimated
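A minimal sketch of this idea (hypothetical, simple pendulum with an explicit Euler step): the velocity missing from a single snapshot is recovered from two consecutive position snapshots by a finite difference, which is enough to step the dynamics forward again.

import math

g, L, dt = 9.81, 1.0, 0.01          # gravity, pendulum length, time step

def step(theta, omega):
    # Full first-order Markovian state: angle and angular velocity.
    omega_new = omega - (g / L) * math.sin(theta) * dt
    theta_new = theta + omega_new * dt
    return theta_new, omega_new

# Reduced observation: only two consecutive angles (two "pictures").
theta_prev, theta_curr = 0.500, 0.498

# MZ-style recovery: infer the hidden velocity from the position history ...
omega_est = (theta_curr - theta_prev) / dt

# ... which makes the next state predictable again.
theta_next, _ = step(theta_curr, omega_est)
print(theta_next)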


12 of 27

Mori-Zwanzig Practicism

Full simulation: first-order Markovian

“Modern” SPH without density information: an incomplete ROM

Mori-Zwanzig:

Provide history → Infer full state → Prediction improves


13 of 27

Mori-Zwanzig in practice: Results


[Result figures: no history vs. 16 steps of history]

14 of 27

Flipping the perspective

Mori-Zwanzig gives a very theoretical view of machine learning

There is no general consensus on what Mori-Zwanzig actually implies in practice

Recall:

  • We are dealing with WCSPH
  • Density variations matter on short timescales
  • Long-term behavior converges to the mean behavior
  • Temporal coarse graining → we only consider the long-term behavior! (formalized just below)
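One way to make "temporal coarse graining" concrete is a plain sliding time average (given here as an assumption about the intended definition):

\bar{A}_i(t) \;=\; \frac{1}{\tau} \int_{t-\tau}^{t} A_i(s)\, \mathrm{d}s

The learning target is then the coarse-grained behavior \bar{A}, in which the fast acoustic density fluctuations of WCSPH largely average out.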



16 of 27

Flipping the perspective: Results


[Result figures: no history vs. the result after 32 steps]

17 of 27

Emulator Superiority: The Machine Learning Way

Can an emulator be better than its reference?

“Yes!”

… though some caveats apply:

  • Only in narrow scopes
  • Not applicable to general solutions
  • Not guaranteed to be possible

In this case:

  • Short-term behavior is noisy
  • The mean behavior is the physically meaningful signal
  • Neural networks are great at denoising and learning smooth results
  • However, the network is also massively more expensive per step


18 of 27

What does this behavior look like?


19 of 27

Closure Modelling


20 of 27

Why Machine Learning to begin with?

Why do we want machine learning anyway?

  • GPU acceleration if our solver is CPU-based
    • Not applicable, as most SPH solvers are already GPU-based
  • Learning specific solutions to narrow problems
    • We want a general solution without worrying about out-of-band generalization
  • Closure modelling / corrector / denoising setups
    • Difficult to train due to the tight coupling of SPH and neural networks
  • Differentiable solvers!
    • So why not just write a differentiable SPH solver?


21 of 27


22 of 27

Differentiable SPH Solvers can do it all:

Shape Optimization:


[Figure: optimize the interference at a point in space]

23 of 27

Inverse Problems:


Match the initial conditions to reproduce a target trajectory
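A minimal sketch of the common pattern behind these examples (hypothetical names and a toy stand-in step, not the diffSPH API): because every solver step is differentiable, a trajectory loss can be backpropagated through the whole rollout, and initial conditions, shape parameters, or material parameters can then be optimized by plain gradient descent.

import torch

def sph_step(x, v, dt=0.01):
    # Hypothetical stand-in for one differentiable solver step (diffSPH would go here):
    # particles fall under gravity; a real step would add pressure, viscosity, etc.
    g = torch.tensor([0.0, -9.81])
    v = v + dt * g
    return x + dt * v, v

# Inverse problem: find initial velocities so the particles end up at target positions.
x0 = torch.rand(64, 2)
v0 = torch.zeros(64, 2, requires_grad=True)
x_target = x0 + torch.tensor([0.5, 0.0])     # hypothetical target endpoint of the trajectory

optimizer = torch.optim.Adam([v0], lr=0.1)
for it in range(200):
    x, v = x0, v0
    for _ in range(32):                      # unrolled, fully differentiable rollout
        x, v = sph_step(x, v)
    loss = ((x - x_target) ** 2).mean()      # trajectory mismatch
    optimizer.zero_grad()
    loss.backward()                          # gradients flow through all 32 solver steps
    optimizer.step()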

24 of 27

Parameter Optimization:


25 of 27

Closure Modelling


[Diagram: neural network + classical integrator (explicit Euler, RK4): the “solver in the loop” training setup, sketched below]
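A minimal sketch of such a "solver in the loop" setup (hypothetical names and a toy stand-in solver; the actual coupling in diffSPH may differ): the network learns a correction on top of a differentiable solver step, and because the solver stays inside the training loop, gradients flow through both.

import torch
import torch.nn as nn

def solver_step(x, v, dt=0.01):
    # Hypothetical differentiable base solver step (e.g. a coarse WCSPH update).
    g = torch.tensor([0.0, -9.81])
    v = v + dt * g
    return x + dt * v, v

correction = nn.Sequential(nn.Linear(4, 64), nn.ReLU(), nn.Linear(64, 2))
optimizer = torch.optim.Adam(correction.parameters(), lr=1e-3)

x, v = torch.rand(64, 2), torch.zeros(64, 2)
x_ref = x + torch.tensor([0.0, -0.1])        # hypothetical per-step reference data

for it in range(100):
    x_pred, v_pred = solver_step(x, v)                       # coarse solver inside the loop
    x_pred = x_pred + correction(torch.cat([x, v], dim=-1))  # learned closure / corrector
    loss = ((x_pred - x_ref) ** 2).mean()
    optimizer.zero_grad()
    loss.backward()        # gradients pass through both the network and the solver step
    optimizer.step()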

26 of 27

Loss-based Physics

Define the problem using a loss

Evolving the physics == minimizing the loss (see the sketch below)
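A minimal sketch of what this can look like (an assumption about the intended setup, not necessarily the slide's formulation): define a loss that penalizes deviation from the rest density, and let automatic differentiation turn its gradient with respect to the particle positions into a pressure-like relaxation step.

import torch

rho0, h = 1000.0, 0.1                     # rest density, smoothing length (hypothetical values)
x = torch.rand(128, 2, requires_grad=True)
m = torch.full((128,), rho0 / 128)

def density(x):
    # Simple Gaussian-kernel summation density (stand-in for a proper SPH kernel).
    d2 = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)
    W = torch.exp(-d2 / h**2) / (torch.pi * h**2)
    return (m[None, :] * W).sum(-1)

# Loss-based physics: penalize compression, move particles down the loss gradient.
loss = (((density(x) - rho0) / rho0) ** 2).sum()
loss.backward()
with torch.no_grad():
    x -= 1e-4 * x.grad                    # one pressure-like relaxation step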


27 of 27

Conclusions

Machine Learning can be analyzed from a theoretical perspective

Emulators can be better than their reference data …

… in some cases

Closure Modelling requires tight integration of solver and network

Adjoint problems don't require neural networks to solve them

Check out our fully differentiable SPH Solver:

diffSPH
