
Final report:

Implementation and deployment of the piggybacking method to SCALE

Piotr Żmijewski, under the supervision of Dr. Shin-ichiro Shima

RIKEN R-CCS Internship, December 21st, 2022


  • What is piggybacking?
  • Background;
  • Implementation of the piggybacking method to SCALE-SDM;
  • Results;
  • Online piggybacking method.


PIGGYBACKING


The piggybacking method - what it is for:

  • to separate the impact of a physical process from effects of natural variability;
  • to evaluate the impact of cloud dynamics;
  • to significantly reduce the size of simulation ensembles for chaotic systems, reducing the computational cost.


The Piggybacking method

  • Two sets of thermodynamic variables (the potential temperature, the water vapor mixing ratio, and all variables describing aerosol, cloud, and precipitation particles) are carried in a single cloud simulation;
  • Two simulations effectively run simultaneously: in set 1 the thermodynamics interacts with the dynamics, while the thermodynamic variables of set 2 are carried by the flow but do not affect it;
  • Simulations have to be run 2 (4) times in order to isolate the impact of the dynamics.

Grabowski, W. W.: Separating physical impacts from natural variability using piggybacking technique, Adv. Geosci., 49, 105–111, 2019.
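The two-set idea above can be sketched as a toy time step. This is a minimal illustration, not SCALE-SDM code: the function names and coefficients are invented for the example. The key property it demonstrates is that only the driver set (set 1) feeds back on the velocity, so changing the piggybacker state leaves the dynamics unchanged.

```python
def update_dynamics(w, theta_d, dt):
    """Toy buoyancy feedback: only set 1 (the driver) modifies the flow."""
    return w + 0.1 * (theta_d - 300.0) * dt

def advect(w, theta, dt):
    """Both thermodynamic sets are transported by the same flow."""
    return theta - 0.05 * w * dt

def microphysics(theta, dt):
    """The same (toy) microphysics is applied to each set independently."""
    return theta * (1.0 - 0.01 * dt)

def piggyback_step(w, theta_d, theta_p, dt):
    """One piggybacking step: set 2 (theta_p) never influences the dynamics."""
    w = update_dynamics(w, theta_d, dt)
    theta_d = microphysics(advect(w, theta_d, dt), dt)
    theta_p = microphysics(advect(w, theta_p, dt), dt)
    return w, theta_d, theta_p
```

Running two such steps that differ only in the piggybacker state yields identical flow and driver values, which is exactly what makes the method useful for separating physical impacts from natural variability.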


BACKGROUND


ICMW2020 Isolated Cumulus Congestus

  • Randomness of accumulated precipitation in LES simulations - based on UWLCM and SCALE-SDM;
  • Random noise in initial condition;
  • Random noise in SD initialization method;
  • Random (stochastic) SD collision algorithm.

Shima, S.-I., Kusano, K., Kawano, A., Sugiyama, T., and Kawahara, S.: The super-droplet method for the numerical simulation of clouds and precipitation: A particle-based and probabilistic microphysics model coupled with a non-hydrostatic model, Q. J. Roy. Meteor. Soc., 135, 1307–1320, 2009.


Methodology and expectations

  • Limit the influence of dynamics on microphysics by using a predefined velocity field;
    • reduction of the initial random noise;
    • possibility to study differences in accumulated precipitation due to the microphysical scheme (testing multiple factors, etc.);
    • random-seed evaluation.
  • The spread due to the algorithm should decrease with an increasing number of super-droplets per grid box.
  • Find the "critical" number of super-droplets above which the randomness is caused only by the randomness in the initial conditions.


IMPLEMENTATION


Minor modifications in SCALE-SDM:

  • scale-rm/src/Makefile
    • allows netCDF to be used inside mod_user.f90.
  • scale-rm/test/case/warmbubble/ICMW-congestus-2D/run.conf
    • a flag to turn ON/OFF the reading of history files, so that a simulation can be run as a piggybacker;
    • an option to provide the path to the history files, which eliminates the need to recompile SCALE-SDM when running many piggybacking simulations with different history fields.

turn_piggy_on = .true.,
path_to_file  = '../MASTER/MR/HISTORY_5/', ! to modify
hist_file     = 'history'                  ! to modify

&HISTITEM item='MOMZ', TINTERVAL=0.5D0, DATATYPE="REAL8" /
&HISTITEM item='MOMX', TINTERVAL=0.5D0, DATATYPE="REAL8" /
&HISTITEM item='MOMY', TINTERVAL=0.5D0, DATATYPE="REAL8" /


Major modifications in mod_user.f90:

  • Opening history files inside subroutine USER_setup:

use scale_process, only: &
   PRC_MPIstop, &
   mype => PRC_myrank

if ( turn_piggy_on ) then
   write ( num, '(I6.6)' ) mype
   !--- read history from the 'path_to_file' directory, where the 'hist_file's
   !--- are located for each MPI rank
   ncfile = trim(path_to_file)//trim(hist_file)//".pe"//num//".nc"
   write(*,*) ncfile
   status = nf90_open( trim(ncfile), nf90_nowrite, ncid )
   !--- 'nt' is needed for proper reading of values from the history files
   nt = 1
endif


  • Finding the coordinate points to be replaced, inside subroutine USER_step:

ims = IS ; ime = IE
jms = JS ; jme = JE
kms = KS ; kme = KE

if ( PRC_HAS_W .OR. PRC_PERIODIC_X ) then
   imsh = ims
else
   imsh = ims - 1 ! including i = IS-1
endif

if ( PRC_HAS_S .OR. PRC_PERIODIC_Y ) then
   jmsh = jms
else
   jmsh = jms - 1 ! including j = JS-1
endif

kmsh = kms - 1

im  = ime - ims  + 1
jm  = jme - jms  + 1
imh = ime - imsh + 1
jmh = jme - jmsh + 1
kmh = kme - kmsh + 1


  • Reading and replacing MOMX/Y/Z values inside subroutine USER_step:

status = nf90_inq_varid( ncid, "MOMX", id_MOMX )
status = nf90_get_var( ncid, id_MOMX, work_x(imsh:ime,jms:jme,KS:KE), &
                       start=(/1,1,1,nt/), count=(/imh,jm,KMAX,1/) )

do i = imsh, ime
do j = jms, jme
do k = KS, KE
   MOMX(k,i,j) = work_x(i,j,k)
enddo
enddo
enddo

call COMM_vars8( MOMX, 1 )
call COMM_wait ( MOMX, 1 )

The detailed solution and a description of all commits are available in my SCALE-SDM branch:

https://bitbucket.org/s-shima-lab/scale-sdm/branch/SDM_feature-221111_Zmijewski_offline-piggybacking


RESULTS


ICMW-congestus-2D set-up

  • Adaptation of the setup developed by Lasher-Trapp et al. (2005);
  • The computational domain is 12 km in the horizontal and 10 km in the vertical;
  • The grid size is 100 m x 100 m (2D simulation);
  • The total simulation time is 3 hours; the first hour is treated as a spin-up that develops boundary-layer turbulence;
  • The surface fluxes are uniform for the first hour; afterwards they follow a Gaussian distribution centered in the middle of the domain;
  • The side boundaries are periodic, and the upper boundary is a free-slip rigid lid;
  • The aerosol distribution is based on observations from the RICO campaign by VanZanten et al. (2011).

Lasher-Trapp, S.G., Cooper, W.A. and Blyth, A.M. (2005), Broadening of droplet size distributions from entrainment and mixing in a cumulus cloud. Q.J.R. Meteorol. Soc., 131: 195-220. https://doi.org/10.1256/qj.03.199.

VanZanten, M. C., Stevens, B., Nuijens, L., Siebesma, A. P., Ackerman, A. S., Burnet, F., Cheng, A., Couvreux, F., Jiang, H., Khairoutdinov, M., Kogan, Y., Lewellen, D. C., Mechem, D., Nakamura, K., Noda, A., Shipway, B. J., Slawinska, J., Wang, S., and Wyszogrodzki, A.: Controls on precipitation and cloudiness in simulations of trade-wind cumulus as observed during RICO, Journal of Advances in Modeling Earth Systems, 3, https://doi.org/10.1029/2011MS000056, 2011.


Results of running SCALE-SDM 100 times. The simulations differ from each other only in the values of RANDOM_SEED_SCALE and RANDOM_NUMBER_SEED; 100 SDs per grid box were used. The same experiment was repeated for UWLCM with 900 simulations. Colored dots highlight the drivers chosen for further studies.

(Figure panels: SCALE-SDM, UWLCM.)


Piggybacking SDM over SDM drivers

"ref" is the mean/STD of the 100 simulations used for finding the drivers.

The error bars in this and the following plots represent the 95% confidence interval.

D* - driver

P* - piggybacker

LR - Low Rain drivers
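The slides do not reproduce the formula behind the 95% confidence interval, so as a hedged example here is the standard normal-approximation estimate, mean ± 1.96·s/√n, which is a common choice for ensemble statistics like these:

```python
import math
import statistics

def ci95(samples):
    """Mean and normal-approximation 95% confidence interval of the mean.

    The exact formula used in the report is not shown on the slide; this is
    the standard mean +/- 1.96 * s / sqrt(n) estimate, given as an example.
    """
    n = len(samples)
    mean = statistics.mean(samples)
    half = 1.96 * statistics.stdev(samples) / math.sqrt(n)  # half-width
    return mean, mean - half, mean + half
```

For example, `ci95` applied to the accumulated-precipitation values of an ensemble returns the ensemble mean together with the lower and upper bounds that would be drawn as the error bar.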


MR - Mid Rain drivers

HR - High Rain drivers

Using piggybacking reduces the standard deviation of the accumulated precipitation.


(Figure: snapshots of the D-SDM drivers HR12, MR25, HR48, and LR68 and their P-SDM piggybackers.)


Piggybacking BULK over SDM drivers

Bulk microphysics schemes:

  • TOM - TOMITA08 (1-moment: mass);
  • SN14 - SN14 (2-moment: mass & concentration).

RANDOM_SEED_SCALE is different for each P simulation with bulk microphysics. For each P simulation with SDM microphysics, RANDOM_SEED_SCALE is the same and only RANDOM_NUMBER_SEED differs between simulations.

When piggybacking bulk over an SDM driver, only one piggybacking simulation is required: the standard deviation of the accumulated precipitation is almost 0 in this scenario. For SDM a small ensemble is necessary.

&PARAM_ATMOS_PHY_MP_TOMITA08
   autoconv_nc = 80.0
/

&nm_mp_sn14_nucleation
   c_ccn = 1.05E+8
/


Piggybacking SDM over BULK drivers

(Figure panels: TOMITA08, SN14.)

TOM starts to precipitate earlier than SDM. SN14 starts to precipitate later than SDM.



Piggybacking SDM over BULK drivers


(Figure: the drivers D-TOMITA08 and D-SN14 with their P-SDM piggybackers.)


Piggybacking SDM and BULK over SDM driver

(Figure: snapshots of the D-SDM driver and the P-SDM, P-SN14, and P-TOMITA08 piggybackers.)

All piggybacker simulations share the same SDM driver (MR5). All snapshots show the cloud at 7200 s of simulation time; the number of rain cells differs significantly.


(Figure: the D-SDM driver and the P-SDM, P-SN14, and P-TOMITA08 piggybackers.)


Piggybacking SDM and BULK over SDM driver

  • The highest value of accumulated precipitation is obtained by the TOM piggybacker. Its <P> is higher than for the SDM (MR5) driver, and precipitation starts ~800 s earlier.
  • <P> for SN14 is the smallest of all the piggybacking approaches. Precipitation starts a bit later than for the driver.
  • SDM precipitates less than its driver, despite precipitation starting at the same moment.


Convergence study of the number of SDs per grid box

  • Using 10 SDs per grid box produces more accumulated precipitation than the driver. This agrees well with UWLCM.
  • For SCALE-SDM, using 500 or 1000 SDs per grid box gives almost the same accumulated precipitation; it is probably converging at this value.
  • Does UWLCM require a higher number of SDs per grid box?


Offline piggybacking summary

Conclusions - piggybacking allows for:

  • studying the impact of microphysics on accumulated precipitation;
  • reducing the standard deviation of accumulated precipitation;
  • comparing different microphysics schemes within the same velocity field;
  • studying the influence of different parameters of the microphysics schemes within the same velocity field.

What to improve:

  • saving the history files to one file, so that they could be read by a simulation with any number of MPI ranks;
  • one data format to use with different models;
  • moving the code outside mod_user.f90, somewhere into the scale-rm core, so that other cases might use it;
  • turning off the MOMX/Y/Z solver while piggybacking;
  • optimizing the piggybacking method (opening/reading/swapping variables).


Online Piggybacking


The “online” Piggybacking method

  • Run the "driver" simulation and save its dynamical variables in RAM, e.g. in /dev/shm;
  • Simultaneously run the piggybacker as a separate simulation driven by the saved variables;
  • Implement this strategy in mod_user.f90;
  • Makes 3D piggybacking feasible.
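The driver/piggybacker handshake through files in /dev/shm can be sketched as follows. This is a minimal Python stand-in for the Fortran INQUIRE loops shown later: the file names, the pickle format, and the use of a temporary directory in place of /dev/shm are all illustrative assumptions, not the actual implementation.

```python
import os
import pickle
import tempfile
import threading
import time

def drive(shm_dir, steps):
    """Driver: write the velocity field for each step, then a 'Done' marker."""
    for t in range(steps):
        data = {"t": t, "momx": [0.1 * t] * 4}       # stand-in for MOMX
        path = os.path.join(shm_dir, f"MOMX_{t}.dat")
        done = os.path.join(shm_dir, f"Done_{t}")
        with open(path, "wb") as f:
            pickle.dump(data, f)
        open(done, "w").close()                      # marker: data is ready
        while os.path.exists(done):                  # wait until consumed
            time.sleep(0.001)

def piggyback(shm_dir, steps, out):
    """Piggybacker: wait for the marker, read the field, delete both files."""
    for t in range(steps):
        path = os.path.join(shm_dir, f"MOMX_{t}.dat")
        done = os.path.join(shm_dir, f"Done_{t}")
        while not os.path.exists(done):              # wait for the driver
            time.sleep(0.001)
        with open(path, "rb") as f:
            out.append(pickle.load(f))
        os.remove(path)
        os.remove(done)                              # signal: consumed
```

Running `drive` in one thread and `piggyback` in another over the same directory hands each step's field over in lock-step, the same way the two MPI jobs synchronize through /dev/shm.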

Grabowski, W. W.: Separating physical impacts from natural variability using piggybacking technique, Adv. Geosci., 49, 105–111, 2019.


IMPLEMENTATION


  • Saving temporary MOMX/Y/Z files to /dev/shm, inside subroutine USER_step:

write ( num_temp, '(F7.1)' ) time
num_temp  = repeat( '0', 7-len_trim(adjustl(num_temp)) )//adjustl(num_temp)
Temp_momx = trim(temp_file)//"MOMX_mpi_"//num//"_timestep_"//trim(num_temp)//".dat"
Done      = trim(temp_file)//"Done_mpi_"//num//"_timestep_"//trim(num_temp)//".dat"

if ( turn_piggy_online .eq. 1 ) then
   ! wait until the piggybacker has consumed the previous files
   file_exist = .true.
   do while ( file_exist )
      INQUIRE( FILE=Done_prev, EXIST=file_exist )
   end do
   file_exist_momx = .true.
   do while ( file_exist_momx )
      INQUIRE( FILE=MOMX_prev, EXIST=file_exist_momx )
   end do

   open( mype, file=Temp_momx, form='unformatted', status='unknown' )
   write( mype ) MOMX
   close( mype )
   MOMX_prev = Temp_momx

   ! create an empty "Done" marker file
   open( mype, file=Done )
   close( mype )
   Done_prev = Done
endif

DRIVER


  • Reading temporary MOMX/Y/Z files from /dev/shm, inside subroutine USER_step:

if ( turn_piggy_online .eq. 2 ) then
   ! wait until the driver has written the next files
   file_exist      = .false.
   file_exist_momx = .false.
   do while ( .not. file_exist )
      INQUIRE( FILE=Done, EXIST=file_exist )
   end do
   do while ( .not. file_exist_momx )
      INQUIRE( FILE=Temp_momx, EXIST=file_exist_momx )
   end do

   open( mype, file=Temp_momx, form='unformatted', status='old' )
   read( mype ) MOMX
   close( mype, status='delete' )

   ! delete the "Done" marker file to signal that it was consumed
   open( mype, file=Done )
   close( mype, status='delete' )
endif

Piggybacker


Sample Results - Online Piggybacking

Visible difference between piggybacking bulk 1-moment (TOMITA08) and SDM on the same SDM driver.

#PBS -l select=1:ncpus=40:mpiprocs=20

Run times:
  • Offline - D: 9866 s, P: 9882 s;
  • Online - D+P: 9819.5 s.


Online Piggybacking

(Figure: D-SDM drivers with their P-SDM and P-TOM08 piggybackers.)


Online piggybacking summary

Conclusions - piggybacking allows for:

  • studying the impact of microphysics on accumulated precipitation;
  • comparing different microphysics schemes;
  • studying 3D cases without the storage problem;
  • performing piggybacking simulations faster than offline.

What to improve:

  • prepare 3D cases;
  • the conditions based on which the files are written or read;
  • turn off the MOMX/Y/Z solver while piggybacking;
  • optimize the piggybacking method (opening/reading).


Thank you for your attention and for having me here in Japan for the internship.