NEMS / NMMB HOWTO

Version 1, 20 June 2012

Evolving Draft

Overview

In this first version we use the NMMB Preprocessing System (NPS) to create a very small domain over the Alaska panhandle, ingest 0.5-degree GFS input fields to create initial and boundary conditions, use these to drive a short NMMB simulation, and finally visualize the output with GrADS.

We assume that input data and working directories, including binaries, are already available, so this HOWTO simply describes the mechanics of running an academic case (currently just on zeus).

Although high-level scripts are available to abstract many of these actions, here we describe the steps from a lower-level perspective, so the user understands a bit of what’s going on under the hood. That way, when people start to use the higher-level scripts, they’ll have a better idea of what’s happening when things go wrong. The root directory for binaries, input data, and utility data on zeus for this example is

HOWTO_SRCDIR=/scratch2/portfolios/NCEPDEV/meso/save/Don.Morton/NMMB_HOWTO_v1

More instructions to follow, but in general I anticipate having users copy from the HOWTO_SRCDIR to their own work directories, and then perform the following steps.  
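For example, a minimal sketch of that setup (the work directory name here is just an illustration, not a prescribed location):

HOWTO_SRCDIR=/scratch2/portfolios/NCEPDEV/meso/save/Don.Morton/NMMB_HOWTO_v1
# Copy the HOWTO materials into a personal work directory (name is illustrative)
mkdir -p $HOME/nmmb_howto_work
cp -r $HOWTO_SRCDIR/* $HOME/nmmb_howto_work/
cd $HOME/nmmb_howto_work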

The intended audience for this HOWTO is the experienced WRF user who feels comfortable with the basic pre-processing and model execution steps. Dr. Delia Arnold, co-PI on this project, will likely be the first user (and reviewer) of these documents, once we have the system running on a more accessible computing system.

NMMB Preprocessing System (NPS)

Overview - The NPS is used to define a model domain, then to ingest meteorological input data (e.g., GFS or NAM) and transform it into initial conditions (input_domain_01) and lateral boundary conditions (boco.0000, boco.0003, …).

What we’ll do

Create namelist

Edit build_namelists.scr (I will have this already set up correctly, and will just briefly explain the key fields - maybe I’ll make a link directly to it.)

Run it

./build_namelists.scr

generating namelist.nps

Copy namelist.nps into the NPS subdirectory (remember to do this - it’s easy to forget to copy it down into the working directory!)

Please note that we could just edit namelist.nps by hand; build_namelists.scr is higher level and simplifies the creation of the namelist.

Then, we cd down to the NPS subdir.
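Concretely, assuming the NPS subdirectory sits directly below the directory where we ran build_namelists.scr:

# Copy the generated namelist into the NPS working directory, then go there
cp namelist.nps NPS/
cd NPS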

geogrid

mpirun -np 1 ./geogrid.exe

(It may be necessary to load the MPI module, depending on the machine; on zeus: module load mpt.)

[Note that for real-world cases, especially large ones, these programs would typically be run on numerous processors and submitted to a job queue for batch execution. For small, simple cases such as this, we will run everything interactively from the command line, keeping it simple for the first-time user. More advanced HOWTOs will include the job submission steps.]

Expected output:

NMMB_init/NPS]$ mpirun -np 1 ./geogrid.exe

 set iproj type to PROJ_ROTLLB:              204

 dykm, dxkm:   0.2000000          0.2000000    

 see gridtype as B

Parsed 10 entries in GEOGRID.TBL

  compute_nest_locations

 call map_set for domain 1 for PROJ_ROTLLB

  phi lambda        6.900000           3.900000    

Processing domain 1 of 1

 top of process_tile

  Processing XLAT and XLONG

 SW lat,lon ::        47.93869          -142.7847    

 NE lat,lon ::        61.67177          -128.8193    

 start_patch_i, end_patch_i:                1              40

 start_patch_j, end_patch_j:                1              70

 File: ./geo_nmb.d01.dio opened

 writing dx, dy:   0.2000000          0.2000000    

  Processing F and E

  Processing LANDUSEF

And produces geo_nmb.d01.dio

ungrib

Make links GRIB.AAA GRIB.AAB GRIB.AAC to the GFS GRIB files in --- I guess I need to put some GFS files somewhere ---

Note that the creation of these links can easily be scripted; the WPS/WRF distribution includes a script, link_grib.csh, for exactly this purpose.
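Until the GFS file location is pinned down, a hand-rolled sketch looks like this (the directory and file names are placeholders, not real paths):

GFS_DIR=/path/to/gfs_files        # placeholder - actual location TBD
ln -sf $GFS_DIR/gfs_f00.grib GRIB.AAA
ln -sf $GFS_DIR/gfs_f03.grib GRIB.AAB
ln -sf $GFS_DIR/gfs_f06.grib GRIB.AAC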

Make a link to the correct Vtable:

ln -s ungrib/Variable_Tables/Vtable.GFS_with_isobaric Vtable

Run

mpirun -np 1 ./ungrib.exe

producing the three intermediate files

FILE:2012-06-15_00

FILE:2012-06-15_03

FILE:2012-06-15_06

metgrid

mpirun -np 1 ./metgrid.exe

produces

met_nmb.d01.2012-06-15_00:00:00.dio

met_nmb.d01.2012-06-15_03:00:00.dio 

met_nmb.d01.2012-06-15_06:00:00.dio

nemsinterp

mpirun -np 1 ./nemsinterp.exe

Produces:

boco.0000

boco.0003

input_domain_01

These are the lateral boundary conditions (boco) and initial conditions for the NMM-B simulation.

(It also produces boco.00? and input_domain_01_nemsio, in the NEMSIO format, but those aren’t used in the following steps.)

Also note the domain_details_01 file, which provides the information needed to set up the domain in the namelist for NMM-B - everything except the number of vertical levels (maybe it should provide that, too?).

For this case:

nx:   40

ny:   70

dlmd: 0.200000

dphd: 0.200000

tph0d:   55.000

tlm0d: -137.000

wbd:   -3.900

sbd:   -6.900

SW lat,lon:   47.939 -142.785

NE lat,lon:   61.672 -128.819
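As a sanity check - this is my own inference from the numbers above, so verify it - wbd and sbd appear to be the rotated-grid western and southern boundaries, derivable from the grid dimensions and increments:

wbd = -(nx - 1) * dlmd / 2 = -(39)(0.2)/2 = -3.900
sbd = -(ny - 1) * dphd / 2 = -(69)(0.2)/2 = -6.900

These agree (up to sign) with the “phi lambda 6.900000 3.900000” line in the geogrid output above.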

[Note that nemsinterp - and possibly metgrid, I can’t remember right now - will not permit overwriting existing output files, so if you try to run this a second time, you need to remove some of those output files first. I’ll fill in more detail later.]
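For example, something along these lines before a re-run (check the list before deleting anything you want to keep):

# Remove previous nemsinterp outputs so it can run again
rm -f boco.* input_domain_01 input_domain_01_nemsio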

--------------------------------------------------------

Running NMM-B

Copy the boco.000? boundary condition files and input_domain_01 to the NMMB run directory.
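For example (the run directory path is a placeholder):

cp boco.000? input_domain_01 /path/to/NMMB_run/   # placeholder path
cd /path/to/NMMB_run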

Edit model_configure [I don’t fully understand this yet, but it appears that this file is replicated in the run directory - there is an identical file called configure_file_01 - and all I know at this point is that both are necessary for the model run. I’m guessing one might be for the model as a whole and the other for a particular nest, but for now I don’t know. Just make sure they’re both there as identical files (I just make a symlink from one to the other).] Again, at some point I’ll make this file available as a link and explain several of the key parameters.
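For example, from the run directory:

# Keep configure_file_01 identical to model_configure via a symlink
ln -s model_configure configure_file_01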

This is small enough to run interactively, so set up a 1x1 NMMB domain decomposition and 1 I/O task, then

mpirun -np 2 ./NEMS.x

Note - I’ve tried running this without asynchronous I/O and it hasn’t worked. It might be that this is simply inherent in the NEMS design - an I/O component (and hence at least one task devoted to it), an NMM-B component, etc. So it appears that I need to run with some cores for the NMM-B and at least one more for the I/O (as in WRF, the I/O seems to be assigned to the highest-numbered tasks).
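For reference, in the model_configure files I’ve seen, the decomposition and I/O-task entries look something like the lines below - treat the exact parameter names as an assumption to verify against your own file, since they may differ between NEMS versions:

inpes: 1
jnpes: 1
write_groups: 1
write_tasks_per_group: 1

With these values the total task count is inpes*jnpes + write_groups*write_tasks_per_group = 1 + 1 = 2, which is why mpirun -np 2 is used below.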

Produces the following output files:

nmmb_hst_01_bin_0000h_00m_00.00s

nmmb_hst_01_bin_0001h_00m_00.00s

nmmb_hst_01_bin_0002h_00m_00.00s

nmmb_hst_01_bin_0003h_00m_00.00s

nmmb_hst_01_bin_0004h_00m_00.00s

nmmb_hst_01_bin_0005h_00m_00.00s

nmmb_hst_01_bin_0006h_00m_00.00s

nmmb_hst_01_nio_0000h_00m_00.00s

nmmb_hst_01_nio_0000h_00m_00.00s.ctl

nmmb_hst_01_nio_0001h_00m_00.00s

nmmb_hst_01_nio_0001h_00m_00.00s.ctl

nmmb_hst_01_nio_0002h_00m_00.00s

nmmb_hst_01_nio_0002h_00m_00.00s.ctl

nmmb_hst_01_nio_0003h_00m_00.00s

nmmb_hst_01_nio_0003h_00m_00.00s.ctl

nmmb_hst_01_nio_0004h_00m_00.00s

nmmb_hst_01_nio_0004h_00m_00.00s.ctl

nmmb_hst_01_nio_0005h_00m_00.00s

nmmb_hst_01_nio_0005h_00m_00.00s.ctl

nmmb_hst_01_nio_0006h_00m_00.00s

nmmb_hst_01_nio_0006h_00m_00.00s.ctl

Using GrADS (on zeus, you need to load the module first: module load grads), we can view the nio (NEMS I/O) files. For example, fire up GrADS, then

open nmmb_hst_01_nio_0006h_00m_00.00s.ctl

then

d ugrd

or

d temp

or something like that, just to verify the output.
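To make this repeatable (and to save an image), here is a sketch of a small GrADS script - the script and output file names are mine, and printim assumes a GrADS build with image output support:

* plot_nmmb.gs - open the 6-h history file, plot temperature, save a PNG
'open nmmb_hst_01_nio_0006h_00m_00.00s.ctl'
'd temp'
'printim nmmb_temp_06h.png'
'quit'

Run it in batch mode with:

grads -blc "run plot_nmmb.gs"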

Will put some graphics in here at some point.

Additional comment - I’m understanding this better, but setting up a domain on a rotated_ll grid has sometimes led to surprises: the domain I see after plotting in GrADS isn’t necessarily what I expected. In the WRF Preprocessing System (WPS), there is a utility, util/plotgrids.exe, which will read the namelist file and create a crude plot of the domain, allowing for iterative, trial-and-error construction. Although WPS supposedly allows grids to be created for the NMM core, with its rotated_ll coordinate system, the plotgrids utility doesn’t seem to plot it that way. Having such a utility would be very helpful, and maybe that’s something I can look into, unless there’s already one around.