NEMS / NMMB HOWTO

Version 2, 19 October 2012

Evolving Draft

Overview

In this second version of the HOWTO, we use the NMMB Preprocessing System (NPS) to create a very small domain over Alaska, ingest 0.5 degree GFS fields to create initial and boundary conditions, use these to drive a short NMMB simulation, and then visualise the results with GrADS.

We assume that input data and working directories (including binaries) are already available, so this HOWTO simply describes the mechanics of running an academic case (currently just on zeus).

Although high-level scripts are available to abstract many of the actions, in this case we describe the steps from a lower-level perspective, so the user understands a bit of what’s going on under the hood.  In this way, when people start to use the higher-level scripts, they’ll have a better idea of what’s happening when things go wrong.

The intended audience for this HOWTO is the experienced WRF user who feels comfortable with the basic pre-processing and model execution steps.

Setup Procedures

The root directory on zeus for binaries, input data, and utility data for this example is

HOWTO_SRCDIR=/scratch2/portfolios/NCEPDEV/meso/save/Don.Morton/NMMB_HOWTO_v2

Copy the two subdirectories, NMMB and NMMB_init, to your local working directory.  You can leave the InputData subdirectory as it is - you’ll just be linking into it from your own directory.

From your working directory:

cp -a /scratch2/portfolios/NCEPDEV/meso/save/Don.Morton/NMMB_HOWTO_v2/NMMB*  .

A similar directory is available which additionally contains all of the output files produced by successful completion of this exercise.  In other words, after completing this tutorial your directories should look a lot like this one.  You can use it to compare your outputs, or to skip ahead to later steps for learning or debugging.  This directory is available at

/scratch2/portfolios/NCEPDEV/meso/save/Don.Morton/NMMB_HOWTO_v2/ReferenceDirectories

I have started this sequence with no modules loaded (module purge) so that users can see what modules I needed to load for particular tasks.
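
For reference, starting from a clean slate and checking what is currently loaded is just:

module purge     # start with no modules loaded
module list      # confirm nothing is currently loaded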

NMMB Preprocessing System (NPS)

Overview - The NPS is used to define a model domain, then to ingest meteorological input data (e.g., GFS, NAM) and transform it into initial conditions (input_domain_01) and lateral boundary conditions (boco.0000, boco.0003, …).

What we’ll do

Create namelist

Edit build_namelists.scr - I have this set up for a 100x100x31 point grid at 0.2 degree resolution over part of Alaska.
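
For orientation, these are the quantities being set for this case (the values match what shows up later in domain_details_01).  The variable names below are purely illustrative - check build_namelists.scr itself for the names it actually uses:

# Illustrative sketch only -- build_namelists.scr may use different variable names.
NX=100        # east-west grid points
NY=100        # north-south grid points
NZ=31         # vertical levels
DX=0.2        # grid spacing in degrees (rotated lat-lon)
DY=0.2
CENLAT=60.0   # rotation center latitude  (tph0d in domain_details_01)
CENLON=-150.0 # rotation center longitude (tlm0d in domain_details_01)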

Run it

./build_namelists.scr

which prints the following to stdout:

log_glob .false.
resulting wbd is -10.0
resulting sbd is -10.0
dt_num
dt_denom
wbdnum 48
sbdnum 48
newdx .06666666666666666666
newdy .06666666666666666666
.false.
COPY THE FULL ISOBARIC VERSION

and generates namelist.nps

Note that the resulting namelist.nps has a couple of hard-coded paths to Matt Pyle’s filesystem for the geog data.  I’m not sure how these get in there, but in a user-friendly environment we would want them pointing to the user’s own archive of the geog data.
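
A quick way to find those entries (rather than guessing the exact parameter name, just search for “geog”):

grep -in geog namelist.nps    # show the hard-coded geog data paths, with line numbers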

Copy namelist.nps into the NPS subdirectory (remember to do this!  It’s easy to forget to copy it down into the working directory!)

Please note that we could just edit namelist.nps by hand, but build_namelists.scr is higher level and simplifies the creation of the namelist.

Then, we cd down to the NPS subdir
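
Concretely, assuming build_namelists.scr was run one directory above NPS (as in the directory tree copied earlier), that’s just:

cp namelist.nps NPS/    # easy to forget -- the NPS programs read it from here
cd NPS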

Load the mpt module for MPI:

module load mpt

In the following steps, I need to investigate graphics and diagnostic utilities for displaying the outputs of each step, like WPS does.  As it stands now, I don’t really get a visualisation of the domain until after I’ve gone all the way through, creating GrADS-compatible outputs by running NMMB.

geogrid

mpirun -np 1  ./geogrid.exe   

[Note that for real-world cases, especially large ones, these programs would typically be run on numerous processors and submitted to a job queue for batch execution.  For small, simple cases such as this, we will run everything interactively from the command line, keeping it simple for the first-time user.  More advanced HOWTOs will start to include the job submission steps.]

Expected output:

bash-4.1$ mpirun -np 1 ./geogrid.exe
set iproj type to PROJ_ROTLLB:          204
dykm, dxkm:   0.2000000      0.2000000    
see gridtype as B
Parsed 10 entries in GEOGRID.TBL
   
   
 compute_nest_locations  
   
call map_set for domain 1 for PROJ_ROTLLB
 phi lambda    9.900001       9.900001    
Processing domain 1 of 1
top of process_tile
 Processing XLAT and XLONG
SW lat,lon ::    48.97834      -164.9543    
NE lat,lon ::    67.87898      -123.2708    
start_patch_i, end_patch_i:            1         100
start_patch_j, end_patch_j:            1         100
File: ./geo_nmb.d01.dio opened
writing dx, dy:   0.2000000      0.2000000    
 Processing F and E
 Processing LANDUSEF
.

.

.

And produces geo_nmb.d01.dio

Ungrib

Make links GRIBFILE.AAA, GRIBFILE.AAB, and GRIBFILE.AAC to the GFS GRIB files in

/scratch2/portfolios/NCEPDEV/meso/save/Don.Morton/NMMB_HOWTO_v2/InputData/GFS_0.5/2012061500

Note that the creation of these links can easily be scripted - the WPS/WRF distribution includes a script, link_grib.csh, that makes this easy.  But for only three files, I just make them individually (a minimal loop alternative is sketched after these commands):

ln -s /scratch2/portfolios/NCEPDEV/meso/save/Don.Morton/NMMB_HOWTO_v2/InputData/GFS_0.5/2012061500/fh.0000_tl.press_gr.0p5deg   GRIBFILE.AAA

ln -s /scratch2/portfolios/NCEPDEV/meso/save/Don.Morton/NMMB_HOWTO_v2/InputData/GFS_0.5/2012061500/fh.0003_tl.press_gr.0p5deg   GRIBFILE.AAB

ln -s /scratch2/portfolios/NCEPDEV/meso/save/Don.Morton/NMMB_HOWTO_v2/InputData/GFS_0.5/2012061500/fh.0006_tl.press_gr.0p5deg   GRIBFILE.AAC
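
For the record, the same three links can be made with a short bash loop - a sketch that does exactly what the individual commands above do:

GFSDIR=/scratch2/portfolios/NCEPDEV/meso/save/Don.Morton/NMMB_HOWTO_v2/InputData/GFS_0.5/2012061500
exts=(AAA AAB AAC)
fhrs=(0000 0003 0006)
for i in 0 1 2; do
    ln -sf $GFSDIR/fh.${fhrs[$i]}_tl.press_gr.0p5deg  GRIBFILE.${exts[$i]}
done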

Make a link to the correct Vtable -

ln   -s ungrib/Variable_Tables/Vtable.GFS_with_isobaric    Vtable

Run

mpirun -np 1 ./ungrib.exe

producing the three intermediate files

FILE:2012-06-15_00

FILE:2012-06-15_03

FILE:2012-06-15_06

metgrid

mpirun -np 1 ./metgrid.exe

produces

met_nmb.d01.2012-06-15_00:00:00.dio

met_nmb.d01.2012-06-15_03:00:00.dio 

met_nmb.d01.2012-06-15_06:00:00.dio

nemsinterp

mpirun -np 1 ./nemsinterp.exe

Produces:

boco.0000

boco.0003

input_domain_01

These are the lateral boundary conditions (boco) and initial conditions for the NMM-B simulation.

(also produces boco.00? and input_domain_01_nemsio, in the NEMSIO format, but those aren’t used in the following steps)

Also note the domain_details_01 file, which provides the information needed to set up the domain in the namelist for NMM-B - everything, that is, except the number of vertical levels (maybe it should include that, too).

For this case:

nx:  100                                                    

ny:  100                                                    

dlmd: 0.200000                                              

dphd: 0.200000                                              

tph0d:   60.000                                            

tlm0d: -150.000                                            

wbd:   -9.900                                              

sbd:   -9.900                                              

SW lat,lon:   48.978 -164.954                              

NE lat,lon:   67.879 -123.271                              

[Note that nemsinterp - and possibly metgrid, I can’t remember right now - will not permit overwriting of existing output files, so if you try to run this a second time, you need to remove some of those output files first.  I’ll fill in more detail later]
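
So, before a rerun, something along these lines clears the previous nemsinterp outputs (a sketch - adjust to whatever is actually in your directory):

rm -f boco.* input_domain_01 input_domain_01_nemsio    # previous nemsinterp outputs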

--------------------------------------------------------

Running NMM-B

Copy the boco.000? boundary condition files and input_domain_01 to the NMMB run directory.  (Warning: make sure you copy the boco.000? files, not the boco.00? files.  It won’t hurt if you copy the latter, but the program needs the former.  A quick ls sanity check follows the copy commands below.)

From the NMMB directory:

cp ../NMMB_init/NPS/boco.000?  .

cp ../NMMB_init/NPS/input_domain_01  .
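
A quick sanity check that the right files landed (note the four digits after the dot in the boco names):

ls -l boco.000? input_domain_01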

Edit model_configure.  [I don’t fully understand this yet, but it appears that this file is replicated in the run directory - there is an identical file called configure_file_01 - and all I know at this point is that both are necessary for the model run.  I’m guessing one might be for the model as a whole and the other for a particular nest, but for now I don’t know.  Just make sure they’re both there as identical files (I just make a symlink from one to the other).]  As of now, the file is configured correctly in the HOWTO directory you copied over.

This is small enough to run interactively, so set up a 2x2 NMMB domain decomposition and 1 I/O task (see the model_configure sketch after the run command below), then

module load mpt

module load intel

mpirun -np 5 ./NEMS.x
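
With a 2x2 decomposition plus one I/O task, the total is 2*2 + 1 = 5 MPI tasks, which is where the -np 5 comes from.  In model_configure this corresponds to settings along the following lines - the parameter names are my assumption from typical NMMB configure files, and the inline comments are annotations for this HOWTO rather than part of the file, so verify against the copy that came with the HOWTO directory (which is already set correctly):

inpes:                  2        # MPI tasks in the i (east-west) direction
jnpes:                  2        # MPI tasks in the j (north-south) direction
quilting:               .true.   # asynchronous (quilted) I/O
write_groups:           1
write_tasks_per_group:  1        # 2*2 compute + 1*1 I/O = 5 tasks total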

Note - I’ve tried running this without asynchronous I/O and it hasn’t worked.  It might be that this is just inherent in the NEMS design - an I/O component (and hence at least one task dedicated to it), an NMM-B component, and so on.  So, it appears that I need to run with some cores for the NMM-B and at least one more for the I/O (as with WRF, the I/O seems to be assigned to the highest-numbered tasks).

There are numerous warning messages of the form:

forrtl: warning (402): fort: (1): In call to UPDATES, an array temporary was created for argument #3

Although there is a way to turn off diagnostic messages in the namelist (model_configure), it didn’t seem to shut down these warnings.  Still, the model timesteps and produces output.

This takes me less than 30 minutes, producing the following output files:

nmmb_hst_01_bin_0000h_00m_00.00s

nmmb_hst_01_bin_0001h_00m_00.00s

nmmb_hst_01_bin_0002h_00m_00.00s

nmmb_hst_01_bin_0003h_00m_00.00s

nmmb_hst_01_bin_0004h_00m_00.00s

nmmb_hst_01_bin_0005h_00m_00.00s

nmmb_hst_01_bin_0006h_00m_00.00s

nmmb_hst_01_nio_0000h_00m_00.00s

nmmb_hst_01_nio_0000h_00m_00.00s.ctl

nmmb_hst_01_nio_0001h_00m_00.00s

nmmb_hst_01_nio_0001h_00m_00.00s.ctl

nmmb_hst_01_nio_0002h_00m_00.00s

nmmb_hst_01_nio_0002h_00m_00.00s.ctl

nmmb_hst_01_nio_0003h_00m_00.00s

nmmb_hst_01_nio_0003h_00m_00.00s.ctl

nmmb_hst_01_nio_0004h_00m_00.00s

nmmb_hst_01_nio_0004h_00m_00.00s.ctl

nmmb_hst_01_nio_0005h_00m_00.00s

nmmb_hst_01_nio_0005h_00m_00.00s.ctl

nmmb_hst_01_nio_0006h_00m_00.00s

nmmb_hst_01_nio_0006h_00m_00.00s.ctl


Using GrADS (on zeus you need to load the module first: module load grads), we can view the nio (NEMS I/O) files.  For example, fire up GrADS (type grads), then

open nmmb_hst_01_nio_0006h_00m_00.00s.ctl

then

d t10

to display contours (in Kelvins) of the 10m temperature

For a more sophisticated plot, the GrADS script nmmbvis.gs (along with an auxiliary script, cbar.gs) has been placed in the run directory.

Run the nmmbvis.gs script as follows to produce the output file nmmbvis.gmf

grads   -l   -b   -c 'run nmmbvis'

Then, convert to a PostScript file, t10.ps

gxps    -c    -i nmmbvis.gmf    -o t10.ps

And finally, convert to a viewable PNG file, t10.png

convert    -rotate 90    t10.ps    t10.png

View it with

display  t10.png
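
If you want to regenerate the figure in one shot later, the same steps chain nicely in a short shell sequence (just the commands above, wrapped; this assumes the grads module is already loaded):

# Regenerate the 10 m temperature plot (same commands as above)
grads -l -b -c 'run nmmbvis'          # produces nmmbvis.gmf
gxps -c -i nmmbvis.gmf -o t10.ps      # GrADS metafile -> PostScript
convert -rotate 90 t10.ps t10.png     # PostScript -> PNG
display t10.png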

Additional comment - I’m understanding this better, but setting up a domain on a rotated_ll grid has sometimes surprised me; the domain I see after plotting in GrADS isn’t necessarily what I expected.  In the WRF Preprocessing System (WPS), there is a utility, util/plotgrids.exe, which will read the namelist file and create a crude plot of the domain, allowing iterative, trial-and-error construction.  Although WPS supposedly allows grids to be created for the NMM core, with its rotated_ll coordinate system, the plotgrids utility doesn’t seem to plot it this way.  Having such a utility would be very helpful, and maybe that’s something I can look into, unless there’s already one around.