Earth System Modeling Framework
JST Web Tutorial
January 16, 2018
These slides available at:
Q&A During Tutorial:
Topics
Hurricane Irene/NASA GOES-13 satellite image/August 26, 2011
High-performance coupled modeling infrastructure
Parallel interpolation between model grids
Sharing models across coupled systems
Questions during tutorial?
https://tinyurl.com/esmf-questions-2018jan16
Earth System Modeling Framework
Overview
3
The Earth System Modeling Framework (ESMF) was initiated in 2002 as a multi-agency response to calls for common modeling infrastructure.
ESMF provides standard component interfaces and high-performance utilities such as grid remapping and parallel communication.
ESMF Metrics
Since 2002
There are multiple ways to use ESMF:
NUOPC Interoperability Layer
National Unified Operational Prediction Capability
4
NUOPC is a software layer on top of ESMF that provides technical interoperability of model components so they can be shared across coupled systems.
Driver
A Driver has one or more child components and is responsible for coordinating their initialization sequence and driving them through a customizable run sequence.
Connector
A Connector performs standard communication operations, in parallel, between other components, such as grid remapping and redistribution of data. Connectors have a built-in field matching algorithm based on standard names.
Model
A Model “cap” wraps a geophysical model code with standard initialization and run methods so it can be plugged into a Driver.
Mediator
A Mediator contains custom coupling code such as flux calculations, accumulation/averaging, and merging of fields among several components.
NUOPC generic components
A NUOPC component is an ESMF component with specified rules of behavior depending on the component’s role in the coupled system.
Coupled Systems using ESMF
ESMF supports a wide range of scientific coupling requirements
5
Next-generation operational prediction for weather through seasonal time scale
NEMS infrastructure is based on ESMF/NUOPC and supports multiple coupled modeling applications with different model components and different coupling configurations.
NEMS
NOAA Environmental Modeling System
Research and operational weather forecasting in support of military operations and national security
Regional and global systems are using ESMF/NUOPC interfaces.
Support for specialized coupling requirements with telescoping nested domains and nest-to-nest coupling.
COAMPS & NavGEM
Navy Regional and Global Forecasting
Data assimilation, utilization of satellite measurements, seasonal to climate forecasting, creation of reanalysis datasets
GEOS-5 features a large number of ESMF components, each handling different physics, organized into a deep hierarchy.
GEOS-5 & Model E
NASA Modeling and Data Assimilation
Research into all aspects of the climate system, including participation in the Intergovernmental Panel on Climate Change assessment reports
The NCAR climate model has used ESMF interfaces for over five years and has adopted NUOPC for increased interoperability.
CESM
Community Earth System Model
Model Components Using ESMF/NUOPC
Earth System Prediction Suite (ESPS)
6
| COUPLED MODELING SYSTEMS | NEMS | COAMPS and COAMPS-TC | NESPC | GEOS-5 | GISS ModelE | CESM |
| Driver(s) and Coupler(s) | ● | ● | ● | ● | ● | ● |
| ATMOSPHERE MODELS | | | | | | |
| CAM | | | | | | ● |
| COAMPS atmosphere | | ● | | | | |
| GEOS-5 FV atmosphere | | | | ● | | |
| GSM | ● | | | | | |
| ModelE atmosphere | | | | | ● | |
| NavGEM | | | ● | | | |
| NEPTUNE | | ● | | | | |
| NMMB | ● | | | | | |
| OCEAN MODELS | | | | | | |
| HYCOM | ● | ● | ● | | | |
| MOM | ● | | | ● | | |
| NCOM | | ● | | | | |
| POP | | | | | | ● |
| (Additional rows not shown) | | | | | | |
Complete table available at: https://www.earthsystemcog.org/projects/esps/
Directory of NUOPC-compliant components
The ESPS is a collection of federal and community weather and climate model components that use ESMF with interoperability conventions called the National Unified Operational Prediction Capability (NUOPC) Layer.
Model components are more easily shared across systems
The standard component interfaces enable major modeling centers to assemble systems with components from different organizations, and test a variety of components more easily.
● NUOPC-compliant
● In progress
Component Interfaces
Standard subroutine signatures promote interoperability
7
subroutine InitModel(comp, importState, exportState, clock, rc)
type(ESMF_GridComp) :: comp ! pointer to component
type(ESMF_State) :: importState ! incoming coupling fields
type(ESMF_State) :: exportState ! outgoing coupling fields
type(ESMF_Clock) :: clock ! model timekeeper
integer, intent(out) :: rc ! return code
! This is where the model specific setup code goes
rc = ESMF_SUCCESS
end subroutine InitModel
The standard subroutine signature of an ESMF method
Three method types
Initialize, run, and finalize methods are “wrappers” around model code. They have the same parameter lists.
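A minimal sketch of how these methods are typically registered in a component’s SetServices (RunModel and FinalizeModel are hypothetical names following the InitModel example above):

subroutine SetServices(comp, rc)
  type(ESMF_GridComp)  :: comp
  integer, intent(out) :: rc
  rc = ESMF_SUCCESS
  ! register the three standard methods with ESMF
  call ESMF_GridCompSetEntryPoint(comp, ESMF_METHOD_INITIALIZE, userRoutine=InitModel, rc=rc)
  call ESMF_GridCompSetEntryPoint(comp, ESMF_METHOD_RUN, userRoutine=RunModel, rc=rc)
  call ESMF_GridCompSetEntryPoint(comp, ESMF_METHOD_FINALIZE, userRoutine=FinalizeModel, rc=rc)
end subroutine SetServices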
Retain native infrastructure
ESMF is designed to coexist with native infrastructure. Extensive code refactoring is not required.
Minimal overhead
A 2010 performance study of NCAR’s Community Climate System Model 4 (CCSM4) showed ESMF component overhead around 3% or less.
A new report measuring NUOPC overhead is available.
ESMF Performance
General philosophy to support high performance coupled systems
Do-no-harm
Scale
Preserve locality
Components on disjoint processor sets:
Components with the same grid and decomposition:
Performance studies available at: https://www.earthsystemcog.org/projects/esmf/performance
Parallel Programming Model
All ESMF data types are designed for high-performance computing
9
ESMF Application
Component resources
The Virtual Machine (VM) class manages a component’s computational resources.
The basic elements contained in a VM are Persistent Execution Threads (PETs). A PET typically maps to an MPI process (task), but may also map to a Pthread.
Flexible mapping to resources
PETs can be assigned to components flexibly, allowing concurrent or sequential execution. The ESMF VM has recently been extended to recognize heterogeneous resources (e.g., CPU and GPU devices).
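As an illustration, a component can query its VM for its local PET and the total PET count (a minimal sketch; variable names are hypothetical):

type(ESMF_VM) :: vm
integer :: localPet, petCount
! get the component's VM and query its resources
call ESMF_GridCompGet(comp, vm=vm, rc=rc)
call ESMF_VMGet(vm, localPet=localPet, petCount=petCount, rc=rc)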
VM
PET 0
PET 1
PET 2
PET 3
MPI Task 0
MPI Task 1
MPI Task 2
MPI Task 3
OS level
An ESMF component running on four PETs. In this example, each PET is mapped to an MPI task.
ESMF
Component
Import and Export States
How components share data
10
Components share data via import and export states
A state is a container for ESMF data types that wrap native model data. Model data can be referenced by a pointer, avoiding copying operations.
Metadata travels with coupling fields
Includes physical field name, underlying grid structure and coordinates, and parallel decomposition
ESMF handles data transfer
There are multiple communication options available for moving data from one component’s export state to another’s import state.
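As an illustration, a component can wrap a native array in a field by reference and place it in its export state (a minimal sketch; the grid, pointer, and field names are hypothetical):

real(ESMF_KIND_R8), pointer :: sstPtr(:,:)   ! points to the model's native SST array
type(ESMF_Field) :: field
! wrap the native data without copying and add it to the export state
field = ESMF_FieldCreate(grid, farrayPtr=sstPtr, name="sst", rc=rc)
call ESMF_StateAdd(exportState, (/field/), rc=rc)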
Component A
Import State
Export State
coupling fields
Component B
Import State
Export State
Parallel Communication Operations
Inter-model and intra-model communication
11
Sparse Matrix Multiply
Redistribution
Scatter/Gather
Halo
Regrid
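These operations follow a common precompute/apply pattern. A minimal sketch of a field redistribution (variable names are hypothetical; the regrid version of the pattern appears on a later slide):

type(ESMF_RouteHandle) :: routehandle
! precompute the communication pattern once, then reuse it every step
call ESMF_FieldRedistStore(srcField, dstField, routehandle=routehandle, rc=rc)
call ESMF_FieldRedist(srcField, dstField, routehandle=routehandle, rc=rc)
! release resources when done
call ESMF_FieldRedistRelease(routehandle, rc=rc)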
Getting Help with ESMF
Documentation, code samples, and other resources
12
All resources linked here - this is the best place to start
https://www.earthsystemcog.org/projects/esmf/
Comprehensive API documentation and description of each class
Installation/testing procedure and architectural overview
Example applications for learning ESMF. Includes basic demo codes and examples of using command line regridding.
Installing ESMF
How to download, configure, and build
13
Getting ESMF
Visit the ESMF downloads page for the latest public and internal releases as well as previous releases.
Installation
Installation instructions are available in the User’s Guide.
Requirements:
Optional:
LAPACK, NetCDF, parallel-NetCDF, Xerces
# set up environment
$ export ESMF_DIR=/home/user/esmf
$ export ESMF_COMM=openmpi
$ export ESMF_COMPILER=intel
$ export ESMF_INSTALL_PREFIX=/path/to/install
…

# build and install
$ make -j8
$ make check
$ make install
Typical installation procedure
ESMF Support List
General questions, technical support, and feature requests
14
esmf_support@list.woc.noaa.gov
Topics
Hurricane Irene/NASA GOES-13 satellite image/August 26, 2011
High-performance coupled modeling infrastructure
Parallel interpolation between model grids
Sharing models across coupled systems
Questions during tutorial?
https://tinyurl.com/esmf-questions-2018jan16
Supported Geometries
Grids, meshes, and observational data streams
16
Grid
A structured representation of a region, such as a logically rectangular tile or set of tiles
Mesh
An unstructured representation of a region including 2D polygons with any number of sides and 3D tetrahedra and hexahedra
LocStream
A set of disconnected points such as locations of observations
17
ESMF Regridding
Fast, flexible interpolation of gridded data
High-performance
Interpolation weight matrix is generated in parallel in 3D space and applied in parallel
Wide range of supported grids
Logically rectangular and unstructured grids in 2D and 3D, observational data streams (point cloud), global and regional grids, Cartesian and spherical coordinates
Multiple interpolation methods
Bilinear, higher-order patch recovery, nearest neighbor, first-order conservative, and second-order conservative (available in the next release)
Options
Masking, multiple pole treatments, straight or great circle distance measure
Multiple interfaces
Generation and Application of Regridding Weights
Fortran API for online regridding operation
18
! create source and destination grids and fields
srcGrid = ESMF_GridCreate(...)
dstGrid = ESMF_GridCreate(...)
srcField = ESMF_FieldCreate(srcGrid,...)
dstField = ESMF_FieldCreate(dstGrid,...)

! compute regrid weight matrix
call ESMF_FieldRegridStore(srcField, dstField, routehandle, ...)

! loop over time
do t=1,...
  ! compute new srcField
  ! apply regrid weight matrix
  call ESMF_FieldRegrid(srcField, dstField, routehandle, ...)
enddo

! release resources
call ESMF_FieldRegridRelease(routehandle, ...)
Regrid operation computed in two phases
The first phase computes an interpolation weight matrix which is efficiently stored in an ESMF_RouteHandle.
The weights only need to be computed once.
The second phase applies the weight matrix to a source field resulting in a destination field.
This same pattern is used for other operations such as redistribution and halo.
Typical code pattern for executing an ESMF communication operation. Once computed, a RouteHandle can be reused for multiple calls.
Regridding in Python with ESMPy
ESMPy is a Python interface to ESMF functionality
19
A Python API to ESMF regridding and related classes
Transforms data from one grid to another by generating and applying remapping weights.
Supports structured and unstructured, global and regional, 2D and 3D grids, created from file or in memory, with many options.
Fully parallel and highly scalable.
Visit the ESMPy home page for user documentation and installation instructions.
# ESMPy is imported as the ESMF package (the package name at the time of this tutorial)
import ESMF

# create a uniform global latlon grid from a SCRIP formatted file
grid = ESMF.Grid(filename=grid1, filetype=ESMF.FileFormat.SCRIP,
                 add_corner_stagger=True)

# create a field on the center stagger locations of the source grid
srcfield = ESMF.Field(grid, name='srcfield', staggerloc=ESMF.StaggerLoc.CENTER)

# create an ESMF formatted unstructured mesh with clockwise cells removed
mesh = ESMF.Mesh(filename=grid2, filetype=ESMF.FileFormat.ESMFMESH)

# create a field on the elements of the destination mesh
dstfield = ESMF.Field(mesh, name='dstfield', meshloc=ESMF.MeshLoc.ELEMENT)

# initialize the fields ...

# create an object to regrid data from the source to the destination field
regrid = ESMF.Regrid(srcfield, dstfield,
                     regrid_method=ESMF.RegridMethod.CONSERVE,
                     unmapped_action=ESMF.UnmappedAction.IGNORE)

# perform the regridding from source to destination field
dstfield = regrid(srcfield, dstfield)
File-based Regridding
ESMF utilities that run from the command line
20
ESMF_RegridWeightGen
Reads in two NetCDF grid files and outputs a NetCDF file containing interpolation weights.
Input formats: SCRIP, ESMFMESH, UGRID, GRIDSPEC
Output format: SCRIP weight file
Arguments select the interpolation method, pole treatment, regional grid handling, and whether to ignore unmapped points and degenerate cells.
$ mpirun -np 4 ./ESMF_RegridWeightGen \
    -s src.nc \      # source grid file
    -d dst.nc \      # destination grid file
    -m conserve \    # interpolation method
    -w w.nc          # output weight file
ESMF_Regrid
Generates and applies interpolation weights from a source to destination grid file.
Formats: GRIDSPEC for structured grids and UGRID for unstructured. Currently limited to 2D grids.
Arguments select the interpolation method, pole treatment, regional grid handling, and whether to ignore unmapped points and degenerate cells.
$ mpirun -np 4 ./ESMF_Regrid \
    -s simple_ugrid.nc \       # source file
    -d simple_gridspec.nc \    # destination file
    --src_var zeta \           # source variable
    --dst_var zeta             # destination variable
Non-conservative Regrid Methods
ESMF supports multiple interpolation algorithms
21
Bilinear
Higher-order patch recovery [1][2]
Nearest neighbor
Conservative Regrid Methods
ESMF supports multiple interpolation algorithms
22
First-order conservative
Second-order conservative (coming in 7.1 release)
Conservative Methods Example
Comparison of first- and second-order conservative regridding
Source grid:
10 degree uniform global
Destination grid:
2 degree uniform global
First-Order Conservative
Second-Order Conservative
Analytic field:
F = 2+cos(lon)^2 * cos(2*lat)
Regridding Performance
Strong scaling of different regrid methods
24
Source: cubed sphere grid (~25 million cells)
Destination: uniform latitude-longitude grid (~17 million cells)
Platform: IBM iDataPlex cluster (Yellowstone at NCAR)
ESMF Spatial Classes
Representing parallel decompositions of model grids and fields
25
Physical field types
Classes for wrapping your model’s physical variables
ESMF_Grid
represents logically rectangular regions
ESMF_Mesh
represents unstructured grids
ESMF_LocStream
represents sets of observational points
ESMF_Field
represents a physical model field
ESMF_FieldBundle
a collection of fields on the same grid
ESMF_XGrid
represents exchange grid at planetary boundary layer
ESMF_DistGrid
distributed, multi-dimensional index space
ESMF_DELayout
maps decomposition elements to processes (PETs)
ESMF_Array
index-based distributed data storage
Geometric types
Classes for representing discretized domain geometries
Index-space types
Classes for representing parallel decompositions of data arrays
Grid Creation Fortran APIs
Logically rectangular grids
26
Grid Specification
ESMF_GridCreateNoPeriDim()
No edge connections, e.g., a regional grid with closed boundaries
ESMF_GridCreate1PeriDim()
One periodic dimension and options for the poles (bipole and tripole spheres)
ESMF_GridCreate2PeriDim()
Two periodic dimensions, e.g., a torus, or a regional grid with doubly periodic boundaries
ESMF_GridCreate()
Several general APIs give control over topology and decomposition parameters
See Reference Manual for grid API details
Cubed Sphere Support
Options for representing cubed spheres in ESMF
27
There are two ways cubed spheres are supported in ESMF:
Both representations can be regridded to other ESMF geometry types (i.e. Grids, Meshes, and Location Streams).
Three new APIs to allow easier creation of cubed spheres in ESMF:
Grid Parallel Decomposition
Flexible specification of parallel memory layouts
28
DE 3
DE 0
DE 4
DE 1
DE 5
DE 2
type(ESMF_Grid) :: my2DGrid
my2DGrid = &
ESMF_GridCreateNoPeriDim(minIndex=(/1,1/), &
maxIndex=(/18,12/), regDecomp=(/2,3/), rc=rc)
DE 6
DE 3
DE 0
DE 7
DE 4
DE 1
DE 8
DE 5
DE 2
Regular Decomposition
type(ESMF_Grid) :: my2DGrid

my2DGrid = ESMF_GridCreateNoPeriDim( &
    countsPerDEDim1=(/3,5,10/), &
    countsPerDEDim2=(/2,7,3/), rc=rc)
Domains divided into decomposition elements (DEs).
Irregular Decomposition
Arbitrary lists of deBlocks can also be provided to match native data decompositions.
Distributing DEs to PETs
Decomposition elements are mapped to compute resources
29
DE 3
DE 0
DE 4
DE 1
DE 5
DE 2
One-to-one mapping
Typically, the number of DEs will match the number of PETs available to a component. This leads to mapping of one DE per PET.
PET 0
DE 0
Multiple DEs per PET allows threading
Arbitrary assignment of DEs to PETs is possible, e.g., if there are more DEs than PETs.
The same set of decomposition elements can be mapped to different PET layouts. Arbitrary layouts are possible using the ESMF_DELayout class.
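A minimal sketch of an explicit DE-to-PET mapping with ESMF_DELayout (the petMap values are illustrative, matching the layout where PET 3 holds DEs 3–5):

type(ESMF_DELayout) :: delayout
! map six DEs onto four PETs: DEs 0-2 on PETs 0-2, DEs 3-5 all on PET 3
delayout = ESMF_DELayoutCreate(petMap=(/0,1,2,3,3,3/), rc=rc)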
PET 1
DE 1
PET 2
DE 2
PET 3
DE 3
PET 4
DE 4
PET 5
DE 5
PET 0
DE 0
PET 1
DE 1
PET 2
DE 2
PET 3
DE 3
DE 4
DE 5
Create a Field on a Grid
Fields represent discretized model variables
30
! create a grid
grid = ESMF_GridCreateNoPeriDim(minIndex=(/1,1/), &
    maxIndex=(/10,20/), &
    regDecomp=(/2,2/), name="atmgrid", rc=rc)

! create a field from the grid and typekind
! this allocates memory for you
field1 = ESMF_FieldCreate(grid, typekind=ESMF_TYPEKIND_R4, &
    indexflag=ESMF_INDEX_DELOCAL, &
    staggerloc=ESMF_STAGGERLOC_CENTER, &
    name="pressure", rc=rc)

! get local bounds, assuming one local DE
call ESMF_FieldGet(field1, localDe=0, farrayPtr=farray2d, &
    computationalLBound=clb, computationalUBound=cub, &
    totalCount=ftc)

do i = clb(1), cub(1)
  do j = clb(2), cub(2)
    farray2d(i,j) = … ! computation over local DE
  enddo
enddo
An ESMF_Field wraps model variables
Fields are added to import and export states and can be transferred between components.
Many options
Code that creates an ESMF_Field on center stagger with local indexing. Memory is allocated by ESMF. The local bounds are retrieved.
Mesh Creation Fortran API
ESMF representation of unstructured grids
31
type(ESMF_Mesh) :: mesh
mesh = ESMF_MeshCreate(parametricDim=2, spatialDim=2, &
    nodeIds=nodeIds, &          ! 1d array of unique node ids
    nodeCoords=nodeCoords, &    ! 1d array of size spatialDim*nodeCount
    nodeOwners=nodeOwners, &    ! 1d array of PETs
    elementIds=elemIds, &       ! 1d array of unique element ids
    elementTypes=elemTypes, &   ! 1d array of element types
    elementConn=elemConn)       ! 1d array of corner node local indices
Supported geometries
Explicit creation of a mesh from lists of nodes, elements, and their connectivity.
The call is collective across PETs and each PET provides only local nodes/elements.
Parallel distribution by element
Nodes may be duplicated on multiple PETs but are owned by one PET.
Optional arguments
Grid and Mesh File Formats
ESMF supports several NetCDF metadata conventions
32
| File Format | Description | ESMF_Grid | ESMF_Mesh |
| ESMF_FILEFORMAT_SCRIP | Format used by the SCRIP regridding package | ✓ | ✓ |
| ESMF_FILEFORMAT_GRIDSPEC | CF GRIDSPEC convention for logically rectangular grids | ✓ | |
| ESMF_FILEFORMAT_UGRID | CF UGRID convention for unstructured grids | | ✓ |
| ESMF_FILEFORMAT_ESMFMESH | ESMF native unstructured mesh format | | ✓ |
Regrid Supported Options
Combinations of grids/meshes supported for regridding
33
All regrid methods supported between any pair of 2D grids and meshes
Bilinear and nearest neighbor supported between any pair of:
Conservative supported between any pair of:
ESMF_LocStream can be destination for bilinear, patch and nearest neighbor methods
Details of supported regrid combinations are available for each ESMF release.
Regrid Features in Upcoming Release (7.1.0)
34
Topics
Hurricane Irene/NASA GOES-13 satellite image/August 26, 2011
High-performance coupled modeling infrastructure
Parallel interpolation between model grids�
Sharing models across coupled systems
Questions during tutorial?
https://tinyurl.com/esmf-questions-2018jan16
Component Reuse across Coupled Systems
NUOPC provides a standard coupling interface for component interoperability across systems
36
NUOPC Layer
CESM
Navy global & regional
NASA ModelE & GEOS-5
Custom Coupling by Centers
...
NEMS
Applications
Diagram: each application assembles model components (ATM, OCN, ICE, WAV, LND, HYD, ION, CST, and others) that connect through the NUOPC Layer and ESMF.
ESMF
Provides generic utilities and data structures
NUOPC-Compliant Components
Each component has a standard interface so that it can technically connect to any NUOPC-based coupled system
Custom Coupling
Each coupled system includes a set of components and specific technical and scientific choices; includes custom drivers and mediators
NUOPC Layer
Provides generic components and technical rules to enable sharing of components across coupled systems
NUOPC-Compliant Components
NUOPC Interoperability Layer
National Unified Operational Prediction Capability
37
NUOPC is a software layer on top of ESMF that provides technical interoperability of model components so they can be shared across coupled systems.
Driver
A Driver has one or more child components and is responsible for coordinating their initialization sequence and driving them through a customizable run sequence.
Connector
A Connector performs standard communication operations, in parallel, between other components, such as grid remapping and redistribution of data. Connectors have a built-in field matching algorithm based on standard names.
Model
A Model “cap” wraps a geophysical model code with standard initialization and run methods so it can be plugged into a Driver.
Mediator
A Mediator contains custom coupling code such as flux calculations, accumulation/averaging, and merging of fields among several components.
NUOPC generic components
A NUOPC component is an ESMF component with specified rules of behavior depending on the component’s role in the coupled system.
Where do I go to change X?
Understand the role of each kind of component in a NUOPC application
38
Driver:
COUPLED WAVE
Model:
ATM
Mediator
Model:
ICE
Model:
OCN
Model:
WAVE
Driver (blue)
Model “cap” (yellow)
Connector (green)
Mediator (orange)
Component architecture of an example NUOPC-based application
Flexible Run Sequence Syntax
Simplify mapping of components to resources and execution ordering
# Run Sequence #
runSeq::
  @1800.0
    MED MedPhase_prep_ocn
    MED -> OCN :remapMethod=redist
    OCN
    @600.0
      MED MedPhase_prep_ice
      MED MedPhase_prep_atm
      MED -> ATM :remapMethod=redist
      MED -> ICE :remapMethod=redist
      ATM
      ICE
      ATM -> MED :remapMethod=redist
      ICE -> MED :remapMethod=redist
      MED MedPhase_atm_ocn_flux
      MED MedPhase_accum_fast
    @
    OCN -> MED :remapMethod=redist
  @
::
#####################################
# Run Time Configuration File       #
#####################################

# EARTH #
EARTH_component_list: MED ATM OCN ICE

# MED #
med_model:          nems
med_petlist_bounds: 60 65

# ATM #
atm_model:          fv3
atm_petlist_bounds: 0 31

# OCN #
ocn_model:          mom5
ocn_petlist_bounds: 32 55

# ICE #
ice_model:          cice
ice_petlist_bounds: 56 59
Colors show actions performed by:
(@) indicates coupling interval
Driver:
SEASONAL
Model:
ATM
Mediator
Model:
ICE
Model: OCN
As implemented in the NOAA Environmental Modeling System (NEMS)
Options for Explicit and Implicit Coupling
NUOPC Drivers have a dynamic, customizable run sequence
40
Different predictive timescales or problems require different types of coupling schemes.
Simple explicit coupling loop
All components exchange data at the same coupling interval.
A lagged scheme
Some components may run ahead or lag behind to satisfy numerical constraints.
Multiple timescales
Some components execute and communicate less frequently.
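As an illustration, a simple explicit coupling loop can be written in the run sequence syntax shown earlier (a minimal sketch; the 3600-second interval and two-component layout are hypothetical):

runSeq::
  @3600.0
    ATM -> OCN
    OCN -> ATM
    ATM
    OCN
  @
::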
Driver: SIMPLE LAGGED COUPLING
Component Concurrency
NUOPC components can run sequentially or concurrently
41
Concurrent execution
ATM and OCN components are assigned mutually exclusive sets of PETs allowing concurrent computation.
Sequential execution
ATM and OCN have overlapping PETs forcing sequential execution of components.
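In a NEMS-style run configuration (like the one shown earlier), this choice is controlled by the PET ranges assigned to each component; a minimal sketch with hypothetical bounds:

# disjoint PET ranges allow ATM and OCN to run concurrently
atm_petlist_bounds: 0 31
ocn_petlist_bounds: 32 63
# overlapping ranges (e.g., both set to 0 31) force sequential execution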
Interoperability of NUOPC Components
NUOPC components adhere to a set of technical compliance rules
42
Single public entry point
All NUOPC components have a single public subroutine called SetServices.
Model discretizations and fields are represented by ESMF data types
Domain discretizations are represented by one of ESMF’s geometric data types. Fields are represented as ESMF_Fields with standard metadata.
Standard names for coupling fields
The NUOPC Field Dictionary is a mechanism for ensuring that model physical fields have a recognized standard name. An initial set of field names is based on the CF conventions.
Standard initialization sequence
All components participate in an initialization sequence. The purpose is to:
Standard run sequence
The run sequence ensures that model clocks remain synchronized and that incoming coupling fields are timestamped at the expected model time.
Makefile fragment with build dependencies
Each component provides a makefile fragment with a small number of variables used for compiling and linking against the component in an external system.
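A minimal sketch of such a fragment, using the standard NUOPC dependency variables (the module, path, and library names are hypothetical):

# ocn.mk: makefile fragment exported by an OCN component build
# Fortran module that provides the component's SetServices
ESMF_DEP_FRONT     = ocn_comp_nuopc
# where the compiled module (.mod) files can be found
ESMF_DEP_INCPATH   = /path/to/ocn/include
# libraries and objects to link against
ESMF_DEP_LINK_OBJS = /path/to/ocn/libocn.a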
NUOPC “Caps”
Translation layer
43
Model Input Arrays
Model Output Arrays
Model
Clock
Model Grid/Mesh
Model execution subroutines:
Model_Init(), Model_Run(), Model_Finalize()
Import State
sea_surface_temperature
ocn_current_zonal
ocn_current_merid
Export State
sea_ice_temperature
Clock
start, current, stop, timestep
NUOPC execution phases and specialization points:
AdvertiseFields(), RealizeFields(), ModelAdvance()
ESMF Grid/Mesh
NUOPC Infrastructure (Driver, Mediator, Connector)
Physical Model Fortran code
NUOPC “Cap”
Subroutines in a “Cap”
Specializations are hooks for user code
44
A “cap” is implemented as a Fortran module with several subroutines containing user code
The subroutines are registered in SetServices based on pre-defined labels.
A “cap” specializes the generic NUOPC Model component with the details of a particular model
A specialization either provides an implementation not provided by NUOPC or overrides (replaces) a default behavior.
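A minimal sketch of a cap’s SetServices, deriving from the generic NUOPC Model and attaching one specialization (ModelAdvance is a hypothetical user subroutine; modelSS and model_label_Advance are renamed imports of SetServices and label_Advance from the NUOPC_Model module):

use ESMF
use NUOPC
use NUOPC_Model, modelSS => SetServices, model_label_Advance => label_Advance
...
subroutine SetServices(model, rc)
  type(ESMF_GridComp)  :: model
  integer, intent(out) :: rc
  rc = ESMF_SUCCESS
  ! derive from the generic NUOPC Model component
  call NUOPC_CompDerive(model, modelSS, rc=rc)
  ! attach user code to the pre-defined Advance label
  call NUOPC_CompSpecialize(model, specLabel=model_label_Advance, specRoutine=ModelAdvance, rc=rc)
end subroutine SetServices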
Specialization subroutines in a typical “cap”
Advertise fields (IPDv03p1)
Provide a list of import and export fields by standard name
Realize fields (IPDv03p3)
Provide a grid definition and create import and export fields
label_SetClock
Modify requested clock, e.g., to change timestep length
label_DataInitialize
Initialize fields in export state
label_Advance
Take a timestep
Before Building a “Cap”
Steps to making a model NUOPC compliant
45
Native initialize, run, and finalize subroutines
Basic control structures need to be in place to allow for top-down execution. The model run should return control after a given interval.
Independent build as a library
The model code should ideally be an independent codebase that can be compiled into its own library. This facilitates integration into external systems.
Typically the “cap” code is included with the rest of the model source as a driver/coupling option.
MPI considerations
NUOPC components execute on a subset of tasks from the global MPI communicator. References to MPI_COMM_WORLD need to be replaced with a communicator that will be passed to the component.
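For example, the component’s communicator can be obtained from the ESMF VM and passed to the native code in place of MPI_COMM_WORLD (a minimal sketch; my_model_init is a hypothetical native routine):

type(ESMF_VM) :: vm
integer :: mpiComm
! retrieve the MPI communicator associated with the component's PETs
call ESMF_GridCompGet(model, vm=vm, rc=rc)
call ESMF_VMGet(vm, mpiCommunicator=mpiComm, rc=rc)
call my_model_init(mpiComm)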
Regression testing
Develop and debug the cap with low-resolution cases. Have baseline results available for comparison. Bit-for-bit reproducibility against standalone model runs is typically possible.
Getting Help with NUOPC
Documentation, code samples, and other resources
46
Building a NUOPC Model how-to document
This document describes steps involved in creating a NUOPC “cap” and includes example code.
NUOPC Reference Manual
Detailed description of NUOPC design and public APIs.
Los Alamos Sea Ice NUOPC cap documentation
Look at an existing cap code and its documentation to guide creation of a new cap.
NUOPC Website and Prototype Codes
Each prototype is a small, skeleton application that demonstrates how the four kinds of NUOPC components can be assembled into different architectures.
NUOPC Tools
Tools help with writing code, runtime analysis, and debugging
47
Compliance Checker (internal)
Activated by an environment variable, the Compliance Checker intercepts all NUOPC phases and writes out extensive compliance diagnostics to the ESMF log files. Any compliance issues found are flagged as warnings.
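For example, the checker is switched on at run time with:

$ export ESMF_RUNTIME_COMPLIANCECHECK=ON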
Component Explorer (command line)
A generic Driver that links to a single NUOPC component and outputs information to standard out such as registered phases and import/export fields.
Cupid (GUI)
Cupid is an Eclipse-based development environment with special tooling for building and analyzing NUOPC applications.
https://www.earthsystemcog.org/projects/cupid/
Cupid: Profile for Load Balance
Built-in and user-defined timer regions
48
Trace of Navy’s COAMPS with coupled atmosphere-land-hydrology.
Cupid: Visualize Call Stack
NUOPC Call Stack View assists with post-run debugging and performance analysis
49
Trace of coupled NMMB-HYCOM application in the NOAA Environmental Modeling System (NEMS)
Cupid: Generate NUOPC Compliant Code
Generation of “cap” templates and skeleton applications
50
NUOPC “cap” code on the left with an outline of the “cap” code structure on the right.
Missing subroutines are indicated in red, and templates available for generation are in grey.
Thank You!
51
Questions welcome!
esmf_support@list.woc.noaa.gov
We would appreciate your feedback on this tutorial!
http://tinyurl.com/esmf-tutorial-eval
References
52
[1] Khoei S.A., Gharehbaghi A.R. The superconvergent patch recovery technique and data transfer operators in 3D plasticity problems. Finite Elements in Analysis and Design, 43(8), 2007.
[2] Hung K.C., Gu H., Zong Z. A modified superconvergent patch recovery method and its application to large deformation problems. Finite Elements in Analysis and Design, 40(5-6), 2004.
Extra Slides
Look at a “Cap” Subroutine
Hook for user code that advertises coupling fields
54
subroutine InitializeP1(model, importState, exportState, clock, rc)
  type(ESMF_GridComp)  :: model
  type(ESMF_State)     :: importState, exportState
  type(ESMF_Clock)     :: clock
  integer, intent(out) :: rc

  rc = ESMF_SUCCESS

  ! importable field: sea_surface_temperature
  call NUOPC_Advertise(importState, &
    StandardName="sea_surface_temperature", name="sst", &
    TransferOfferGeomObject="will provide", rc=rc)
  if (ESMF_LogFoundError(rcToCheck=rc, msg=ESMF_LOGERR_PASSTHRU, &
    line=__LINE__, file=__FILE__)) &
    return ! bail out

  ! exportable field: surface_net_downward_shortwave_flux
  call NUOPC_Advertise(exportState, &
    StandardName="surface_net_downward_shortwave_flux", name="rsns", &
    TransferOfferGeomObject="will provide", rc=rc)
  if (ESMF_LogFoundError(rcToCheck=rc, msg=ESMF_LOGERR_PASSTHRU, &
    line=__LINE__, file=__FILE__)) &
    return ! bail out

end subroutine
A cap subroutine responsible for “advertising” coupling fields
There are two calls to NUOPC_Advertise. The first one advertises an import field, “sea_surface_temperature,” and the second an export field.
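In a later initialization phase the cap “realizes” the advertised fields by supplying a geometry and creating the actual field objects; a minimal sketch (the grid variable is hypothetical):

! realize the advertised import field on the model grid
field = ESMF_FieldCreate(grid, typekind=ESMF_TYPEKIND_R8, name="sst", rc=rc)
call NUOPC_Realize(importState, field=field, rc=rc)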
Flexible Configurations of NUOPC Components
Drivers, Models, Connectors and Mediators
55
Driver:
SIMPLE
Model:
ATM
Model:
OCN
Coupled system with a Driver, two Model components, and two Connectors
This configuration creates a coupled system that allows a two-way feedback loop between ATM and OCN.
A Driver with four Models and a Mediator
The OCN and WAVE components communicate directly while other components receive data only after processing by the Mediator.
The OCN component is hierarchical with an embedded driver for components representing subprocesses.
Driver:
COUPLED WAVE
Model:
ATM
Mediator
Model:
ICE
Model:
OCN
Model:
WAVE
Cupid Development Environment
A NUOPC plugin for the Eclipse Integrated Development Environment
56
Understand coupled system state over time
Enhanced search and filtering of ESMF log files
Visualizations assist in application debugging
Automated code generation and compliance checking
Import model source code
A screenshot of Cupid tools inside the Eclipse IDE.