Quickstart Guide: FMS atmospheric dynamical cores

Amy Langenhorst

Abstract

This document describes how to acquire, compile, and run specific test cases of four solo FMS atmospheric dynamical core models. The available codes are a finite-difference B-grid model and a spectral model, both running the Held-Suarez GCM benchmark, and two simpler spectral variations: a barotropic model and a shallow water model.

For more information, see the Jakarta Atmospheric Dynamical Core User's Guide which is included in this package and also accessible from the FMS Homepage.


Table of Contents

1. Acquire the Source Code
2. Run the Model
2.1. The Provided Sample Runscripts
2.2. Functionality of the Sample Runscripts
2.3. Portability Issues with the Sample Runscripts
2.4. Changing the Sample Runscripts
3. Examine the Output

1. Acquire the Source Code

The Flexible Modeling System development team at GFDL uses a local implementation of GForge to serve FMS software, located at http://fms.gfdl.noaa.gov. In order to obtain the source code, you must register as an FMS user on our software server. After submitting the registration form on the software server, you should receive an automatically generated confirmation email within a few minutes. Clicking on the link in the email confirms the creation of your account.

After your account has been created, you should log in and request access to the FMS Atmospheric Dynamical Cores project. Once the FMS project administrator grants you access, you will receive a second e-mail notification. This email requires action on the part of the project administrator and thus may take longer to arrive. The email will contain a software access password along with instructions for obtaining the release package, which are described below.

To check out the release package containing source code, scripts, and documentation via CVS, type the following commands into a shell window. You may wish to first create a directory called fms in which to run these commands. Enter the software access password when prompted by the cvs login command. The cvs login command reads the file ~/.cvspass; if this file does not already exist, an error message may appear and the login may fail. In that event, first create the file via touch ~/.cvspass.
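The optional preparation mentioned above can be sketched as follows (the fms directory name follows the suggestion in this guide; any location works):

```shell
# Optional prep before checkout: a working directory for the release,
# and the ~/.cvspass file that cvs login expects to find.
mkdir -p fms && cd fms
touch ~/.cvspass        # avoids a possible error from cvs login
```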

cvs -z3 -d:pserver:cvs@fms.gfdl.noaa.gov:/cvsroot/atm-dycores login
cvs -z3 -d:pserver:cvs@fms.gfdl.noaa.gov:/cvsroot/atm-dycores co -r jakarta atm_dycores

This will create a directory called atm_dycores in your current working directory containing the release package. The readme file in the atm_dycores directory gives a brief overview of the package's directory structure and contents.

If you prefer not to use CVS, you may download the tar file called atm_dycores.tar.gz from https://fms.gfdl.noaa.gov/projects/atm-dycores/. Sample output is also available there for download. See Section 3 for more information on the sample output.

2. Run the Model

2.1. The Provided Sample Runscripts

This release includes four run scripts in the directory atm_dycores/scripts. Each runs one of the following four atmospheric dynamical core models:

  • a finite-difference B-grid model running the Held-Suarez GCM benchmark,

  • a spectral model running the Held-Suarez GCM benchmark,

  • a spectral barotropic model, and

  • a spectral shallow water model.

2.2. Functionality of the Sample Runscripts

These runscripts perform the minimum required steps to run the models and are intended only as a starting point for the development of more practical run scripts. The scripts should be executed from the atm_dycores/scripts directory. Each of these sample scripts:

  • compiles the mppnccombine executable for multiprocessing platforms,

  • compiles and links the model source code,

  • creates a working directory where the model will be run,

  • creates or copies the required input data into the working directory, and

  • runs the model.

Note that the directory and file paths in the scripts are variables. They are initially set to match the directory structure as it exists upon CVS checkout, and can be changed to accommodate a different layout.
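As an outline, the steps above amount to the following skeleton (the variable names match those used in this guide; the compile and run steps are shown as comments because their exact form is platform-dependent, so this is an illustrative sketch, not a substitute for the provided scripts):

```shell
#!/bin/sh
# Illustrative outline of what each sample runscript does.
npes=1                       # number of processors ($npes)
execdir=./exec               # where the model is compiled ($execdir)
workdir=./workdir            # where the model is run ($workdir)
mkdir -p "$execdir" "$workdir/INPUT" "$workdir/RESTART"
# 1. compile the mppnccombine executable (multiprocessing platforms)
# 2. run mkmf with the platform template, then make fms.x in $execdir
# 3. create or copy the required input data into $workdir/INPUT
# 4. cd "$workdir" && mpirun -np $npes $execdir/fms.x
```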

The diagnostic fields output by the models are multithreaded. That is, each processor writes a separate file which includes data only from its own portion of the globe (its "domain"). A utility named mppnccombine is supplied which is executed after the model execution is complete and which combines these per-processor files into a single file covering the entire globe. For a complete description of mppnccombine see the mppnccombine documentation in the user's guide.

The output is not moved from the working directory; archiving of output is left to the user. The files needed to restart the model are left in the working directory's subdirectory called RESTART. To restart the model from this state, do the following:

  1. Move the files in $workdir/RESTART to $workdir/INPUT.
  2. The mppnccombine utility will not overwrite preexisting diagnostic field files in $workdir, so they must be moved or renamed before restarting.
  3. Comment out the if ( -e $workdir ) block in the runscript; this check otherwise prevents accidental reuse of the working directory.
  4. You can then execute the runscript again.
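Steps 1 and 2 above can be sketched as follows. The working-directory path and the file names atmos_model.res and atmos_daily.nc are hypothetical stand-ins created here for illustration; substitute the names your run actually produced:

```shell
# Illustrative restart shuffle; the fake files below stand in for a real run.
workdir=./workdir_demo
mkdir -p "$workdir/RESTART" "$workdir/INPUT"
touch "$workdir/RESTART/atmos_model.res"    # pretend restart file (hypothetical name)
touch "$workdir/atmos_daily.nc"             # pretend diagnostic file (hypothetical name)

mv "$workdir"/RESTART/* "$workdir"/INPUT/   # step 1: restart files become input
mv "$workdir/atmos_daily.nc" \
   "$workdir/atmos_daily.nc.prev"           # step 2: rename old diagnostics
```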

2.3. Portability Issues with the Sample Runscripts

If you encounter a compile error when executing the sample runscript, please first check whether you have correctly customized your mkmf template. The scripts use the mkmf utility, which creates make files to facilitate compilation. The mkmf utility uses a platform-specific template for setting up system and platform dependent parameters. Sample templates for various platforms are provided in the atm_dycores/bin directory. You may need to consult your system administrator to set up a compilation template for your platform and ensure the locations for system libraries are defined correctly. For a complete description of mkmf see the mkmf documentation. The $platform variable in the runscript is used to separate and identify platform-specific items in the runscript, including the mkmf template.
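For orientation, an mkmf template is essentially a makefile fragment assigning compiler and linker settings. A minimal sketch might look like the following; the compiler name, flags, and library list here are site-specific assumptions to adapt, not values from the release (the -Duse_libMPI and -lmpi entries correspond to the MPI items discussed in Section 2.4.2):

```
FC = f90                      # Fortran compiler
LD = f90                      # linker
CPPFLAGS = -Duse_libMPI       # preprocessor flags; drop for serial builds
FFLAGS = -O2                  # compile flags; add -I for your netCDF includes
LDFLAGS =
LIBS = -lnetcdf -lmpi         # drop -lmpi for serial builds
```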

Execution is accomplished with a utility called mpirun, which is specific to Silicon Graphics machines. This may need to be changed to run on other platforms.

2.4. Changing the Sample Runscripts

2.4.1. Changing the length of the run and atmospheric time step

By default the scripts are set up to run for only one or two days. The run length is controlled by the namelist main_nml, which is set directly in the runscripts for convenience. To increase the run length to 200 days, change the namelist parameter days in the runscript as follows. The other parameter in the namelist, dt_atmos, controls the atmospheric time step in seconds.

 &main_nml
     days   = 200,
     dt_atmos = 1800 /

2.4.2. Changing the number of processors

By default the scripts are set up to run with the MPI library, but only on one processor. To increase the number of processors, change the $npes variable at the top of the sample runscript. You may need to consult the documentation for each particular model concerning appropriate processor counts for that model.

To run without the MPI library, do the following:

  1. Make sure you are using only one processor, i.e., the variable $npes is set to 1 at the top of the sample runscript.
  2. Change the run command in the runscript from "mpirun -np $npes fms.x" to simply "fms.x".
  3. Remove the -Duse_libMPI from the mkmf line in the runscript.
  4. Remove the -lmpi from the $LIBS variable in your mkmf template.
  5. Move or remove your previous compilation directory (specified as $execdir in the runscript) so that all code must be recompiled.
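The change to the run command in step 2 is simply the removal of the MPI launcher prefix, as this illustration (using sed purely for demonstration) shows:

```shell
# The serial run line is the MPI run line minus the launcher prefix.
echo 'mpirun -np $npes fms.x' | sed 's/^mpirun -np $npes //'   # → fms.x
```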

3. Examine the Output

You may download sample output data for comparison at https://fms.gfdl.noaa.gov/projects/atm-dycores/. Each tar file expands to a directory containing a readme file along with NetCDF and ASCII output. The files bgrid_output.tar.gz and spectral_output.tar.gz contain daily snapshots of surface pressure through the 200-day spinup period and time means of all fields over the 200- to 1200-day period. The files barotropic_output.tar.gz and shallow_output.tar.gz contain thirty days of diagnostic output for the spectral barotropic model and the spectral shallow water model, respectively.