WRF on DARWIN
Currently WRF is not installed on DARWIN by the system administrators. It can be installed by users for personal use, or for use by their workgroups, by following the directions below. These directions walk through all of the steps required to install and compile WRF and WPS, explain how to add the WRF installation to your home or workgroup VALET directory, and finish with a trial run.
These directions are loosely based on NCAR's "How to compile WRF: The Complete Process" page, but differ from NCAR's in order to get better performance for WRF on DARWIN. The main difference is the MPI software: WRF defaults to MPICH, whereas Open MPI is the MPI software tightly integrated with Slurm on DARWIN. These directions therefore build WRF against Open MPI instead of MPICH for parallel processing.
Installing WRF
The directions below will walk you through downloading, installing, and compiling WRF, and creating a WRF VALET package.
Make sure you follow the directions CAREFULLY and make the necessary changes to reflect your personal environment. The directions and examples below are based on user traine in workgroup it_css. You will need to change all references to reflect your username and workgroup.
Set up the installation environment
Start by joining your workgroup
[traine@login00.darwin ~]$ workgroup -g it_css
[(it_css:traine)@login00.darwin ~]$
The next commands load the required VALET packages and set the necessary environment variables.
Use the VALET command vpkg_require to load the appropriate version of the openmpi package.
[(it_css:traine)@login00.darwin ~]$ vpkg_require openmpi/4.0.5:intel-2020
Adding dependency `intel/2020u4` to your environment
Adding package `openmpi/4.0.5:intel-2020` to your environment
Set the environment variables needed for compiling and installing the WRF packages.
[(it_css:traine)@login00.darwin ~]$ WRF_PREFIX=$WORKDIR/sw/wrf/20210121   ## MAKE SURE THAT THIS IS CHANGED TO YOUR WORK GROUP DIRECTORY OR HOME DIRECTORY ##
[(it_css:traine)@login00.darwin ~]$ mkdir -p "$WRF_PREFIX"
[(it_css:traine)@login00.darwin ~]$ WRF_SRC="${WRF_PREFIX}/src"
[(it_css:traine)@login00.darwin ~]$ WRF_BIN="${WRF_PREFIX}/bin"
[(it_css:traine)@login00.darwin ~]$ WRF_INC="${WRF_PREFIX}/include"
[(it_css:traine)@login00.darwin ~]$ WRF_LIB="${WRF_PREFIX}/lib"
[(it_css:traine)@login00.darwin ~]$ WRF_LIBRARIES_SRC="${WRF_SRC}/LIBRARIES"
[(it_css:traine)@login00.darwin ~]$ mkdir -p "$WRF_LIBRARIES_SRC"
[(it_css:traine)@login00.darwin ~]$ WRF_TESTS_SRC="${WRF_SRC}/TESTS"
[(it_css:traine)@login00.darwin ~]$ mkdir -p "$WRF_TESTS_SRC"
[(it_css:traine)@login00.darwin ~]$ export CC=icc
[(it_css:traine)@login00.darwin ~]$ export CFLAGS="-xHost"
[(it_css:traine)@login00.darwin ~]$ export CXX=icpc
[(it_css:traine)@login00.darwin ~]$ export CXXFLAGS="-xHost"
[(it_css:traine)@login00.darwin ~]$ export CPPFLAGS="-I${WRF_INC}"
[(it_css:traine)@login00.darwin ~]$ export FC=ifort
[(it_css:traine)@login00.darwin ~]$ export FCFLAGS="-I${WRF_INC} -xHost"
[(it_css:traine)@login00.darwin ~]$ export F77="$FC"
[(it_css:traine)@login00.darwin ~]$ export LDFLAGS="-L${WRF_LIB}"
[(it_css:traine)@login00.darwin ~]$ export PATH="${WRF_BIN}:$PATH"
[(it_css:traine)@login00.darwin ~]$ export MPICC="mpicc"
[(it_css:traine)@login00.darwin ~]$ export MPICXX="mpicxx"
[(it_css:traine)@login00.darwin ~]$ export MPIFC="mpifort"
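As an optional sanity check (not part of the original directions), you can confirm that the Intel compilers and the Open MPI wrappers referenced by these variables are on your PATH after loading the VALET package:
[(it_css:traine)@login00.darwin ~]$ which icc ifort mpicc mpifort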
Download and install required libraries
WRF requires several libraries for the installation. First, we use the environment variable WRF_LIBRARIES_SRC to change into the LIBRARIES directory, then we use the wget command to download the libraries, and finally we use the appropriate commands to install each library.
Download libraries
[(it_css:traine)@login00.darwin ~]$ cd "$WRF_LIBRARIES_SRC"
[(it_css:traine)@login00.darwin LIBRARIES]$ wget https://www2.mmm.ucar.edu/wrf/OnLineTutorial/compile_tutorial/tar_files/netcdf-4.1.3.tar.gz
[(it_css:traine)@login00.darwin LIBRARIES]$ wget https://www2.mmm.ucar.edu/wrf/OnLineTutorial/compile_tutorial/tar_files/jasper-1.900.1.tar.gz
[(it_css:traine)@login00.darwin LIBRARIES]$ wget https://www2.mmm.ucar.edu/wrf/OnLineTutorial/compile_tutorial/tar_files/libpng-1.2.50.tar.gz
[(it_css:traine)@login00.darwin LIBRARIES]$ wget https://www2.mmm.ucar.edu/wrf/OnLineTutorial/compile_tutorial/tar_files/zlib-1.2.7.tar.gz
Install netcdf library
[(it_css:traine)@login00.darwin LIBRARIES]$ tar -xf netcdf-4.1.3.tar.gz
[(it_css:traine)@login00.darwin LIBRARIES]$ cd netcdf-4.1.3
[(it_css:traine)@login00.darwin netcdf-4.1.3]$ ./configure --prefix="$WRF_PREFIX" --disable-dap --disable-netcdf-4 --disable-shared
[(it_css:traine)@login00.darwin netcdf-4.1.3]$ make -j 4
[(it_css:traine)@login00.darwin netcdf-4.1.3]$ make install
[(it_css:traine)@login00.darwin netcdf-4.1.3]$ cd ..
Install zlib library
[(it_css:traine)@login00.darwin LIBRARIES]$ tar -xf zlib-1.2.7.tar.gz
[(it_css:traine)@login00.darwin LIBRARIES]$ cd zlib-1.2.7
[(it_css:traine)@login00.darwin zlib-1.2.7]$ ./configure --prefix="$WRF_PREFIX"
[(it_css:traine)@login00.darwin zlib-1.2.7]$ make
[(it_css:traine)@login00.darwin zlib-1.2.7]$ make install
[(it_css:traine)@login00.darwin zlib-1.2.7]$ cd ..
Install libpng library
[(it_css:traine)@login00.darwin LIBRARIES]$ tar -xf libpng-1.2.50.tar.gz
[(it_css:traine)@login00.darwin LIBRARIES]$ cd libpng-1.2.50
[(it_css:traine)@login00.darwin libpng-1.2.50]$ ./configure --prefix="$WRF_PREFIX"
[(it_css:traine)@login00.darwin libpng-1.2.50]$ make -j 4
[(it_css:traine)@login00.darwin libpng-1.2.50]$ make install
[(it_css:traine)@login00.darwin libpng-1.2.50]$ cd ..
Install jasper library
[(it_css:traine)@login00.darwin LIBRARIES]$ tar -xf jasper-1.900.1.tar.gz
[(it_css:traine)@login00.darwin LIBRARIES]$ cd jasper-1.900.1
[(it_css:traine)@login00.darwin jasper-1.900.1]$ ./configure --prefix="$WRF_PREFIX"
[(it_css:traine)@login00.darwin jasper-1.900.1]$ make -j 4
[(it_css:traine)@login00.darwin jasper-1.900.1]$ make install
[(it_css:traine)@login00.darwin jasper-1.900.1]$ cd ..
Test library compatibility
Use the environment variable WRF_TESTS_SRC to change into the TESTS directory. Next, download the test files, compile them using the libraries you just installed, and verify the results. Remember that executables should typically be run on compute nodes; compiling, however, must be done either on the login (head) node or in an interactive session requested with salloc --partition=devel. Since these tests compile and run quickly and use only minimal resources to verify the correct installation of the libraries, we will use the login (head) node for this demonstration.
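For example, if you would rather compile and run these tests on a compute node instead of the login node, an interactive session on the devel partition could be requested as sketched below (type exit when you are finished to release the allocation):
[(it_css:traine)@login00.darwin ~]$ salloc --partition=devel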
[(it_css:traine)@login00.darwin LIBRARIES]$ cd "$WRF_TESTS_SRC"
[(it_css:traine)@login00.darwin TESTS]$
Download and extract test file
[(it_css:traine)@login00.darwin TESTS]$ wget https://www2.mmm.ucar.edu/wrf/OnLineTutorial/compile_tutorial/tar_files/Fortran_C_NETCDF_MPI_tests.tar
[(it_css:traine)@login00.darwin TESTS]$ tar -xf Fortran_C_NETCDF_MPI_tests.tar
Compile and run test (serial)
[(it_css:traine)@login00.darwin TESTS]$ $FC -c 01_fortran+c+netcdf_f.f $FCFLAGS $LDFLAGS
[(it_css:traine)@login00.darwin TESTS]$ $CC $CPPFLAGS -c 01_fortran+c+netcdf_c.c $CFLAGS $LDFLAGS
[(it_css:traine)@login00.darwin TESTS]$ $FC 01_fortran+c+netcdf_f.o 01_fortran+c+netcdf_c.o $FCFLAGS $LDFLAGS -lnetcdff -lnetcdf
[(it_css:traine)@login00.darwin TESTS]$ ./a.out
Compile and run test (MPI)
[(it_css:traine)@login00.darwin TESTS]$ $MPIFC -c 02_fortran+c+netcdf+mpi_f.f $FCFLAGS $LDFLAGS
[(it_css:traine)@login00.darwin TESTS]$ $MPICC $CPPFLAGS -c 02_fortran+c+netcdf+mpi_c.c $CFLAGS $LDFLAGS
[(it_css:traine)@login00.darwin TESTS]$ $MPIFC 02_fortran+c+netcdf+mpi_f.o 02_fortran+c+netcdf+mpi_c.o $FCFLAGS $LDFLAGS -lnetcdff -lnetcdf
[(it_css:traine)@login00.darwin TESTS]$ mpirun -np 2 ./a.out
Build WRF
Download WRF source code
[(it_css:traine)@login00.darwin TESTS]$ cd "$WRF_SRC"
[(it_css:traine)@login00.darwin src]$ wget https://www2.mmm.ucar.edu/wrf/src/WRFV4.0.TAR.gz
[(it_css:traine)@login00.darwin src]$ tar -xf WRFV4.0.TAR.gz
[(it_css:traine)@login00.darwin src]$ cd WRF
[(it_css:traine)@login00.darwin WRF]$
Configure WRF serial build
Configure the serial build. You will want to select option 64 and nesting option 0.
[(it_css:traine)@login00.darwin WRF]$ export NETCDF="$WRF_PREFIX"
[(it_css:traine)@login00.darwin WRF]$ ./configure
[(it_css:traine)@login00.darwin WRF]$ ./compile -j 4 em_real
[(it_css:traine)@login00.darwin WRF]$ mkdir -p "$WRF_BIN"
[(it_css:traine)@login00.darwin WRF]$ install --target-directory="$WRF_BIN" --mode=0775 main/*.exe
Configure WRF parallel build
Configure the parallel build. You will want to select option 66 and nesting option 1.
[(it_css:traine)@login00.darwin WRF]$ ./clean -a
[(it_css:traine)@login00.darwin WRF]$ ./configure
Apply required patch for OpenMPI flags
The patch command used with a here-document (<<EOT) does not display the bash prompt after it is entered; instead it displays >, because it is waiting for the patch lines to be entered at the terminal until EOT is typed. You may find it best to copy and paste the lines below to avoid typing mistakes, since the spacing is very important and the patch must be entered exactly as shown.
[(it_css:traine)@login00.darwin WRF]$ patch -p1 <<EOT
--- A/configure.wrf 2020-12-10 14:06:01.907649095 -0500
+++ B/configure.wrf 2020-12-10 14:40:00.791338460 -0500
@@ -118,8 +118,8 @@
 SFC = ifort
 SCC = icc
 CCOMP = icc
-DM_FC = mpif90 -f90=\$(SFC)
-DM_CC = mpicc -cc=\$(SCC)
+DM_FC = mpif90
+DM_CC = mpicc
 FC = time \$(DM_FC)
 CC = \$(DM_CC) -DFSEEKO64_OK
 LD = \$(FC)
@@ -140,7 +140,7 @@
 BYTESWAPIO = -convert big_endian
 RECORDLENGTH = -assume byterecl
 FCBASEOPTS_NO_G = -ip -fp-model precise -w -ftz -align all -fno-alias \$(FORMAT_FREE) \$(BYTESWAPIO) -xHost -fp-model fast=2 -no-heap-arrays -no-prec-div -no-prec-sqrt -fno-common -maxvx2
-FCBASEOPTS = \$(FCBASEOPTS_NO_G) \$(FCDEBUG)
+FCBASEOPTS = \$(FCBASEOPTS_NO_G) \$(FCDEBUG) -assume nocc_omp
 MODULE_SRCH_FLAG =
 TRADFLAG = -traditional-cpp
 CPP = /lib/cpp -P -nostdinc
EOT
Compile WRF parallel build
[(it_css:traine)@login00.darwin WRF]$ ./compile -j 4 em_real
[(it_css:traine)@login00.darwin WRF]$ mkdir -p "$WRF_BIN"
[(it_css:traine)@login00.darwin WRF]$ for exe in main/*.exe; do
    # Install each exe to WRF_BIN with the prefix "mpi_" on it
    # to differentiate from the serial variants:
    WRF_ROOT="$(echo $exe | sed -e 's/main\///')"
    install --mode=0775 "$exe" "${WRF_BIN}/mpi_${WRF_ROOT}"
done
[(it_css:traine)@login00.darwin WRF]$
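At this point it can be helpful to confirm that both the serial and the mpi_-prefixed executables were installed; a simple listing of the bin directory (the exact file names depend on your build) is enough:
[(it_css:traine)@login00.darwin WRF]$ ls "$WRF_BIN"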
Build WPS
Download WPS source code
[(it_css:traine)@login00.darwin WRF]$ cd "$WRF_SRC"
[(it_css:traine)@login00.darwin src]$ wget https://www2.mmm.ucar.edu/wrf/src/WPSV4.0.TAR.gz
[(it_css:traine)@login00.darwin src]$ tar -xf WPSV4.0.TAR.gz
[(it_css:traine)@login00.darwin src]$ cd WPS
[(it_css:traine)@login00.darwin WPS]$
- Unlike WRF, WPS only has a serial build and currently does not have a parallel build.
- The compile steps can take quite a long time to complete. Use caution and make sure that you select the correct options as directed below.
Configure and compile WPS serial build
Configure the serial build. You will want to select option 17.
[(it_css:traine)@login00.darwin WPS]$ export JASPERLIB="$WRF_LIB"
[(it_css:traine)@login00.darwin WPS]$ export JASPERINC="$WRF_INC"
[(it_css:traine)@login00.darwin WPS]$ ./configure
[(it_css:traine)@login00.darwin WPS]$ ./compile
[(it_css:traine)@login00.darwin WPS]$ install --target-directory="$WRF_BIN" --mode=0775 *.exe
Now clean and reconfigure, this time selecting option 19.
[(it_css:traine)@login00.darwin WPS]$ ./clean -a
[(it_css:traine)@login00.darwin WPS]$ ./configure
[(it_css:traine)@login00.darwin WPS]$ ./compile
[(it_css:traine)@login00.darwin WPS]$ for exe in *.exe; do
    # Install each exe to WRF_BIN with the prefix "mpi_" on it
    # to differentiate from the serial variants:
    install --mode=0775 "$exe" "${WRF_BIN}/mpi_${exe}"
done
Create a VALET package for WRF
If you plan to be the only person running WRF, or you are installing a custom version of WRF, it is suggested that you add the WRF VALET package file to the $HOME/.valet directory. If you are installing WRF for multiple people in your workgroup to run, then you should add the WRF VALET package file to the $WORKDIR/sw/valet directory. You do not need to add it in both places.
Adding WRF VALET package in HOME directory
[(it_css:traine)@login00.darwin WRF]$ mkdir -p ~/.valet   # This step is only required if you don't have a .valet directory already created
[(it_css:traine)@login00.darwin WRF]$ cat <<EOT > ~/.valet/wrf.vpkg_yaml
#
# WRF - Weather Research and Forecasting model
#
wrf:
    prefix: $(dirname "$WRF_PREFIX")
    description: Weather Research and Forecasting model
    url: "https://www.mmm.ucar.edu/weather-research-and-forecasting-model"
    versions:
        "$(basename "$WRF_PREFIX")":
            description: 2020-12-10 build with Intel compilers and Open MPI
            dependencies:
                - openmpi/4.0.5:intel-2020
EOT
[(it_css:traine)@login00.darwin WRF]$
Adding WRF VALET package in workgroup directory
As mentioned at the top of this page, these directions are based on user traine in workgroup it_css. In these next steps we are going to add the WRF VALET package in the it_css workgroup directory. The location of the VALET directory might differ in your workgroup directory, or you might not yet have a VALET directory in your workgroup. If you do not have a VALET directory, we suggest that you follow the directory structure we use in the it_css workgroup. Not only will it make it easier to follow these directions, but it will also make it easy to add more software packages and versions in the future.
[(it_css:traine)@login00.darwin ~]$ cd $WORKDIR/sw
[(it_css:traine)@login00.darwin sw]$ mkdir -p valet   # This step is only required if you don't have a valet directory already created
[(it_css:traine)@login00.darwin sw]$ cat <<EOT > valet/wrf.vpkg_yaml
#
# WRF - Weather Research and Forecasting model
#
wrf:
    prefix: $(dirname "$WRF_PREFIX")
    description: Weather Research and Forecasting model
    url: "https://www.mmm.ucar.edu/weather-research-and-forecasting-model"
    versions:
        "$(basename "$WRF_PREFIX")":
            description: 2020-12-10 build with Intel compilers and Open MPI
            dependencies:
                - openmpi/4.0.5:intel-2020
EOT
[(it_css:traine)@login00.darwin ~]$
Check installation
If all of the prior steps completed successfully, you are now ready to check that you can load WRF with VALET and then find the WRF executables.
Clear environment and load WRF VALET package
[(it_css:traine)@login00.darwin ~]$ vpkg_rollback all
[(it_css:traine)@login00.darwin ~]$ vpkg_require wrf
Adding dependency `intel/2020u4` to your environment
Adding dependency `openmpi/4.0.5:intel-2020` to your environment
Adding package `wrf/20210113` to your environment
Check WRF executables
[(it_css:traine)@login00.darwin ~]$ which real.exe
/work/it_css/sw/WRF/20210113/bin/real.exe
[(it_css:traine)@login00.darwin ~]$ which mpi_real.exe
/work/it_css/sw/WRF/20210113/bin/mpi_real.exe
Run WRF and WPS outside SRC directories
By default, WRF and WPS are set up to run in their respective SRC directories. This is not a good idea: it can cause serious problems for the WRF installation and it also prevents the WRF software from being run by multiple people at the same time. The following steps will show you how to set up a safe environment to run WRF and WPS from a directory outside the SRC directories.
Getting the example_wrf.tar file
By default WPS/WRF is set up to run from its installed SRC directory. This is not ideal; running WPS/WRF from a directory outside the install directory is much safer and helps protect against accidental modification or deletion of the originally installed source files. To set up WPS and WRF to run their respective programs outside the installed SRC directory, we have created an example called example_wrf.tar, available in trainf's home directory, to copy into the location where you plan to download the data files and run all the WRF programs. For this demo, the example_wrf.tar file will be copied into $WORKDIR/traine/wrf.
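These directions assume that this run directory already exists and that you have changed into it. If not, it can be created first using the example location above (substitute your own username and workgroup):
[(it_css:traine)@login00.darwin ~]$ mkdir -p $WORKDIR/traine/wrf
[(it_css:traine)@login00.darwin ~]$ cd $WORKDIR/traine/wrf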
[(it_css:traine)@login00.darwin wrf]$ cp /home/1200/wrf/example_wrf.tar .
Set up WPS
Now that you have the example_wrf.tar file where you want it, you can extract it and start to set up the WPS programs.
[(it_css:traine)@login00.darwin wrf]$ tar -xvf example_wrf.tar
example_wrf/
example_wrf/wrf_job/
example_wrf/wrf_job/Makefile
example_wrf/wrf_job/wrfmpi.qs
example_wrf/wrf_job/namelist.input
example_wrf/data/
example_wrf/data/WPS_GEOG/
example_wrf/data/GFS/
example_wrf/data/GFS/modelDataDownload.sh
example_wrf/data/wps_geo_download.sh
example_wrf/wps_job/
example_wrf/wps_job/namelist.wps
example_wrf/wps_job/Makefile
Now you need to download the Static Geography Data and uncompress it using the wps_geo_download.sh script provided in this example. This is a fairly large set of files (~29GB) and could take some time to download and uncompress.
[(it_css:traine)@login00.darwin wrf]$ cd example_wrf/data
[(it_css:traine)@login00.darwin data]$ ./wps_geo_download.sh
...MANY LINES AND MINUTES LATER...
If this is successful, then it is suggested that you remove the geog_high_res_mandatory.tar file.
[(it_css:traine)@login00.darwin data]$ rm geog_high_res_mandatory.tar
Now we will download some GFS model data. You will need to update the modelDataDownload.sh script provided in the example with the appropriate dates for the data available from NOAA's FTP site. The directory should be formatted something like gfs.YYYYMMDD. In this example, we will be using gfs.20210107.
[(it_css:traine)@login00.darwin data]$ cd GFS
In the modelDataDownload.sh file you will want to update yyyy, mm, dd, hh, fff, and model to match the date and model data that you are trying to download. This example uses 01/07/2021 for the 00Z run of the GFS model. You also need to put in your email address.
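As an illustration only, the edited portion of modelDataDownload.sh might look something like the sketch below for this example. The variable names come from the directions above, but the actual layout of the script may differ, so edit the copy provided in the example rather than copying this verbatim:
# Hypothetical excerpt of modelDataDownload.sh -- check the provided script for the real layout
yyyy=2021                     # year of the model run
mm=01                         # month
dd=07                         # day
hh=00                         # model cycle (00Z run)
fff=384                       # forecast hour(s) to download (assumption)
model=gfs                     # model to download
email="your_email@udel.edu"   # replace with your own email address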
After making the necessary changes, you can run the modelDataDownload.sh script. To do this you will use sbatch and the download.qs script. The download.qs script will run the modelDataDownload.sh script on a compute node to download all the requested model data.
[(it_css:traine)@login00.darwin GFS]$ sbatch download.qs
[(it_css:traine)@login00.darwin GFS]$ ls
gfs_20210107_00z_000f  gfs_20210107_00z_078f  gfs_20210107_00z_156f  gfs_20210107_00z_234f  gfs_20210107_00z_312f
gfs_20210107_00z_003f  gfs_20210107_00z_081f  gfs_20210107_00z_159f  gfs_20210107_00z_237f  gfs_20210107_00z_315f
gfs_20210107_00z_006f  gfs_20210107_00z_084f  gfs_20210107_00z_162f  gfs_20210107_00z_240f  gfs_20210107_00z_318f
gfs_20210107_00z_009f  gfs_20210107_00z_087f  gfs_20210107_00z_165f  gfs_20210107_00z_243f  gfs_20210107_00z_321f
gfs_20210107_00z_012f  gfs_20210107_00z_090f  gfs_20210107_00z_168f  gfs_20210107_00z_246f  gfs_20210107_00z_324f
...                    ...                    ...                    ...                    ...
gfs_20210107_00z_072f  gfs_20210107_00z_150f  gfs_20210107_00z_228f  gfs_20210107_00z_306f  gfs_20210107_00z_384f
gfs_20210107_00z_075f  gfs_20210107_00z_153f  gfs_20210107_00z_231f  gfs_20210107_00z_309f
To prevent an issue later down the road, we will move the modelDataDownload.sh and download.qs scripts up one directory into the data directory.
[(it_css:traine)@login00.darwin GFS]$ mv modelDataDownload.sh download.qs ../
Now we have the required static geography data and sample GFS model data downloaded to run WPS and WRF.
Example WPS run
To avoid running executables on the login node, we are going to request an interactive compute node session using salloc.
[(it_css:traine)@login00.darwin GFS]$ salloc --partition=it_css
salloc: Granted job allocation 11715984
salloc: Waiting for resource configuration
salloc: Nodes r00n49 are ready for job
[traine@r00n49 GFS]$
Now we are ready to process the downloaded GFS data with the WPS programs for our example. To do this we need to change into the wps_job directory, then use the make command to gather the required files that you will need to run WPS.
[(it_css:traine)@r1n00 data]$ cd ../../wps_job/
[(it_css:traine)@r1n00 wps_job]$ make
cp /work/it_css/sw/WRF/20210113/src/WPS/geogrid/GEOGRID.TBL GEOGRID.TBL
cp /work/it_css/sw/WRF/20210113/src/WPS/metgrid/METGRID.TBL METGRID.TBL
cp /work/it_css/sw/WRF/20210113/src/WPS/ungrib/Variable_Tables/Vtable.GFS Vtable
cp /work/it_css/sw/WRF/20210113/src/WPS/link_grib.csh link_grib.csh
After that is completed you will need to update the namelist.wps file to reflect the model data that you downloaded in the prior steps. Use nano or vim and make the necessary changes noted with #EDIT THIS LINE and #ADD THIS LINE.
&share
 wrf_core = 'ARW',
 max_dom = 1,
 start_date = '2021-01-07_00:00:00','2006-08-16_12:00:00',   #EDIT THIS LINE
 end_date = '2021-01-23_23:00:00','2006-08-16_12:00:00',     #EDIT THIS LINE
 interval_seconds = 10800                                    #EDIT THIS LINE
 io_form_geogrid = 2,
/

&geogrid
 parent_id = 1, 1,
 parent_grid_ratio = 1, 3,
 i_parent_start = 1, 31,
 j_parent_start = 1, 17,
 e_we = 10, 112,                                             #EDIT THIS LINE
 e_sn = 10, 97,                                              #EDIT THIS LINE
 ...
 geog_data_res = 'default','default',
 dx = 10000,                                                 #EDIT THIS LINE
 dy = 10000,                                                 #EDIT THIS LINE
 map_proj = 'lambert',
 ref_lat = 38.72,                                            #EDIT THIS LINE
 ref_lon = -75.08,                                           #EDIT THIS LINE
 truelat1 = 38.72,                                           #EDIT THIS LINE
 truelat2 = 38.72,                                           #EDIT THIS LINE
 stand_lon = -75.08,                                         #EDIT THIS LINE
 geog_data_path = '../data/WPS_GEOG/',                       #EDIT THIS LINE
 opt_geogrid_tbl_path = './'                                 #OPTIONAL EDIT THIS LINE
 ...

&metgrid
 fg_name = 'FILE'
 io_form_metgrid = 2,
 opt_metgrid_tbl_path = './'                                 #ADD THIS LINE
/
Now you are ready to run geogrid.exe
[traine@r00n49 wps_job]$ geogrid.exe
Parsed 28 entries in GEOGRID.TBL
Processing domain 1 of 1
 Processing XLAT and XLONG
 Processing MAPFAC
 Processing F and E
 Processing ROTANG
 Processing LANDUSEF
 Calculating landmask from LANDUSEF ( WATER = 17 21 )
 Processing HGT_M
 Processing SOILTEMP
 Processing SOILCTOP
 Processing SCT_DOM
 Processing SOILCBOT
 Processing SCB_DOM
 Processing ALBEDO12M
 Processing GREENFRAC
 Processing LAI12M
 Processing SNOALB
 Processing CON
 Processing VAR
 Processing OA1
 Processing OA2
 Processing OA3
 Processing OA4
 Processing OL1
 Processing OL2
 Processing OL3
 Processing OL4
 Processing VAR_SSO
 Optional fields not processed by geogrid:
    LAKE_DEPTH (priority=1, resolution='default', path='../data/WPS_GEOG/lake_depth/')
    URB_PARAM (priority=1, resolution='default', path='../data/WPS_GEOG/NUDAPT44_1km/')
    FRC_URB2D (priority=1, resolution='default', path='../data/WPS_GEOG/urbfrac_nlcd2011/')
    IMPERV (priority=1, resolution='default', path='../data/WPS_GEOG/nlcd2011_imp_ll_9s/')
    CANFRA (priority=1, resolution='default', path='../data/WPS_GEOG/nlcd2011_can_ll_9s/')
    EROD (priority=1, resolution='default', path='../data/WPS_GEOG/erod/')
    CLAYFRAC (priority=1, resolution='default', path='../data/WPS_GEOG/clayfrac_5m/')
    SANDFRAC (priority=1, resolution='default', path='../data/WPS_GEOG/sandfrac_5m/')
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
! Successful completion of geogrid.         !
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
Next run the link_grib.csh script and make sure to pass the correct path to where you are storing the downloaded GFS data.
[(it_css:traine)@r1n00 wps_job]$ ./link_grib.csh ../data/GFS/
This script will create soft links to the GFS data files in the current directory. You should see something like this:
[(it_css:traine)@r1n00 wps_job]$ ls -l GRIB*
lrwxrwxrwx 1 traine it_css 33 Jan 26 13:59 GRIBFILE.AAA -> ../data/GFS/gfs_20210107_00z_000f
lrwxrwxrwx 1 traine it_css 33 Jan 26 13:59 GRIBFILE.AAB -> ../data/GFS/gfs_20210107_00z_003f
lrwxrwxrwx 1 traine it_css 33 Jan 26 13:59 GRIBFILE.AAC -> ../data/GFS/gfs_20210107_00z_006f
lrwxrwxrwx 1 traine it_css 33 Jan 26 13:59 GRIBFILE.AAD -> ../data/GFS/gfs_20210107_00z_009f
lrwxrwxrwx 1 traine it_css 33 Jan 26 13:59 GRIBFILE.AAE -> ../data/GFS/gfs_20210107_00z_012f
lrwxrwxrwx 1 traine it_css 33 Jan 26 13:59 GRIBFILE.AAF -> ../data/GFS/gfs_20210107_00z_015f
lrwxrwxrwx 1 traine it_css 33 Jan 26 13:59 GRIBFILE.AAG -> ../data/GFS/gfs_20210107_00z_018f
lrwxrwxrwx 1 traine it_css 33 Jan 26 13:59 GRIBFILE.AAH -> ../data/GFS/gfs_20210107_00z_021f
...
Next run ungrib.exe
[traine@r00n49 wps_job]$ ungrib.exe
Depending on the amount of data, this could take several minutes. If successful, then you should get a message like this at the end.
... #MANY LINES & MINUTES LATER
********** Done deleting temporary files. **********
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
! Successful completion of ungrib. !
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
The last step is to run metgrid.exe
[(it_css:traine)@r1n00 wps_job]$ metgrid.exe
... #MANY LINES & MINUTES LATER
Processing 2021-01-13_21 FILE
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
! Successful completion of metgrid. !
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
Now you can end the salloc session and release the compute node
[(it_css:traine)@r1n00 wps_job]$ exit
exit
[(it_css:traine)@login00.darwin wps_job]$
Now you are ready to set up WRF.
Set up WRF
If you have not followed all the prior steps leading up to this point, YOU MUST go back and complete them: the files downloaded and processed in the WPS steps are required for the WRF steps to work. After you have completed the prior steps, you can change into the wrf_job directory.
[(it_css:traine)@login00.darwin wps_job]$ cd ../wrf_job
[(it_css:traine)@login00.darwin wrf_job]$ pwd
/work/it_css/sw/WRF/example_wrf/wrf_job
Now run the make command. This will copy all the required files needed to run WRF.
[(it_css:traine)@login00.darwin wrf_job]$ make
cp /work/it_css/sw/WRF/20210113/src/WRF/run/GENPARM.TBL GENPARM.TBL
cp /work/it_css/sw/WRF/20210113/src/WRF/run/HLC.TBL HLC.TBL
cp /work/it_css/sw/WRF/20210113/src/WRF/run/LANDUSE.TBL LANDUSE.TBL
cp /work/it_css/sw/WRF/20210113/src/WRF/run/MPTABLE.TBL MPTABLE.TBL
cp /work/it_css/sw/WRF/20210113/src/WRF/run/SOILPARM.TBL SOILPARM.TBL
cp /work/it_css/sw/WRF/20210113/src/WRF/run/URBPARM.TBL URBPARM.TBL
cp /work/it_css/sw/WRF/20210113/src/WRF/run/URBPARM_UZE.TBL URBPARM_UZE.TBL
cp /work/it_css/sw/WRF/20210113/src/WRF/run/VEGPARM.TBL VEGPARM.TBL
cp /work/it_css/sw/WRF/20210113/src/WRF/run/ozone.formatted ozone.formatted
cp /work/it_css/sw/WRF/20210113/src/WRF/run/ozone_lat.formatted ozone_lat.formatted
cp /work/it_css/sw/WRF/20210113/src/WRF/run/ozone_plev.formatted ozone_plev.formatted
cp /work/it_css/sw/WRF/20210113/src/WRF/run/RRTM_DATA RRTM_DATA
cp /work/it_css/sw/WRF/20210113/src/WRF/run/RRTM_DATA_DBL RRTM_DATA_DBL
cp /work/it_css/sw/WRF/20210113/src/WRF/run/RRTMG_LW_DATA RRTMG_LW_DATA
cp /work/it_css/sw/WRF/20210113/src/WRF/run/RRTMG_LW_DATA_DBL RRTMG_LW_DATA_DBL
cp /work/it_css/sw/WRF/20210113/src/WRF/run/RRTMG_SW_DATA RRTMG_SW_DATA
cp /work/it_css/sw/WRF/20210113/src/WRF/run/RRTMG_SW_DATA_DBL RRTMG_SW_DATA_DBL
Next copy the met_em files that were created in the WPS steps.
[(it_css:traine)@login00.darwin wrf_job]$ cp ../wps_job/met_em* .
Now you should be ready to run WRF.
Run WRF example
In this example, we are running the WRF real.exe and wrf.exe from WRF/run. These steps should be similar to the steps required to run WRF in WRF/test/em_real, but there might be some differences that are not covered in these directions.
First, changes need to be made to the namelist.input file using nano or vim. These changes should reflect the information from the model data that you downloaded earlier. See all lines marked with #EDIT THIS LINE and change accordingly.
&time_control
 run_days = 0,
 run_hours = 12,
 run_minutes = 0,
 run_seconds = 0,
 start_year = 2021, 2000, 2000,       #EDIT THIS LINE
 start_month = 01, 01, 01,            #EDIT THIS LINE
 start_day = 07, 24, 24,              #EDIT THIS LINE
 start_hour = 00, 12, 12,             #EDIT THIS LINE
 end_year = 2021, 2000, 2000,         #EDIT THIS LINE
 end_month = 01, 01, 01,              #EDIT THIS LINE
 end_day = 13, 25, 25,                #EDIT THIS LINE
 end_hour = 21, 12, 12,               #EDIT THIS LINE
 interval_seconds = 10800             #EDIT THIS LINE
 input_from_file = .true.,.true.,.true.,
 history_interval = 180, 60, 60,
 frames_per_outfile = 1000, 1000, 1000,
 restart = .false.,
 restart_interval = 7200,
 io_form_history = 2
 io_form_restart = 2
 io_form_input = 2
 io_form_boundary = 2
/

&domains
 time_step = 180,
 time_step_fract_num = 0,
 time_step_fract_den = 1,
 max_dom = 1,
 use_adaptive_time_step = .true.,
 e_we = 10, 112, 94,                  #EDIT THIS LINE
 e_sn = 10, 97, 91,                   #EDIT THIS LINE
 e_vert = 33, 33, 33,
 p_top_requested = 5000,
 num_metgrid_levels = 34,             #EDIT THIS LINE
 num_metgrid_soil_levels = 4,
 dx = 10000, 10000, 3333.33,          #EDIT THIS LINE
 dy = 10000, 10000, 3333.33,          #EDIT THIS LINE
 grid_id = 1, 2, 3,
 parent_id = 0, 1, 2,
 i_parent_start = 1, 31, 30,
 j_parent_start = 1, 17, 30,
 parent_grid_ratio = 1, 3, 3,
 parent_time_step_ratio = 1, 3, 3,
 feedback = 1,
 smooth_option = 0
/
To run WRF, we will use the wrfmpi.qs job script. This script is based on the template found in /opt/shared/templates/slurm/generic/mpi/openmpi/openmpi.qs. The openmpi.qs job script is updated regularly, so it is suggested that you check and compare the contents of the two files to see if there have been any major changes; if there have, you might want to consider adding those changes to the wrfmpi.qs job script. To run WRF, you will use sbatch to submit the job to Slurm.
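Before submitting, a simple way to compare your copy against the current template is with diff; only the two file paths mentioned above are assumed here:
[(it_css:traine)@login01 wrf_job]$ diff wrfmpi.qs /opt/shared/templates/slurm/generic/mpi/openmpi/openmpi.qs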
[(it_css:traine)@login01 wrf_job]$ sbatch wrfmpi.qs
After a couple of minutes this script should finish running. Once the job has completed, you can check the slurm-«JOB_ID».out file to see if the run was successful. If the run was successful, then the last few lines of the file should look similar to the lines shown below:
Timing for main (dt= 80.00): time 2021-01-07_11:58:40 on domain 1: 0.00617 elapsed seconds
Timing for main (dt= 80.00): time 2021-01-07_12:00:00 on domain 1: 0.00617 elapsed seconds
Timing for Writing wrfout_d01_2021-01-07_12:00:00 for domain 1: 0.01661 elapsed seconds
d01 2021-01-07_12:00:00 wrf: SUCCESS COMPLETE WRF
Finished Running mpi_wrf.exe
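If you would rather not read through the whole output file, a quick check such as the following (searching for the success message shown above) will also confirm the run completed:
[(it_css:traine)@login01 wrf_job]$ grep "SUCCESS COMPLETE WRF" slurm-*.out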
References
Here is a list of references that may help you resolve errors encountered while working with WRF.