These directions are loosely based on the directions from NCAR's [[https://www2.mmm.ucar.edu/wrf/OnLineTutorial/compilation_tutorial.php|"How to compile WRF: The Complete Process"]] page. Our directions differ from NCAR's in order to get better performance for WRF on DARWIN. The main difference is the MPI library: WRF defaults to MPICH, whereas Open MPI is the MPI library tightly integrated with Slurm on DARWIN. These directions will take you through the steps of installing WRF to use Open MPI instead of MPICH for parallel processing.
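As an illustration of what that tight integration buys you (this transcript is a sketch, not part of the original directions): a Slurm-aware Open MPI reads the node and core layout straight from the job allocation, so no ''-np'' count or hostfile has to be maintained by hand.
<code bash>
  # Inside an sbatch job or salloc session, a Slurm-aware Open MPI
  # discovers the allocated nodes and cores on its own:
  [(it_css:traine)@r1n00 wrf_job]$ mpirun ./wrf.exe      # ranks = allocated tasks
</code>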
  
  
  
  
<code bash>
  [(it_css:traine)@login00.darwin ~]$ WRF_PREFIX=$WORKDIR/sw/wrf/20210121   ## MAKE SURE THAT THIS IS CHANGED TO YOUR WORK GROUP DIRECTORY OR HOME DIRECTORY ##
  [(it_css:traine)@login00.darwin ~]$ mkdir -p "$WRF_PREFIX"
  [(it_css:traine)@login00.darwin ~]$ WRF_SRC="${WRF_PREFIX}/src"
</code bash>
</code>
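The ''$WORKDIR'' variable above is only defined once you have spawned a workgroup shell. A quick sanity check before creating the directory tree (the workgroup name here is an example; substitute your own):
<code bash>
  [traine@login00.darwin ~]$ workgroup -g it_css
  [(it_css:traine)@login00.darwin ~]$ echo "$WORKDIR"
  /work/it_css
</code>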
=== Install netcdf library ===
<code bash>
  [(it_css:traine)@login00.darwin LIBRARIES]$ tar -xf netcdf-4.1.3.tar.gz
  [(it_css:traine)@login00.darwin LIBRARIES]$ cd netcdf-4.1.3
  [(it_css:traine)@login00.darwin netcdf-4.1.3]$ ./configure --prefix="$WRF_PREFIX" --disable-dap --disable-netcdf-4 --disable-shared
</code>
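WRF's and WPS's configure scripts locate this netCDF through the ''NETCDF'' environment variable. If your walk-through has not already exported it, a sketch of the check would be:
<code bash>
  # NETCDF must point at the prefix that holds lib/ and include/ for netCDF
  [(it_css:traine)@login00.darwin netcdf-4.1.3]$ export NETCDF="$WRF_PREFIX"
  [(it_css:traine)@login00.darwin netcdf-4.1.3]$ ls "$NETCDF"/lib/libnetcdf.a "$NETCDF"/include/netcdf.inc
</code>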
=== Install zlib library ===
<code bash>
  [(it_css:traine)@login00.darwin LIBRARIES]$ tar -xf zlib-1.2.7.tar.gz
  [(it_css:traine)@login00.darwin LIBRARIES]$ cd zlib-1.2.7
  [(it_css:traine)@login00.darwin zlib-1.2.7]$ ./configure --prefix="$WRF_PREFIX"
</code>
=== Install libpng library ===
<code bash>
  [(it_css:traine)@login00.darwin LIBRARIES]$ tar -xf libpng-1.2.50.tar.gz
  [(it_css:traine)@login00.darwin LIBRARIES]$ cd libpng-1.2.50
  [(it_css:traine)@login00.darwin libpng-1.2.50]$ ./configure --prefix="$WRF_PREFIX"
</code>
=== Install jasper library ===
<code bash>
  [(it_css:traine)@login00.darwin LIBRARIES]$ tar -xf jasper-1.900.1.tar.gz
  [(it_css:traine)@login00.darwin LIBRARIES]$ cd jasper-1.900.1
  [(it_css:traine)@login00.darwin jasper-1.900.1]$ ./configure --prefix="$WRF_PREFIX"
</code>
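WPS's configure finds GRIB2 support through the ''JASPERLIB'' and ''JASPERINC'' variables. Since all four libraries were installed under the same prefix, a sketch of the exports would be:
<code bash>
  [(it_css:traine)@login00.darwin LIBRARIES]$ export JASPERLIB="$WRF_PREFIX/lib"
  [(it_css:traine)@login00.darwin LIBRARIES]$ export JASPERINC="$WRF_PREFIX/include"
</code>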
=== Configure WRF parallel build ===
Configure the parallel build. When prompted, select option ''66'' and nesting option ''1'' (basic nesting).
<code bash>
  [(it_css:traine)@login00.darwin WRF]$ ./clean -a
</code>
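For reference, the configure interaction looks roughly like the following. The exact option-number range changes between WRF releases, so read the menu rather than trusting the numbers blindly:
<code bash>
  [(it_css:traine)@login00.darwin WRF]$ ./configure
  ...
  Enter selection [1-75] : 66
  Compile for nesting? (1=basic, 2=preset moves, 3=vortex following) [default 1]: 1
</code>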
Next, patch ''configure.wrf'' so the build uses the Open MPI compiler wrappers and the adjusted Fortran flags:
<code bash>
  [(it_css:traine)@login00.darwin WRF]$ patch -p1 <<EOT
  --- A/configure.wrf 2020-12-10 14:06:01.907649095 -0500
  +++ B/configure.wrf 2020-12-10 14:40:00.791338460 -0500
  @@ -118,8 +118,8 @@
   SFC                   ifort
  +DM_CC                 mpicc
   FC              =       time \$(DM_FC)
   CC              =       \$(DM_CC) -DFSEEKO64_OK
   LD              =       \$(FC)
  @@ -140,7 +140,7 @@
   BYTESWAPIO      =       -convert big_endian
   RECORDLENGTH    =       -assume byterecl
   FCBASEOPTS_NO_G =       -ip -fp-model precise -w -ftz -align all -fno-alias \$(FORMAT_FREE) \$(BYTESWAPIO) -xHost -fp-model fast=2 -no-heap-arrays -no-prec-div -no-prec-sqrt -fno-common -mavx2
  -FCBASEOPTS      =       \$(FCBASEOPTS_NO_G) \$(FCDEBUG)
  +FCBASEOPTS      =       \$(FCBASEOPTS_NO_G) \$(FCDEBUG) -assume nocc_omp
   MODULE_SRCH_FLAG =
   TRADFLAG        =      -traditional-cpp
   CPP                  /lib/cpp -P -nostdinc
  EOT
</code>
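After the patch applies cleanly, a quick check (not in the original directions) that ''configure.wrf'' now points at the Open MPI compiler wrappers:
<code bash>
  # Both DM_FC and DM_CC should now be the plain Open MPI wrappers (mpif90/mpicc)
  [(it_css:traine)@login00.darwin WRF]$ grep -E '^ *DM_(FC|CC)' configure.wrf
</code>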
As mentioned at the top of this page, these directions are based on user ''traine'' in workgroup ''it_css''. In these next directions we are going to add the WRF VALET package in the ''it_css'' workgroup directory. The location of the VALET directory might differ in your workgroup, or you might not yet have a VALET directory in your workgroup. If you do not have a VALET directory, we suggest that you follow the directory structure we use in the ''it_css'' workgroup. Not only will it make it easier to follow these directions, but it will also make it easy to add more software packages and versions in the future.
<code bash>
  [(it_css:traine)@login00.darwin ~]$ cd $WORKDIR/sw
  [(it_css:traine)@login00.darwin sw]$ mkdir -p valet           # This step is only required if you don't have a valet directory already created
  [(it_css:traine)@login00.darwin sw]$ cat <<EOT > valet/wrf.vpkg_yaml
              description: 2020-12-10 build with Intel compilers and Open MPI
              dependencies:
                  - openmpi/4.0.5:intel-2020
  
  EOT
</code>
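Assuming the package id in the YAML file is ''wrf'' (the header lines of the file are not reproduced above), you can confirm that VALET picks up the new definition and then load it:
<code bash>
  [(it_css:traine)@login00.darwin sw]$ vpkg_versions wrf
  [(it_css:traine)@login00.darwin sw]$ vpkg_require wrf/20210121
</code>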
By default WRF and WPS are set up to run in their respective ''SRC'' directories. This is not the best idea: it can cause serious problems for the WRF installation and also block the WRF software from being run by multiple people at the same time. The following steps will show you how to set up a safe environment to run WRF and WPS from a directory outside the ''SRC'' directories.
==== Getting the example_wrf.tar file ====
By default WPS/WRF is set up to run from its installed ''SRC'' directory. This is not ideal; running WPS/WRF from another directory outside the install directory is much safer and helps protect against accidental modification or deletion of the originally installed source files. To set up WPS and WRF to run their respective programs outside their installed ''SRC'' directory, we have created an example to test called ''example_wrf.tar'', available in ''trainf'''s home directory, to copy into the directory in which you plan to download the data files and run all the WRF programs. For this demo, the ''example_wrf.tar'' file will be copied into ''$WORKDIR/traine/wrf''.
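The exact copy command depends on where you want the example to live. Assuming the tar file is readable from ''trainf'''s home directory and the destination named above, it might look like this:
<code bash>
  [(it_css:traine)@login00.darwin ~]$ mkdir -p $WORKDIR/traine/wrf
  [(it_css:traine)@login00.darwin ~]$ cp ~trainf/example_wrf.tar $WORKDIR/traine/wrf/
  [(it_css:traine)@login00.darwin ~]$ cd $WORKDIR/traine/wrf
</code>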
  
Now that you have the ''example_wrf.tar'' file where you want it, we can extract it and start to set up the WPS programs.
<code bash>
  [(it_css:traine)@login00.darwin wrf]$ tar -xvf example_wrf.tar
  example_wrf/
  example_wrf/wrf_job/
  ...
</code>
Now you need to download the **Static Geography Data** and uncompress it using the ''wps_geo_download.sh'' script provided in this example. This is a fairly large set of files (~29GB) and could take some time to download and uncompress.
<code bash>
  [(it_css:traine)@login00.darwin wrf]$ cd example_wrf/data
  [(it_css:traine)@login00.darwin data]$ ./wps_geo_download.sh
  ...MANY LINES AND MINUTES LATER...
</code>
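The mandatory dataset from NCAR normally unpacks into a directory named ''WPS_GEOG'' (an assumption; the download script may use a different name). Checking its size is an easy way to spot a truncated download:
<code bash>
  [(it_css:traine)@login00.darwin data]$ du -sh WPS_GEOG    # expect roughly 29GB once unpacked
</code>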
If this is successful, we suggest removing the ''geog_high_res_mandatory.tar'' file.
<code bash>
  [(it_css:traine)@login00.darwin data]$ rm geog_high_res_mandatory.tar
</code>
  
Now we will download some GFS model data. You will need to update the ''modelDataDownload.sh'' script provided in the example to the appropriate dates for the data you want from NOAA's [[ftp://ftpprd.ncep.noaa.gov/pub/data/nccf/com/gfs/prod/|FTP site]]. The directory should be formatted something like ''gfs.YYYYMMDD''. In this example, we will be using ''gfs.20210107''.
<code bash>
  [(it_css:traine)@login00.darwin data]$ cd GFS
</code>
In the ''modelDataDownload.sh'' file you will want to update the ''yyyy'', ''mm'', ''dd'', ''hh'', ''fff'', and ''model'' values to match the date and model data that you are trying to download. This example used ''01/07/2021'' for the ''00Z'' run of the GFS model. You will also need to put your email address in the script.
<note important>This data is no longer available on NOAA's FTP website, so the example below will not work if you copy it directly and do not change the dates.</note>
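The script's internals are not reproduced here, but based on the variables named above, the edited values for this example would be along these lines (the variable syntax and the email usage are assumptions):
<code bash>
  # Date/model selection for the 2021-01-07 00Z GFS run (illustrative values)
  yyyy=2021
  mm=01
  dd=07
  hh=00
  fff=384                       # last forecast hour to fetch (assumed)
  model=gfs
  email=your_email@example.com  # assumed: identifies you to the download server
</code>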
  
After making the necessary changes, you can run the ''modelDataDownload.sh'' script. To do this you will want to use ''sbatch'' and the ''download.qs'' script. The ''download.qs'' script runs ''modelDataDownload.sh'' on a compute node to download all the requested model data.
  
<code bash>
  [(it_css:traine)@login00.darwin GFS]$ sbatch download.qs
  [(it_css:traine)@login00.darwin GFS]$ ls
  gfs_20210107_00z_000f  gfs_20210107_00z_078f  gfs_20210107_00z_156f  gfs_20210107_00z_234f  gfs_20210107_00z_312f
  gfs_20210107_00z_003f  gfs_20210107_00z_081f  gfs_20210107_00z_159f  gfs_20210107_00z_237f  gfs_20210107_00z_315f
  ...
</code>
To prevent an issue later down the road, we will need to move the ''modelDataDownload.sh'' and ''download.qs'' scripts up one directory into the ''data'' directory.
<code bash>
  [(it_css:traine)@login00.darwin GFS]$ mv modelDataDownload.sh download.qs ../
</code>
Now we have the required geography files and sample GFS data downloaded to run WPS and WRF.
==== Set up WPS ====
To avoid running executables on the login node, we are going to request an interactive compute node session using ''salloc''.
<code bash>
  [(it_css:traine)@login00.darwin GFS]$ salloc --partition=it_css
  salloc: Granted job allocation 11715984
  salloc: Waiting for resource configuration
  ...
</code>
Now we are ready to process the downloaded GFS data with the WPS scripts for our example. To do this we will need to change directories into the ''wps_job'' directory, then use the ''make'' command to gather the required files that you will need to run WPS.
<code bash>
  [(it_css:traine)@r1n00 GFS]$ cd ../../wps_job/
  [(it_css:traine)@r1n00 wps_job]$ make
  cp /work/it_css/sw/WRF/20210113/src/WPS/geogrid/GEOGRID.TBL GEOGRID.TBL
  cp /work/it_css/sw/WRF/20210113/src/WPS/metgrid/METGRID.TBL METGRID.TBL
  ...
</code>
Next run the ''link_grib.csh'' script and make sure to pass the correct path to where you are storing the downloaded GFS data.
<code bash>
  [(it_css:traine)@r1n00 wps_job]$ ./link_grib.csh ../data/GFS/
</code>
This script will create soft links to the GFS data files in the current directory. You should see something like this:
<code bash>
  [(it_css:traine)@r1n00 wps_job]$ ls -l GRIB*
  lrwxrwxrwx 1 traine it_css 33 Jan 26 13:59 GRIBFILE.AAA -> ../data/GFS/gfs_20210107_00z_000f
  lrwxrwxrwx 1 traine it_css 33 Jan 26 13:59 GRIBFILE.AAB -> ../data/GFS/gfs_20210107_00z_003f
  ...
</code>
The last step is to run ''metgrid.exe''.
<code bash>
  [(it_css:traine)@r1n00 wps_job]$ metgrid.exe
  ... #MANY LINES & MINUTES LATER
  Processing 2021-01-13_21
  ...
</code>
Now you can end the ''salloc'' session and release the compute node.
<code bash>
  [(it_css:traine)@r1n00 wps_job]$ exit
  exit
  [(it_css:traine)@login00.darwin wps_job]$
</code>
Now you are ready to set up WRF.
==== Set up WRF ====
If you have not followed all the prior steps leading up to this point, **YOU MUST go back and complete them**. The WPS steps to download and process files are required for the WRF steps to work. After you have completed the prior steps, you can go into the ''wrf_job'' directory.
<code bash>
  [(it_css:traine)@login00.darwin wps_job]$ cd ../wrf_job
  [(it_css:traine)@login00.darwin wrf_job]$ pwd
  /work/it_css/sw/WRF/example_wrf/wrf_job
</code>
Now run the ''make'' command. This will copy all the files required to run WRF.
<code bash>
  [(it_css:traine)@login00.darwin wrf_job]$ make
  cp /work/it_css/sw/WRF/20210113/src/WRF/run/GENPARM.TBL GENPARM.TBL
  cp /work/it_css/sw/WRF/20210113/src/WRF/run/HLC.TBL HLC.TBL
  ...
</code>
Next copy the ''met_em'' files that were created in the WPS steps.
<code bash>
  [(it_css:traine)@login00.darwin wrf_job]$ cp ../wps_job/met_em* .
</code>
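A quick way to confirm the copy picked everything up; there should be one ''met_em'' file per domain per metgrid output time:
<code bash>
  [(it_css:traine)@login00.darwin wrf_job]$ ls met_em.d01.* | wc -l
</code>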
Now you should be ready to run WRF.
To run WRF, we will use the ''wrfmpi.qs'' file. This job script is based on the script template found in ''/opt/shared/templates/slurm/generic/mpi/openmpi/openmpi.qs''. The ''openmpi.qs'' job script file is updated regularly, so it is suggested that you check and compare the contents of the ''openmpi.qs'' file to see if there have been any major changes (a quick ''diff'', shown below, makes this easy); if there have been, consider adding those changes to the ''wrfmpi.qs'' job script. To run WRF, use ''sbatch'' to submit the job to Slurm.
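One low-effort way to spot drift between your job script and the maintained template:
<code bash>
  [(it_css:traine)@login00.darwin wrf_job]$ diff wrfmpi.qs /opt/shared/templates/slurm/generic/mpi/openmpi/openmpi.qs
</code>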
<code bash>
  [(it_css:traine)@login00.darwin wrf_job]$ sbatch wrfmpi.qs
</code>