The next commands load the required VALET packages and set the necessary environment variables.

Use the VALET command ''vpkg_require'' to load the appropriate version of the ''openmpi'' package.

<code bash>
</code>

Set the environment variables needed for compiling and installing the WRF packages.

<code bash>
[(it_css:traine)@login01 ~]$ WRF_PREFIX=$WORKDIR/sw/wrf/20210121 ## MAKE SURE THAT THIS IS CHANGED TO YOUR WORK GROUP DIRECTORY OR HOME DIRECTORY ##
[(it_css:traine)@login01 ~]$ mkdir -p "$WRF_PREFIX"
[(it_css:traine)@login01 ~]$ WRF_SRC="${WRF_PREFIX}/src"
</code>
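Before unpacking any sources, it can help to confirm the directories exist and are writable. The sketch below is safe to run anywhere because it substitutes a throwaway ''mktemp'' directory for your real workgroup path; on Caviness you would keep the ''WRF_PREFIX'' you set above.

<code bash>
# Sanity-check sketch: the mktemp directory stands in for your real
# $WORKDIR-based path; keep your actual WRF_PREFIX on the cluster.
WRF_PREFIX="$(mktemp -d)/sw/wrf/20210121"
mkdir -p "$WRF_PREFIX"
WRF_SRC="${WRF_PREFIX}/src"
mkdir -p "$WRF_SRC"
# both directories should now exist and be writable
for d in "$WRF_PREFIX" "$WRF_SRC"; do
  [ -d "$d" ] && [ -w "$d" ] && echo "ok: $d"
done
</code>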
==== Test library compatibility ====
Use the environment variable ''WRF_TESTS_SRC'' to change into the ''TESTS'' directory. Next, download the test files, compile them using the libraries you installed, and verify the results. Remember that all executables should typically be run on compute nodes. However, these tests require compiling, which must be done either on the login (head) node or by using ''salloc %%--%%partition=devel'' to run on compute nodes. Since these tests run quickly and use only minimal resources to verify the correct installation of the libraries, we will use the login (head) node for this demonstration.
<code bash>
[(it_css:traine)@login01 LIBRARIES]$ cd "$WRF_TESTS_SRC"
</code>
=== Configure WRF parallel build ===
Configure the parallel build. You will want to select option ''66'' and nesting option ''1''.
<code bash>
[(it_css:traine)@login01 WRF]$ ./clean -a
</code>
=== Apply required patch for OpenMPI flags ===
The patch command with the use of ''EOT'' does not display the bash prompt after it is entered, but instead ''>'', because it is waiting for the patch lines to be entered at the terminal until ''EOT'' is typed. **You may find it best to copy and paste the lines below to avoid making any typing mistakes, as spacing is very important and it must be typed exactly this way.**
<code bash>
[(it_css:traine)@login01 WRF]$ patch -p1 <<EOT
--- A/configure.wrf 2020-12-10 14:06:01.907649095 -0500
+++ B/configure.wrf 2020-12-10 14:40:00.791338460 -0500
@@ -118,8 +118,8 @@
 SFC = ifort
+DM_CC = mpicc
 FC = time \$(DM_FC)
 CC = \$(DM_CC) -DFSEEKO64_OK
 LD = \$(FC)
@@ -140,7 +140,7 @@
-FCBASEOPTS = \$(FCBASEOPTS_NO_G) \$(FCDEBUG)
+FCBASEOPTS = \$(FCBASEOPTS_NO_G) \$(FCDEBUG) -assume nocc_omp
 MODULE_SRCH_FLAG =
 TRADFLAG = -traditional-cpp
 CPP = /lib/cpp -P -nostdinc
EOT
</code>

<note important>
  * WPS, unlike WRF, only has a serial build and currently does not have a parallel build.
  * The compile steps can take quite a long time to complete. Use caution and make sure that you select the correct options as directed below.
</note>
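If the heredoc form of ''patch'' is unfamiliar, the toy example below demonstrates the same technique on a scratch file. Nothing here touches WRF; the file name and contents are made up for illustration. It uses a quoted ''EOT'' delimiter, so the ''$'' characters need no backslash escaping, unlike the unquoted heredoc above.

<code bash>
# Toy demonstration of applying a patch from a heredoc.
# Everything runs in a scratch directory; names are illustrative only.
workdir=$(mktemp -d)
cd "$workdir"
printf 'FCBASEOPTS = base\nTRADFLAG = -traditional-cpp\n' > configure.demo
# Quoted 'EOT' prevents shell expansion inside the heredoc body.
patch -p0 <<'EOT'
--- configure.demo
+++ configure.demo
@@ -1,2 +1,2 @@
-FCBASEOPTS = base
+FCBASEOPTS = base -assume nocc_omp
 TRADFLAG = -traditional-cpp
EOT
grep 'nocc_omp' configure.demo
</code>

The real patch above uses an unquoted ''EOT'', which is why its ''$(DM_FC)''-style variables must be written with backslashes.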
=== Adding WRF VALET package in HOME directory ===
<code bash>
[(it_css:traine)@login01 WRF]$ mkdir -p ~/.valet #This step is only required if you don't have a .valet directory already created
[(it_css:traine)@login01 WRF]$ cat <<EOT > ~/.valet/wrf.vpkg_yaml
#
</code>
<code bash>
[(it_css:traine)@login01 ~]$ cd /work/it_css/sw
[(it_css:traine)@login01 sw]$ mkdir -p valet #This step is only required if you don't have a valet directory already created
[(it_css:traine)@login01 sw]$ cat <<EOT > valet/wrf.vpkg_yaml
#
</code>

==== Example WPS run ====
To avoid running executables on the login node, we are going to request an interactive compute node session using ''salloc''.
<code bash>
[(it_css:traine)@login01 GFS]$ salloc --partition=it_css
</code>
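Before launching anything, you can verify that you are actually inside a Slurm allocation rather than still on the login node. This sketch only reads a standard Slurm environment variable (''SLURM_JOB_ID'' is set by ''salloc'' inside an allocation):

<code bash>
# Report whether we are inside a Slurm allocation.
in_allocation() {
  if [ -n "$SLURM_JOB_ID" ]; then
    echo "in allocation $SLURM_JOB_ID on $(hostname)"
  else
    echo "no allocation detected -- still on the login node?"
  fi
}
in_allocation
</code>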

Now we are ready to process the downloaded GFS data with the WPS scripts for our example. To do this, we will need to change directories into the ''wps_job'' directory, then use the ''make'' command to gather the required files that you will need to run WPS.
<code bash>
[triane@r00n49 GFS]$ cd ../../wps_job/
[triane@r00n49 wps_job]$ make
cp /work/it_css/sw/WRF/20210113/src/WPS/link_grib.csh link_grib.csh
</code>
After that is completed, you will need to update the ''namelist.wps'' file to reflect the model data that you downloaded in the prior steps. Use ''nano'' or ''vim'' and make the necessary changes noted with ''#EDIT THIS LINE'' and ''#ADD THIS LINE''.
<code bash>
&share
wrf_core = 'ARW',
max_dom = 1,
start_date = '2021-01-07_00:00:00','2006-08-16_12:00:00', #EDIT THIS LINE
end_date = '2021-01-23_23:00:00','2006-08-16_12:00:00', #EDIT THIS LINE
interval_seconds = 10800 #EDIT THIS LINE
io_form_geogrid = 2,
i_parent_start = 1, 31,
j_parent_start = 1, 17,
e_we = 10, 112, #EDIT THIS LINE
e_sn = 10, 97, #EDIT THIS LINE
...
geog_data_res = 'default','default',
truelat2 = 38.72, #EDIT THIS LINE
stand_lon = -75.08, #EDIT THIS LINE
geog_data_path = '../data/WPS_GEOG/', #EDIT THIS LINE
opt_geogrid_tbl_path = './' #OPTIONAL EDIT THIS LINE
...
&metgrid
fg_name = 'FILE'
io_form_metgrid = 2,
opt_metgrid_tbl_path = './' #ADD THIS LINE
/
</code>
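Rather than editing by hand, the date fields can also be updated with ''sed''. The sketch below works on a scratch copy with made-up contents, so the file location and dates are placeholders; adapt the substitutions to your own ''namelist.wps'' (GNU ''sed'' syntax assumed).

<code bash>
# Toy sketch: scripted edit of namelist date fields with GNU sed.
workdir=$(mktemp -d)
cat > "$workdir/namelist.wps" <<'EOT'
&share
 start_date = '2006-08-16_12:00:00',
 end_date = '2006-08-16_12:00:00',
/
EOT
# replace whatever dates are present with the dates for this example run
sed -i "s/start_date = '[^']*'/start_date = '2021-01-07_00:00:00'/" "$workdir/namelist.wps"
sed -i "s/end_date = '[^']*'/end_date = '2021-01-23_23:00:00'/" "$workdir/namelist.wps"
grep '_date' "$workdir/namelist.wps"
</code>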
<code bash>
[triane@r00n49 wps_job]$ ./link_grib.csh ../data/GFS/
</code>
This script will create soft links to the GFS data files in the current directory. You should see something like this:
<code bash>
[triane@r00n49 wps_job]$ ls -l GRIB*
lrwxrwxrwx 1 traine it_css 33 Jan 26 13:59 GRIBFILE.AAA -> ../data/GFS/gfs_20210107_00z_000f
lrwxrwxrwx 1 traine it_css 33 Jan 26 13:59 GRIBFILE.AAB -> ../data/GFS/gfs_20210107_00z_003f
lrwxrwxrwx 1 traine it_css 33 Jan 26 13:59 GRIBFILE.AAC -> ../data/GFS/gfs_20210107_00z_006f
lrwxrwxrwx 1 traine it_css 33 Jan 26 13:59 GRIBFILE.AAD -> ../data/GFS/gfs_20210107_00z_009f
lrwxrwxrwx 1 traine it_css 33 Jan 26 13:59 GRIBFILE.AAE -> ../data/GFS/gfs_20210107_00z_012f
lrwxrwxrwx 1 traine it_css 33 Jan 26 13:59 GRIBFILE.AAF -> ../data/GFS/gfs_20210107_00z_015f
lrwxrwxrwx 1 traine it_css 33 Jan 26 13:59 GRIBFILE.AAG -> ../data/GFS/gfs_20210107_00z_018f
lrwxrwxrwx 1 traine it_css 33 Jan 26 13:59 GRIBFILE.AAH -> ../data/GFS/gfs_20210107_00z_021f
...
</code>
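A quick way to confirm ''link_grib.csh'' picked up every file is to compare counts. The sketch below builds its own scratch data directory and mimics the linking, so all paths are stand-ins for the example's ''wps_job'' and ''../data/GFS'' directories; ''link_grib.csh'' itself names links ''AAA'', ''AAB'', and so on, while a numeric suffix is used here for simplicity.

<code bash>
# Toy sanity check: the number of GRIBFILE.* links should equal the
# number of source data files.
workdir=$(mktemp -d)
mkdir -p "$workdir/data" "$workdir/wps_job"
touch "$workdir/data/gfs_20210107_00z_000f" "$workdir/data/gfs_20210107_00z_003f"
cd "$workdir/wps_job"
n=0
for f in ../data/*; do
  ln -s "$f" "GRIBFILE.$n"
  n=$((n+1))
done
links=$(ls GRIBFILE.* | wc -l)
files=$(ls ../data | wc -l)
[ "$links" -eq "$files" ] && echo "link count matches: $links"
</code>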
<code bash>
[triane@r00n49 wps_job]$ ungrib.exe
</code>
Depending on the amount of data, this could take several minutes. If successful, you should get a message like this at the end.
<code bash>
... #MANY LINES & MINUTES LATER
</code>
The last step is to run ''metgrid.exe''.
<code bash>
[triane@r00n49 wps_job]$ metgrid.exe
... #MANY LINES & MINUTES LATER
Processing 2021-01-13_21
/work/it_css/sw/WRF/example_wrf/wrf_job
</code>
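The number of files metgrid writes per domain should match the number of analysis times implied by ''interval_seconds'' in ''namelist.wps''. As an arithmetic sketch (the dates are shortened to a single day for illustration, and GNU ''date'' is assumed):

<code bash>
# With a 10800 s (3-hour) interval, one full day spans 9 analysis times
# (00Z, 03Z, ..., 21Z, plus 00Z of the next day).
start_epoch=$(date -u -d '2021-01-07 00:00' +%s)
end_epoch=$(date -u -d '2021-01-08 00:00' +%s)
interval=10800
expected=$(( (end_epoch - start_epoch) / interval + 1 ))
echo "expected met_em files per domain: $expected"
</code>

Compare that number against the metgrid output in your ''wps_job'' directory (typically named ''met_em.d01.*'').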
Now run the ''make'' command. This will copy all the required files needed to run WRF.
<code bash>
[(it_css:traine)@login01 wrf_job]$ make
</code>
In this example, we are running the WRF ''real.exe'' and ''wrf.exe'' in ''WRF/run''. These steps should be similar to the steps required to run WRF in ''WRF/test/em_real'', but there might be some differences that are not covered in these directions.

First, changes need to be made to the ''namelist.input'' file by using ''nano'' or ''vim''. These changes should reflect the information from the model data that you downloaded earlier. See all lines marked with ''#EDIT THIS LINE'' and change accordingly.
<code bash>
&time_control
/
</code>
To run WRF, we will use the ''wrfmpi.qs'' file. This job script is based on the script template found in ''/opt/shared/templates/slurm/generic/mpi/openmpi/openmpi.qs''. The ''openmpi.qs'' job script file is updated regularly, so it is suggested that you also check and compare the contents of the ''openmpi.qs'' file to see if there have been any major changes to it. If there have been, then you might want to consider adding those changes to the ''wrfmpi.qs'' job script. To run WRF, use ''sbatch'' to submit the job to Slurm.
<code bash>
[(it_css:traine)@login01 wrf_job]$ sbatch wrfmpi.qs
</code>