</code>
=== Configure WRF parallel build ===
Configure the parallel build. You will want to select option ''66'' and nesting option ''1''.
<code bash>
[(it_css:traine)@login01 WRF]$ ./clean -a
</code>
<code bash>
[(it_css:traine)@login01 WRF]$ patch -p1 <<EOT
--- A/configure.wrf 2020-12-10 14:06:01.907649095 -0500
+++ B/configure.wrf 2020-12-10 14:40:00.791338460 -0500
@@ -118,8 +118,8 @@
 SFC = ifort
+DM_CC = mpicc
 FC = time \$(DM_FC)
 CC = \$(DM_CC) -DFSEEKO64_OK
 LD = \$(FC)
@@ -140,7 +140,7 @@
-FCBASEOPTS = \$(FCBASEOPTS_NO_G) \$(FCDEBUG)
+FCBASEOPTS = \$(FCBASEOPTS_NO_G) \$(FCDEBUG) -assume nocc_omp
 MODULE_SRCH_FLAG =
 TRADFLAG = -traditional-cpp
 CPP = /lib/cpp -P -nostdinc
EOT
</code>
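Before rebuilding, it is worth confirming that the patch actually landed in ''configure.wrf''. A minimal sanity check (not part of the original recipe; the two strings come from the patch above):
<code bash>
# Each grep should print the patched line; a warning means the patch did not apply
grep 'DM_CC *= *mpicc' configure.wrf || echo 'WARNING: DM_CC change missing'
grep 'assume nocc_omp' configure.wrf || echo 'WARNING: nocc_omp flag missing'
</code>
If either line prints a warning, re-run the ''patch'' step before compiling.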
To run WRF, we will use the ''wrfmpi.qs'' job script, which is based on the template found in ''/opt/shared/templates/slurm/generic/mpi/openmpi/openmpi.qs''. The ''openmpi.qs'' template is updated regularly, so it is suggested that you periodically compare it against your copy to see if there have been any major changes; if so, consider carrying those changes over into your ''wrfmpi.qs'' job script. To run WRF, you will use ''sbatch'' to submit the job to Slurm.
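One way to do that comparison is a unified diff between the template and your copy; a minimal sketch (the template path is the one above; ''wrfmpi.qs'' is assumed to sit in your current job directory):
<code bash>
# Show what differs between the current template and your job script;
# no output means the two files are still in sync.
# diff exits non-zero when files differ, so ignore its exit status here.
diff -u /opt/shared/templates/slurm/generic/mpi/openmpi/openmpi.qs wrfmpi.qs || true
</code>
Lines prefixed with ''+'' exist only in your ''wrfmpi.qs''; lines prefixed with ''-'' exist only in the template and may be changes worth carrying over.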
<code bash>
[(it_css:traine)@login01 wrf_job]$ sbatch wrfmpi.qs