LS-DYNA is an explicit dynamics solver that simulates the extreme deformation response of materials and structures under short periods of severe loading. It is a commercial product, so a license must be purchased before it can be used on an IT RCI cluster. A floating (network-based) license is required for use of LS-DYNA on clusters; the user must coordinate with their IT support staff to have a license server present on the UD campus network.
LS-DYNA is distributed for Windows and Linux operating systems. Many build variants exist for each release, varying by:

- floating-point precision (single or double)
- parallelism model (SMP, MPP, or hybrid SMP+MPP)
- CPU instruction set (SSE2, AVX2, AVX-512)
- MPI library (for MPP and hybrid variants)
Determining which version(s) and variant(s) are needed depends on the hardware present and the computational work being done. As of mid-2024, the current version is R15.0.2, and the best baseline variants to install on Caviness/DARWIN are the hybrid AVX2 variants in both single- and double-precision numerics.
For this example, LS-DYNA will be installed in a workgroup's storage. The base path $WORKDIR will be referenced in this recipe, which implies that prior to installation the workgroup command was used to spawn a workgroup shell. The recipe assumes version R15.0.2 and the variants mentioned above.
The LS-DYNA software packages must be downloaded to the cluster. To facilitate that, a directory must be created to hold those files. Rather than type out the full path every time, the variable LS_DYNA_PREFIX will be set to the base directory containing LS-DYNA software:
$ LS_DYNA_PREFIX="${WORKDIR}/sw/ls-dyna"
$ mkdir -p "${LS_DYNA_PREFIX}/attic/15.0.2"
(The -p flag ensures that any directories in the path that do not exist are created.) The files can be downloaded on a personal computer and uploaded to the directory just created, but they can also be downloaded directly on the cluster:
$ wget --directory-prefix="${LS_DYNA_PREFIX}/attic/15.0.2" \
      --user=<USERNAME> --password=<PASSWORD> \
      '<DOWNLOAD-URL>'
The <USERNAME> and <PASSWORD> (as well as the download site URL) are provided when a license is purchased. The <DOWNLOAD-URL> is determined by browsing the download site and copying the link that would be clicked to download to a personal computer. For example, the link to download R15.0.2 for hybrid Open MPI with AVX2 instructions and single-precision numerics on Linux is
https://ftp.lstc.com/user/mpp-dyna/R15.0.2/linux/x86-64/ifort_190_avx2/HYB/ls-dyna_hyb_s_R15_0_2_x64_centos79_ifort190_avx2_openmpi405.tgz_extractor.sh
After downloading (or uploading) both packages, the attic directory contains:
$ ls -l "${LS_DYNA_PREFIX}/attic/15.0.2"
total 234420
-rw-r--r-- 1 user group 140454912 Apr  7 22:15 ls-dyna_hyb_d_R15_0_2_x64_centos79_ifort190_avx2_openmpi405.tgz_extractor.sh
-rw-r--r-- 1 user group  99661023 Apr  7 22:15 ls-dyna_hyb_s_R15_0_2_x64_centos79_ifort190_avx2_openmpi405.tgz_extractor.sh
In preparation for installing these packages, a hierarchy of directories must be created to hold the variants of R15.0.2. Even though just two specific variants will be installed, the entire hierarchy is created.
The directory name matches the version, 15.0.2, with subdirectories for each option:
$ mkdir -p "${LS_DYNA_PREFIX}/15.0.2/"{sse2,avx2,avx512}/{smp,hyb,mpp}/{double,single}
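If desired, the brace expansion can be rehearsed in a throwaway directory before touching the real ${LS_DYNA_PREFIX}. This illustrative snippet (the scratch location is an assumption, not part of the recipe) requires a bash shell:

```shell
# Illustrative only: rehearse the brace expansion in a scratch directory,
# not the real ${LS_DYNA_PREFIX}.
demo="$(mktemp -d)"
mkdir -p "${demo}/15.0.2/"{sse2,avx2,avx512}/{smp,hyb,mpp}/{double,single}
# 3 instruction sets x 3 parallelism models x 2 precisions = 18 leaf directories
find "${demo}/15.0.2" -mindepth 3 -type d | wc -l
rm -rf "${demo}"
```

The count confirms that all 18 variant directories were created by the single mkdir command.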
The single-precision variant is installed by changing to its subdirectory (created above) and running the appropriate installation extractor:
$ pushd "${LS_DYNA_PREFIX}/15.0.2/avx2/hyb/single"
$ sh "${LS_DYNA_PREFIX}/attic/15.0.2/ls-dyna_hyb_s_R15_0_2_x64_centos79_ifort190_avx2_openmpi405.tgz_extractor.sh"
There will be several files in the directory, including the LS-DYNA executable:
$ ls -l
total 154359
-rwxr-xr-x 1 user group  11402168 Mar 29 18:05 ansyscl
-rwxr-xr-x 1 user group 219543680 Mar 29 18:14 ls-dyna_hyb_s_R15_0_2_x64_centos79_ifort190_avx2_openmpi405
-rwxr-xr-x 1 user group    437864 Mar 29 18:14 ls-dyna_hyb_s_R15_0_2_x64_centos79_ifort190_avx2_openmpi405.l2a
Since the executable has a rather complicated name, a symlink is created so that the command ls-dyna maps to it:
$ ln -s ls-dyna_hyb_s_R15_0_2_x64_centos79_ifort190_avx2_openmpi405 ls-dyna
$ ls -l
total 154359
-rwxr-xr-x 1 user group  11402168 Mar 29 18:05 ansyscl
lrwxrwxrwx 1 user group        59 Jun  7 10:40 ls-dyna -> ls-dyna_hyb_s_R15_0_2_x64_centos79_ifort190_avx2_openmpi405
-rwxr-xr-x 1 user group 219543680 Mar 29 18:14 ls-dyna_hyb_s_R15_0_2_x64_centos79_ifort190_avx2_openmpi405
-rwxr-xr-x 1 user group    437864 Mar 29 18:14 ls-dyna_hyb_s_R15_0_2_x64_centos79_ifort190_avx2_openmpi405.l2a
$ popd
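Note that the symlink target is relative (no leading path), so the link continues to resolve even if the installation tree is later moved or copied. A small illustrative demonstration with a dummy file in a scratch directory:

```shell
# Illustrative only: a relative symlink survives relocation of its directory.
scratch="$(mktemp -d)"
mkdir "${scratch}/old"
touch "${scratch}/old/ls-dyna_hyb_s_R15_0_2_x64_centos79_ifort190_avx2_openmpi405"
( cd "${scratch}/old" && ln -s ls-dyna_hyb_s_R15_0_2_x64_centos79_ifort190_avx2_openmpi405 ls-dyna )
mv "${scratch}/old" "${scratch}/new"   # relocate the whole directory
[ -e "${scratch}/new/ls-dyna" ] && echo "link still resolves"
rm -rf "${scratch}"
```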
This variant of LS-DYNA R15.0.2 is now installed, but there are underlying prerequisites to its usability: the Open MPI library on which it was built must also be made available. That step will be covered in a subsequent section.
For each additional variant of R15.0.2, the steps above are repeated with the appropriate modifications to the paths and file names. All hybrid and MPP variants will require the availability of their underlying MPI library.
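The appropriate install subdirectory can be derived mechanically from the extractor file name. The following helper function is purely illustrative (it is not part of LS-DYNA or VALET) and assumes the naming convention seen in the file names above:

```shell
# Illustrative helper (an assumption, not an LSTC tool): map an extractor
# file name to the matching subdirectory under ${LS_DYNA_PREFIX}/15.0.2.
variant_dir() {
    local name="$1" par prec isa
    case "$name" in *_hyb_*) par=hyb;; *_mpp_*) par=mpp;; *_smp_*) par=smp;; esac
    case "$name" in *_s_R*) prec=single;; *_d_R*) prec=double;; esac
    case "$name" in *avx512*) isa=avx512;; *avx2*) isa=avx2;; *) isa=sse2;; esac
    printf '%s/%s/%s\n' "$isa" "$par" "$prec"
}

variant_dir ls-dyna_hyb_s_R15_0_2_x64_centos79_ifort190_avx2_openmpi405.tgz_extractor.sh
# -> avx2/hyb/single
```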
As mentioned, the variants installed in this recipe are built atop Open MPI. As the file names indicate (and as documented in the R15.0.2 release notes), Open MPI 4.0.5 and the Intel 2019 compiler were used to build the two variants in question. Caviness/DARWIN have Intel 2019 compilers present, but the exact release of Open MPI built with that compiler may not be. An instance of Open MPI 4.0.5 compiled with Intel 2019 should therefore be built for the workgroup's use.
The procedure is presented as succinctly as possible and starts with setup of the directories and unpacking of the source code:
$ mkdir -p "${WORKDIR}/sw/openmpi/attic"
$ wget --directory-prefix="${WORKDIR}/sw/openmpi/attic" \
      https://download.open-mpi.org/release/open-mpi/v4.0/openmpi-4.0.5.tar.bz2
$ mkdir -p "${WORKDIR}/sw/openmpi/src"
$ pushd "${WORKDIR}/sw/openmpi/src"
$ tar -xf "${WORKDIR}/sw/openmpi/attic/openmpi-4.0.5.tar.bz2"
$ cd openmpi-4.0.5
$ mkdir build-intel19-ls-dyna
$ cd build-intel19-ls-dyna
$ vpkg_rollback
For Caviness, the OFI libfabric communications library is used:
$ vpkg_devrequire intel/2019 libfabric/1.17.1
$ ../configure --prefix="${WORKDIR}/sw/openmpi/4.0.5-intel19-ls-dyna" \
      --with-lustre --with-io-romio-flags="--with-file-system=ufs+nfs+lustre" \
      --with-libfabric="$LIBFABRIC_PREFIX" \
      CC=icc CXX=icpc FC=ifort
versus the OS-provided UCX library used on DARWIN:
$ vpkg_devrequire intel/2019
$ ../configure --prefix="${WORKDIR}/sw/openmpi/4.0.5-intel19-ls-dyna" \
      --with-hwloc=/opt/shared/hwloc/2 \
      --with-pmix=/opt/shared/pmix/3 \
      --with-libevent=/usr \
      --with-lustre \
      --with-io-romio-flags="--with-file-system=ufs+nfs+lustre" \
      CC=icc CXX=icpc FC=ifort
If the configuration is successful, the software can be built and installed (this will take some time):
$ make
$ make install
$ vpkg_rollback
$ popd
Local configuration details for Open MPI on these clusters are typically applied via the Open MPI configuration files. A copy of the nearest-matching version maintained by IT RCI should be made. For example, on Caviness:
$ cp -f /opt/shared/openmpi/4.0.3/etc/{openmpi,pmix}-mca-params.conf \
      "${WORKDIR}/sw/openmpi/4.0.5-intel19-ls-dyna/etc"
versus on DARWIN:
$ cp -f /opt/shared/openmpi/4.0.5/etc/{openmpi,pmix}-mca-params.conf \
      "${WORKDIR}/sw/openmpi/4.0.5-intel19-ls-dyna/etc"
To make this version of Open MPI available to your LS-DYNA installs, a VALET package definition file should be created. The text <OPENMPI-INSTALL-DIR> should be replaced with the base installation path used above (e.g. /work/group/sw/openmpi).
If the workgroup already has an existing ${WORKDIR}/sw/valet/openmpi.vpkg_yaml file, the new version should be added to that file: do not simply overwrite the file with the content shown below!
For the configuration made on Caviness above:
openmpi:
    description: "Open MPI: Message-Passing Interface"
    url: http://www.open-mpi.org/
    prefix: <OPENMPI-INSTALL-DIR>
    versions:
        "4.0.5:intel19,ls-dyna":
            description: Open MPI 4.0.5 for LS-DYNA, Intel 2019
            dependencies:
                - intel/2019
                - libfabric/1.17.1
On DARWIN, the file is slightly different:
openmpi:
    description: "Open MPI: Message-Passing Interface"
    url: http://www.open-mpi.org/
    prefix: <OPENMPI-INSTALL-DIR>
    versions:
        "4.0.5:intel19,ls-dyna":
            description: Open MPI 4.0.5 for LS-DYNA, Intel 2019
            dependencies:
                - intel/2019
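When the workgroup's openmpi.vpkg_yaml already defines other versions, the new stanza is merged under the existing versions: key rather than replacing the file. A hedged sketch of the merged result, using the Caviness form (the pre-existing 4.1.4:intel-2020 entry is hypothetical, shown only to illustrate the merge):

```yaml
openmpi:
    description: "Open MPI: Message-Passing Interface"
    url: http://www.open-mpi.org/
    prefix: <OPENMPI-INSTALL-DIR>
    versions:
        # Hypothetical pre-existing version, left untouched:
        "4.1.4:intel-2020":
            description: Open MPI 4.1.4 built with Intel 2020
            dependencies:
                - intel/2020
        # Newly-added version for LS-DYNA:
        "4.0.5:intel19,ls-dyna":
            description: Open MPI 4.0.5 for LS-DYNA, Intel 2019
            dependencies:
                - intel/2019
                - libfabric/1.17.1
```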
With the necessary dependencies satisfied, a VALET package definition for LS-DYNA can be created. Assuming this is the first time LS-DYNA is being installed, the file will need to be created as it appears below; if versions/variants are added in the future, that file should be augmented with just the new version/variant information. The <LS_DYNA_PREFIX> text must be replaced in the package definition with the value of that environment variable that was assigned at the beginning of this recipe:
$ echo $LS_DYNA_PREFIX
/work/group/sw/ls-dyna
The <LSTC-LICENSE-SERVER-NAME> text must be replaced with the IP address or DNS name of the computer that services the LSTC LS-DYNA license that was purchased.
ls-dyna:
    prefix: <LS_DYNA_PREFIX>
    url: http://www.lstc.com/
    description: general-purpose finite element program simulating complex real world problems
    actions:
        - variable: LSTC_LICENSE
          action: set
          value: network
        - variable: LSTC_LICENSE_SERVER
          action: set
          value: <LSTC-LICENSE-SERVER-NAME>
        - variable: LSTC_INTERNAL_CLIENT
          action: set
          value: off
        - variable: LSTC_ROOT
          action: set
          value: ${VALET_PATH_PREFIX}
        - bindir: ${VALET_PATH_PREFIX}
    default-version: "15.0.2:double,hybrid,avx2"
    versions:
        "15.0.2:single,hybrid,avx2":
            description: 15.0.2, single-precision, Hybrid (SMP+MPP), AVX2
            prefix: 15.0.2/avx2/hyb/single
            dependencies:
                - openmpi/4.0.5:intel19,ls-dyna
        "15.0.2:double,hybrid,avx2":
            description: 15.0.2, double-precision, Hybrid (SMP+MPP), AVX2
            prefix: 15.0.2/avx2/hyb/double
            dependencies:
                - openmpi/4.0.5:intel19,ls-dyna
Loading one of these packages into the shell environment makes the ls-dyna command (the symlink created during install) available:
$ vpkg_require ls-dyna/single,hybrid,avx2
Adding dependency `intel/2019u5` to your environment
Adding dependency `libfabric/1.17.1` to your environment
Adding dependency `openmpi/4.0.5:intel19,ls-dyna` to your environment
Adding package `ls-dyna/15.0.2:single,avx2,hybrid` to your environment
$ which ls-dyna
/work/group/sw/ls-dyna/15.0.2/avx2/hyb/single/ls-dyna
On the Caviness and DARWIN clusters, LS-DYNA computations must be submitted as Slurm jobs. The variants installed here use Open MPI, so the job script template at /opt/templates/slurm/generic/mpi/openmpi/openmpi.qs is a starting point for LS-DYNA jobs. As in all cases for parallel jobs, follow the comments in the script header to determine which flags to alter and how to do so. The script template must be altered to add the desired version/variant of LS-DYNA to the job environment:
#
# [EDIT] Do any pre-processing, staging, environment setup with VALET
#        or explicit changes to PATH, LD_LIBRARY_PATH, etc.
#
vpkg_require ls-dyna/15.0.2:single,hybrid,avx2
Toward the end of the job script template, the LS-DYNA program is run:
#
# [EDIT] Execute your MPI program
#
${UD_MPIRUN} ls-dyna i=my_model.k
mpi_rc=$?