technical:recipes:mpi4py-in-virtualenv

  
<WRAP center round info 60%>
On Caviness and DARWIN we would likely choose the Intel distribution, selecting a Python 3 version over Anaconda (e.g. ''vpkg_versions intel-oneapi'' or ''vpkg_versions intel-python''), since it automatically enables Intel's distribution channel.  That channel includes Numpy built against the Intel MKL library, for example, and other highly-optimized variants of computationally-intensive Python components.

As of November 2020, the majority of packages populating the Intel channel require baseline operating-system libraries (like ''glibc'') newer than what Farber provides:  a clear example of the binary-compatibility issues that are present in conda software distribution.
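If you want to confirm that the NumPy in your environment really is the MKL-backed build from the Intel channel, one quick check (a sketch, not an official procedure) is to scan NumPy's build-configuration dump for a mention of MKL:

<code python>
def uses_mkl(config_text):
    """Return True if a NumPy build-configuration dump mentions MKL."""
    return "mkl" in config_text.lower()

if __name__ == "__main__":
    import contextlib
    import io

    import numpy

    # numpy.show_config() prints its report to stdout; capture it as text.
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        numpy.show_config()
    print("NumPy built against MKL:", uses_mkl(buf.getvalue()))
</code>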
===== Caviness =====

The steps for completing this work on Caviness are similar to those presented for Farber, after first following the steps to [[technical:recipes:mpi4py-in-virtualenv#create-a-directory-hierarchy|create a directory hierarchy]].  We will instead use the Intel Python distribution:

<code bash>
===== DARWIN =====

The steps for completing this work on DARWIN are similar to those presented for Caviness, after first following the steps to [[technical:recipes:mpi4py-in-virtualenv#create-a-directory-hierarchy|create a directory hierarchy]].  We will instead use the Intel oneAPI Python distribution:

<code bash>
$ vpkg_require openmpi/5.0.2:intel-oneapi-2024 intel-oneapi/2024
Adding dependency `gcc/12.2.0` to your environment
Adding dependency `intel-oneapi/2024.0.1.46` to your environment
Adding dependency `ucx/1.13.1` to your environment
Adding package `openmpi/5.0.2:intel-oneapi-2024` to your environment
</code>
  
</code>
  
The ''--no-binary :all:'' flag prohibits the installation of any packages that include binary components, effectively forcing a rebuild of mpi4py from source.  The ''--compile'' flag pre-compiles all Python scripts in the mpi4py package (versus allowing them to be compiled and cached later).  The environment now includes support for mpi4py linked against the ''openmpi/5.0.2:intel-oneapi-2024'' library on DARWIN:
  
<code bash>
          success: 0
     versions:
          "20240307":
              description: environment built Mar 7, 2024
              dependencies:
                  - openmpi/5.0.2:intel-oneapi-2024
                  - intel-oneapi/2024
</code>
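To double-check which MPI library mpi4py was actually compiled against, you can ask mpi4py itself from within the activated environment.  This is a sketch; it assumes the virtualenv built above is active so that ''mpi4py'' is importable:

<code python>
def is_open_mpi(library_version):
    """Return True if an MPI_Get_library_version() string identifies Open MPI."""
    return library_version.lstrip().lower().startswith("open mpi")

if __name__ == "__main__":
    # Requires the virtualenv built above; importing mpi4py.MPI initializes MPI.
    from mpi4py import MPI

    version = MPI.Get_library_version()
    print(version.splitlines()[0])
    print("Linked against Open MPI:", is_open_mpi(version))
</code>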
  
[/home/1006/.valet/my-sci-app.vpkg_yaml]
my-sci-app  Some scientific app project in Python
20240307  environment built Mar 7, 2024
</code>
  
  
<code bash>
$ vpkg_require my-sci-app/20240307
Adding dependency `gcc/12.2.0` to your environment
Adding dependency `intel-oneapi/2024.0.1.46` to your environment
Adding dependency `ucx/1.13.1` to your environment
Adding dependency `openmpi/5.0.2:intel-oneapi-2024` to your environment
Adding package `my-sci-app/20240307` to your environment
(/home/1006/conda-envs/my-sci-app/20240307)$ which python3
~/conda-envs/my-sci-app/20240307/bin/python3
(/home/1006/conda-envs/my-sci-app/20240307)$ pip list | grep mpi4py
mpi4py             3.1.5
(/home/1006/conda-envs/my-sci-app/20240307)$ which mpirun
/opt/shared/openmpi/5.0.2-intel-oneapi-2024/bin/mpirun
</code>
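As a final sanity check, a small test script can exercise the rebuilt mpi4py under ''mpirun''.  This is a sketch; the script name and item count are arbitrary, and the pure-Python ''partition()'' helper simply splits a range of work items across ranks:

<code python>
def partition(n_items, n_ranks, rank):
    """Return the half-open (start, stop) range of items owned by this rank,
    spreading any remainder over the lowest-numbered ranks."""
    base, extra = divmod(n_items, n_ranks)
    start = rank * base + min(rank, extra)
    stop = start + base + (1 if rank < extra else 0)
    return start, stop

if __name__ == "__main__":
    from mpi4py import MPI  # requires the my-sci-app environment above

    comm = MPI.COMM_WORLD
    start, stop = partition(1000, comm.Get_size(), comm.Get_rank())
    print(f"rank {comm.Get_rank()} of {comm.Get_size()} owns items [{start}, {stop})")
</code>

Saved under a name of your choosing (e.g. ''test_mpi4py.py''), it would be launched with something like ''mpirun -np 4 python3 test_mpi4py.py'', with each rank printing its own slice of the work.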
  
  • Last modified: 2024-03-07 14:41
  • by anita