Modules and applications
On the cluster, we provide many centrally installed software packages, for some of them even in multiple versions. To configure the environment for a particular software version, we use modules. A module configures your current computing environment (PATH, LD_LIBRARY_PATH, MANPATH, etc.) to make sure all required binaries and libraries are found. We employ two types of Modules packages on the cluster:
- Env Modules (used by the old software stack)
- LMOD Modules (used by the new software stack)
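For example, loading a module adjusts your environment so that the module's binaries are found first. A minimal illustration (the resolved path depends on the installation):

$ module load python/3.8.5
$ which python    # now resolves to the Python binary provided by the module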
Modules commands
Here are the most frequently used module commands.
List all modules that match the given module name, e.g., list all available Python versions
$ module avail python
Load modules, e.g., load GCC 6.3.0 and Python 3.8.5
$ module load gcc/6.3.0 python/3.8.5
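To verify that the expected versions are active after loading, you can query the tools directly:

$ gcc --version
$ python --version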
List all currently loaded modules
$ module list

Currently Loaded Modules:
  1) StdEnv   2) gcc/6.3.0   3) openblas/0.2.20   4) python/3.8.5
Other commonly used module commands
module                 # get info about module sub-commands
module avail           # list all modules available on the cluster
module key keyword     # list all modules whose description contains keyword
module help name       # get information about module name
module show name       # show what module name does (without loading it)
module unload name     # unload module name
module purge           # unload all modules at once
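As a sketch, these sub-commands are typically combined when exploring software. FFTW is used here purely as an illustration; check module avail for the modules and versions actually installed:

$ module key fft      # search module descriptions for "fft"
$ module show fftw    # inspect what the module would change, without loading it
$ module load fftw
$ module unload fftw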
Where are applications installed?
Centrally installed
Central applications are installed in /cluster/apps. Applications that are needed by many users, such as compilers and libraries, should be installed centrally.

In $HOME
Users can install additional applications in their home directory, as long as the quotas (space: 16 GB; files/directories: 100'000) are not exceeded, e.g.:

$ pip install --user packagename
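Note that pip --user installs into ~/.local by default, so executables end up in ~/.local/bin. If that directory is not already on your PATH, you can add it (a common convention, not specific to the cluster):

$ export PATH=$HOME/.local/bin:$PATH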
Two software stacks on Euler
Old software stack with Env Modules
Upon login on Euler, the old software stack is set by default. Switch from the old to the new software stack:

$ env2lmod

New software stack with LMOD Modules
Switch from the new to the old software stack:

$ lmod2env
You can set the default software stack to the new one with the command:
$ set_software_stack.sh new
Please note that after setting the default software stack, you need to log out and log in again for the change to take effect.
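If you later want to switch your default back, the same script presumably accepts the old stack as an argument (an assumption; check the script's usage message):

$ set_software_stack.sh old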
All new software is installed exclusively in the new software stack, mostly with the SPACK package manager.
The structure of LMOD Modules
LMOD Modules use a hierarchy of modules with three layers (core, compiler-dependent, and MPI-dependent modules) to avoid conflicts when multiple modules are loaded at the same time.

Core modules do not depend on a particular compiler:

$ module load comsol/5.6

Compiler-dependent modules become available once a compiler is loaded:

$ module load gcc/6.3.0 hdf5/1.10.1

Loading a compiler and an MPI library in addition gives access to the deepest layer:

$ module load gcc/6.3.0 openmpi/3.0.1 openblas
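The hierarchy also affects what module avail shows: modules in a deeper layer only appear after their prerequisites are loaded. A minimal sketch:

$ module avail hdf5    # may show nothing before a compiler is loaded
$ module load gcc/6.3.0
$ module avail hdf5    # now lists the HDF5 builds for gcc/6.3.0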
There are four main toolchains.
These compilers can be combined with OpenMPI 3.0.1 or 4.0.2 and OpenBLAS.
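As an illustration, loading one such combination and checking the MPI compiler wrapper (versions as listed above; adjust to what module avail reports):

$ module load gcc/6.3.0 openmpi/4.0.2 openblas
$ mpicc --version    # the wrapper reports the underlying GCC version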
Modules for GPUs
In the new software stack, we have installed a Python module which is linked with the CUDA, cuDNN and NCCL libraries and contains machine learning / deep learning packages such as scikit-learn, TensorFlow and PyTorch. This module can be loaded with the following commands:
$ env2lmod
$ module load gcc/6.3.0 python_gpu/3.8.5

The following have been reloaded with a version change:
  1) gcc/4.8.5 => gcc/6.3.0

$ module list

Currently Loaded Modules:
  1) StdEnv            4) cuda/11.0.3    7) python_gpu/3.8.5
  2) gcc/6.3.0         5) cudnn/8.0.5
  3) openblas/0.2.20   6) nccl/2.7.8-1
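Since the module bundles TensorFlow and PyTorch (see above), a quick sanity check is to query the GPU from Python. Run this on a GPU node; on a login node without a GPU it will simply report that no device is available:

$ python -c 'import torch; print(torch.cuda.is_available())'
$ python -c 'import tensorflow as tf; print(tf.config.list_physical_devices("GPU"))'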
You can also find and load the CUDA, cuDNN and NCCL versions available on the cluster that match your needs.
$ env2lmod
$ module load gcc/6.3.0
$ module avail cuda

------------- /cluster/apps/lmodules/Compiler/gcc/6.3.0 -------------
   cuda/8.0.61    cuda/9.2.88     cuda/11.0.3 (L)
   cuda/9.0.176   cuda/10.0.130   cuda/11.1.1
   cuda/9.1.85    cuda/10.1.243   cuda/11.2.2 (D)
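To use a specific version from that list instead of the default, name it explicitly, e.g.:

$ module load gcc/6.3.0 cuda/11.1.1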
Application lists
| Bioinformatics & life science | Finite element methods | Machine learning | Multi-physics phenomena | Quantum chemistry & molecular dynamics | Symbolic, numerical and statistical mathematics | Visualization |
|---|---|---|---|---|---|---|
| BamTools, BLAST, Bowtie, CLC Genomics Server, RAxML, RELION, TopHat | Ansys, Abaqus, FEniCS | PyTorch, scikit-learn, TensorFlow, Theano | AnsysEDT, COMSOL Multiphysics, STAR-CCM+ | ADF, AmberTools, Gaussian, Molcas, Octopus, ORCA, Qbox, TURBOMOLE | Gurobi, Mathematica, MATLAB, R, Stata | FFmpeg, ParaView, VisIt, VTK |
| Compilers | Programming languages | Scientific libraries | Solvers | MPI libraries | Build systems | Version control |
|---|---|---|---|---|---|---|
| GCC, Intel | C, C++, Fortran, Go, Java, Julia, Perl, Python, Ruby, Scala | Boost, Eigen, FFTW, GMP, GSL, HDF5, MKL, NetCDF, NumPy, OpenBLAS, SciPy | PETSc, Gurobi, Hypre, Trilinos | Open MPI, Intel MPI, MPICH | GNU Autotools, CMake, qmake, make | CVS, Git, Mercurial, SVN |
Example
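A minimal end-to-end sketch, using the module versions shown above (the script name run_training.py is a placeholder for your own code):

$ env2lmod                                  # switch to the new software stack
$ module load gcc/6.3.0 python_gpu/3.8.5    # load the GPU-enabled Python module
$ module list                               # confirm what is loaded
$ python run_training.py                    # run your own script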
Further reading
- User guide: Setting up your environment
- Setting up a software stack for a research group
- Creating a local module directory
- Unpacking RPM packages in user space