Containers at NHR



General

Many users (especially new users) want to run Docker containers on the NHR infrastructure. However, the NHR centers provide Singularity/Apptainer as the container framework, and it cannot run Docker containers natively. Singularity and Apptainer commands can be used mostly interchangeably.
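For example, on many installations the singularity command is provided as a compatibility alias for Apptainer, so both of the following typically report the same runtime (the exact output depends on the system):

apptainer --version
singularity --version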



Converting Containers

While it is not possible to run Docker containers directly with the Singularity/Apptainer framework, there are several methods to convert a Docker container into another format (e.g. SIF or OCI) and then use the converted container with Apptainer.

Build a SIF container from a container in a Docker registry

One can use the singularity build command to build a Singularity container directly from a container available in a Docker registry.

Example:

singularity build lolcow.sif docker://godlovedc/lolcow
singularity run lolcow.sif

Build a SIF container from a Docker container archive file

If you have a locally developed Docker container, or a container from a registry with local changes, you can save it as a Docker archive. You can then use singularity build to build a Singularity container from the archive.

Example:

singularity build lolcow_from_archive.sif docker-archive://$(pwd)/lolcow_docker_archive.tar
singularity run lolcow_from_archive.sif
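
If the archive does not exist yet, it can usually be created on a machine where Docker itself is available, for example with docker save (the image name godlovedc/lolcow is taken from the example above; replace it with your own image):

docker save godlovedc/lolcow -o lolcow_docker_archive.tar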

Use spython to translate a Docker recipe into a Singularity recipe and build from that recipe

In some cases, only a Docker recipe (also called a Dockerfile) might be available, which is usually used to build a Docker container locally. Apptainer and Singularity can also build containers from such recipes, but the format is different. The Singularity project provides a tool (spython) to translate a Docker recipe into a Singularity recipe.

Example:

git clone https://github.com/GodloveD/lolcow
spython recipe lolcow/Dockerfile ./sing.lolcow
singularity build --fakeroot lolcow_from_dockerfile.sif sing.lolcow
singularity run lolcow_from_dockerfile.sif
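
For orientation, the translated sing.lolcow is a Singularity definition file. The following is a minimal, hand-written sketch of that format, assuming a simple Dockerfile that installs a package, sets an environment variable, and defines a run command; the actual output of spython depends entirely on the Dockerfile:

Bootstrap: docker
From: ubuntu:22.04

%post
    # corresponds to the Dockerfile's RUN instructions
    apt-get update && apt-get install -y curl

%environment
    # corresponds to the Dockerfile's ENV instructions
    export GREETING="hello from the container"

%runscript
    # corresponds to the Dockerfile's CMD/ENTRYPOINT
    echo "$GREETING"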

If spython is not natively available on the HPC system, you can often install it in a virtual environment:

python -m venv myenv
source myenv/bin/activate
pip install --upgrade pip
pip install spython
deactivate


Running MPI Containers

There are two ways to run MPI applications via containers. One method is to call MPI externally and essentially use the container in place of an MPI application binary. The other method is to call MPI from within the container.

Calling MPI externally

In this approach the containerized application is launched by the external MPI installation on the host, which takes care of the initialization. The MPI library inside the container, against which the application was built, has to be compatible with the host MPI (ideally the same version; compatibility can also be achieved by binding the host library into the container) and connects back to the external launcher.
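
A typical invocation follows the sketch below, where my_mpi_app.sif and /opt/app/mpi_hello are placeholder names for the container and the MPI binary inside it (the actual launcher, process count, and options depend on the host MPI and the batch system):

mpirun -np 4 singularity exec my_mpi_app.sif /opt/app/mpi_hello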

Using the internal MPI

Work in progress


NHR Container Repository

NHR@Göttingen provides a container repository for the containers offered by all NHR centers. The containers there can be used easily and perform well on the specified hardware.

apptainer run --nv \
docker://docker.gitlab-ce.gwdg.de/nhr-container/container-repository/gwdg/gwdg-grete.allgpu-app_mpi:latest

INFO:    Using cached SIF image
Apptainer> nvidia-smi
...
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 535.104.12             Driver Version: 535.104.12   CUDA Version: 12.2     |
|-----------------------------------------+----------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id        Disp.A | Volatile Uncorr. ECC |
|=========================================+======================+======================|
|   0  NVIDIA A100-SXM4-40GB          On  | 00000000:AF:00.0 Off |                  Off |
...
Apptainer> spack --version
0.20.1
Apptainer> mpirun --version
mpirun (Open MPI) 4.1.5
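
To avoid relying on the image cache for every run, the container can also be pulled once into a local SIF file and started from that file (the local file name gwdg-grete-allgpu.sif is chosen here only for illustration):

apptainer pull gwdg-grete-allgpu.sif \
docker://docker.gitlab-ce.gwdg.de/nhr-container/container-repository/gwdg/gwdg-grete.allgpu-app_mpi:latest
apptainer run --nv gwdg-grete-allgpu.sif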