Submitting a Python Job to the Noctua2 GPU Partition

Step 1: Setup the Python environment

Load the Python module, e.g.:

module load lang
module load Python/3.10.4-GCCcore-11.3.0

Change the Python package installation path to your scratch folder, if it is not already set in your .bashrc (see the Python page for more details):

export PYTHONUSERBASE=/scratch/<PROJECT_NAME>/<USER>/.local
export PATH=/scratch/<PROJECT_NAME>/<USER>/.local/bin:$PATH

(Don’t forget to replace <PROJECT_NAME> and <USER> with your project name and username)
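If you want these exports to apply automatically in future shell sessions, you can append them to your .bashrc, for example (a minimal sketch; the single quotes prevent $PATH from being expanded at append time):

echo 'export PYTHONUSERBASE=/scratch/<PROJECT_NAME>/<USER>/.local' >> ~/.bashrc
echo 'export PATH=/scratch/<PROJECT_NAME>/<USER>/.local/bin:$PATH' >> ~/.bashrc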

Install PyTorch and NumPy into your local environment:

pip install -U numpy torch
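To check that both packages ended up in your scratch-local environment, you can run a quick import test on the login node (just a sanity check; it prints the versions and the installation path, since a GPU is only visible on the compute nodes):

python -c "import numpy, torch; print('NumPy', numpy.__version__); print('PyTorch', torch.__version__); print('installed at', torch.__file__)"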

Step 2: Create your Python script

Create a Python script that performs some PyTorch operations on the GPU, e.g.:

import torch

# Use the GPU if one is available, otherwise fall back to the CPU
use_cuda = torch.cuda.is_available()
device = torch.device("cuda" if use_cuda else "cpu")

print("Using device:", device)

# Create two random matrices on the selected device
A = torch.randn(500, 400).to(device)
B = torch.randn(400, 200).to(device)

# Multiply them and sum the result into a Python float
C = torch.sum(torch.mm(A, B)).item()

print("Result:", C)

Save it, for example, as /scratch/<PROJECT_NAME>/<USER>/pytorch_test_project/test.py

Step 3: Create a jobscript and submit it to the Slurm queue

Write a jobscript to run your Python script on a GPU compute node. Here is an example jobscript for Noctua2; the #SBATCH directives request one A100 GPU in the gpu partition for a maximum runtime of five minutes:

#!/bin/bash

#SBATCH --job-name="test job"
#SBATCH --time=00:05:00
#SBATCH --gres=gpu:a100:1 
#SBATCH --partition=gpu

module load lang
module load Python/3.10.4-GCCcore-11.3.0

export PYTHONUSERBASE=/scratch/<PROJECT_NAME>/<USER>/.local
export PATH=/scratch/<PROJECT_NAME>/<USER>/.local/bin:$PATH

python test.py

(Don’t forget to replace <PROJECT_NAME> and <USER> with your project name and username)

Save it in the same folder where your Python script is located, e.g. /scratch/<PROJECT_NAME>/<USER>/pytorch_test_project/jobscript.sh

Submit the jobscript by calling:

sbatch jobscript.sh
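sbatch prints the ID of the submitted job. You can watch the job in the queue and, once it has finished, inspect its output; by default Slurm writes stdout to a file named slurm-<JOBID>.out in the directory you submitted from (replace <JOBID> with the ID printed by sbatch):

squeue -u $USER

cat slurm-<JOBID>.out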

Using GPUs interactively for development/testing

If you would like to use GPUs for development or testing, it is recommended to do so within an interactive Slurm session on the DGX. More information can be found at https://uni-paderborn.atlassian.net/wiki/spaces/PC2DOK/pages/12944324/Running+Compute+Jobs#Using-GPUs-for-Development-and-Testing-Purposes.

Please keep in mind that you also need to set up the Python environment in the interactive session as described in Step 1 (see above).
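For example, an interactive GPU shell can be requested with srun. The options below are only a sketch that reuses the partition and GRES names from the jobscript above; the linked page lists the values that actually apply to the DGX nodes:

srun --partition=gpu --gres=gpu:a100:1 --time=01:00:00 --pty bash

Inside that shell, load the Python module and export the PYTHONUSERBASE and PATH variables from Step 1 before starting Python.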
