
PC² JupyterHub

The JupyterHub service is available for Noctua 1 and Noctua 2.

Access

The JupyterHub can be reached at the following address:

Noctua 2: https://jh.pc2.uni-paderborn.de

Noctua 1: https://jh.noctua1.pc2.uni-paderborn.de

The JupyterHub can be accessed via VPN or on-site at the University of Paderborn.


Quick Start

The following spawn hosts/resources are available; each entry provides a "Quick Start" link that launches the session directly:

  • Jupyter Session on Noctua 2

  • Jupyter Session on Noctua 1

  • Jupyter Notebook on Noctua 2 (inside a Slurm job, 1 h runtime, normal partition)

  • Jupyter Notebook on Noctua 2 (inside a Slurm job, 1 h runtime, gpu partition)

  • Jupyter Notebook on Noctua 1 (inside a Slurm job, 1 h runtime, normal partition)

  • Jupyter Notebook on Noctua 1 (inside a Slurm job, 1 h runtime, gpu partition, 1x A40)

Server Options

Presets

You can create a preset with predefined start options for yourself or your project group.


Click here to list your presets: https://jh.pc2.uni-paderborn.de/services/presets/

Simple


Preset environments with predefined values that determine how the Jupyter Notebook is started.

Default and self-created apptainer containers can be used.

Advanced (Slurm)

An advanced view with options that control how the Slurm job is started on the HPC cluster.
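The fields in this view correspond to the usual Slurm job parameters. As a rough sketch (the values mirror the Quick Start presets above; the GPU gres name is an assumption, not taken from the cluster configuration):

#SBATCH --partition=normal     # or: gpu
#SBATCH --time=01:00:00        # 1 h runtime
#SBATCH --gres=gpu:a40:1       # gpu partition only, e.g. one A40 on Noctua 1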

Loading additional Jupyter kernels

You can load additional Jupyter kernels using Lmod (the module system). The kernels currently available can be browsed in the Lmod panel of the JupyterLab interface.

If you need newer kernel versions or other programming languages, you are welcome to contact pc2-support!
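In a terminal inside JupyterLab, this could look as follows (the module name is a placeholder, not an actual module on the clusters):

# list the modules available in the current environment
module avail
# load the module that provides the desired kernel
module load <KERNEL_MODULE>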

Apptainer (Singularity) Container

In JupyterHub it is possible to launch Jupyter Notebook instances inside a Singularity container. This has the advantage that you can use an environment you have built yourself. When starting a container, arbitrary directories can be mounted inside the container environment.

We provide a set of default Singularity containers:

jupyter_scientific_python
  Kernels available: Python

jupyter_datascience
  Kernels available: Julia, Python, R
  Installed software:
    • All from "jupyter_scientific_python"
    • rpy2 package
    • The Julia compiler and base environment
    • IJulia to support Julia code in Jupyter notebooks
    • HDF5, Gadfly, RDatasets packages

To learn more about Singularity, see here: Singularity-Introduction

If you want to build your own Singularity container for JupyterHub, see here: https://upb-pc2.atlassian.net/wiki/spaces/PC2DOK/pages/1903131/JupyterHub#Create-my-own-Singularity-container
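To illustrate what JupyterHub does in the background when it starts a notebook inside a container with additional mounts, here is a minimal sketch (the bind path and container name are examples; jupyterlab must be installed in the container):

apptainer exec \
  --bind /scratch/hpc-prf-project:/scratch/hpc-prf-project \
  $HOME/.jupyter/pc2-jupyterhub/jupyter_datascience.sif \
  jupyter lab --no-browser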

Remote Desktop (Graphical Environment via Xpra)

To create a remote desktop environment, click on "Desktop Environment" in the JupyterLab interface.

When you click on the tile "Xpra Desktop", a remote desktop environment is set up in the background. Graphical applications (e.g. loaded via modules) can then be launched from the graphical terminal that opens.
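From that terminal you can, for example, load and start a graphical application (the module and program names are placeholders):

module load <GUI_APPLICATION_MODULE>
<gui-application> &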

How-To

Create custom IPython kernel inside custom conda environment

  1. Create a conda environment as described here:

    1. Python & Python Package Management

  2. conda activate <your_conda_env>

  3. conda install ipykernel

  4. python -m ipykernel install --user --name <KERNELNAME> --display-name "<DISPLAY NAME>"
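Put together, creating and registering the kernel could look like this (the environment and kernel names are examples):

# activate the conda environment that should back the kernel
conda activate my-project-env
# install the kernel machinery into the environment
conda install ipykernel
# register the kernel with Jupyter under your user account
python -m ipykernel install --user --name my-project-env --display-name "Python (my-project-env)"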

Create my own Apptainer/Singularity container

Container package requirements

  • python >= 3.10

    • jupyterhub

    • optional, but useful: jupyterlab

Example Apptainer/Singularity recipe

See also: Build containers: Apptainer

Base recipe
Bootstrap: docker
From: debian

%post
apt -y update
export DEBIAN_FRONTEND=noninteractive
apt -y install zsh locales python3 python3-pip
localedef -i en_US -c -f UTF-8 -A /usr/share/locale/locale.alias en_US.UTF-8

# jupyterhub is required inside the container (see the package requirements above);
# on recent Debian images pip may additionally need --break-system-packages
python3 -m pip install jupyterhub
Install custom Python kernel inside the container (python 3.12)
# build a dedicated Python 3.12 from source under /opt
mkdir /opt/python3.12
cd /opt/python3.12

# build dependencies for compiling CPython
apt -y install build-essential libssl-dev zlib1g-dev libbz2-dev libreadline-dev libsqlite3-dev wget curl llvm libncurses5-dev libncursesw5-dev xz-utils tk-dev libffi-dev liblzma-dev python3-openssl git

wget https://www.python.org/ftp/python/3.12.0/Python-3.12.0.tgz
tar -xf Python-3.12.0.tgz
rm Python-3.12.0.tgz
cd Python-3.12.0/
./configure --enable-optimizations
make -j 8
# altinstall avoids overwriting the distribution's python3
make altinstall

python3.12 --version
python3.12 -m pip install --upgrade pip
python3.12 -m pip install ipykernel

# finally, register the IPython kernel into the container's Jupyter prefix
python3.12 -m ipykernel install --sys-prefix --name <UNIQUE_KERNEL_NAME> --display-name "<KERNEL DISPLAY NAME>"
Install Lmod with the JupyterLab-Lmod extension
# Lua and the posix bindings are prerequisites for Lmod
apt -y install lua5.3 lua-posix

# place the lua-posix library where Lmod expects to find it
mkdir -p /usr/lib64/lua/5.3
cp /usr/lib/x86_64-linux-gnu/liblua5.3-posix.so.1 /usr/lib64/lua/5.3/posix.so

# note: Lmod itself and the JupyterLab extension (pip package "jupyterlmod")
# must be installed in addition to these prerequisites
python3 -m pip install jupyterlmod
Install Slurm Tools inside my container
# create the munge and slurm system users/groups with fixed IDs
# (matching the cluster setup) so that Slurm tools bind-mounted
# from the host can authenticate and run inside the container
groupadd --gid 351 munge
groupadd --gid 567 slurm
useradd -d /var/run/munge -M --gid 351 --uid 994 --shell /sbin/nologin munge
useradd -d /opt/software/slurm -M --gid 567 --uid 567 --shell /bin/false slurm
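
With the recipe assembled in a definition file (the filename below is an example), the container can then be built with Apptainer:

apptainer build jupyter_container.sif jupyter_container.def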

Container Location

All containers with the .sif extension are automatically detected in $HOME/.jupyter/pc2-jupyterhub/

Your newly built container must therefore be placed in this directory in your $HOME: $HOME/.jupyter/pc2-jupyterhub/

Alternatively, you can create a symbolic link from your $PC2PFS to your $HOME directory:

mkdir -p $HOME/.jupyter/pc2-jupyterhub
ln -s /scratch/hpc-prf-project/jupyter_container.sif $HOME/.jupyter/pc2-jupyterhub/
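
To check which containers JupyterHub will detect, you can simply list the directory:

ls $HOME/.jupyter/pc2-jupyterhub/*.sif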

View Slurm job logs

If the path of the Slurm job output has not been changed explicitly, it can be found at the following default locations:

Noctua 1: $HOME/.jupyter/last_jh_noctua1.log

Noctua 2: $HOME/.jupyter/last_jh_noctua2.log
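
The logs are plain text files, so a running session can be followed with standard tools, for example:

tail -f $HOME/.jupyter/last_jh_noctua2.log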

PC² Support

If you have any problems that you cannot solve yourself, please contact pc2-support@uni-paderborn.de.
