
PC² JupyterHub

The JupyterHub service is available for Noctua 1 and Noctua 2.

Access

The JupyterHub can be reached at the following addresses:

Noctua 1: https://jh.noctua1.pc2.uni-paderborn.de

Noctua 2: https://jh.pc2.uni-paderborn.de

The JupyterHub can be accessed via VPN or on-site at the University of Paderborn.

Quick Start

Spawn host/resources                                               Start
Jupyter Session on Noctua 2                                        Quick Start
Jupyter Session on Noctua 1                                        Quick Start
Jupyter Notebook on Noctua 2 (1h runtime, normal partition)        Quick Start
Jupyter Notebook on Noctua 2 (1h runtime, gpu partition)           Quick Start
Jupyter Notebook on Noctua 1 (1h runtime, normal partition)        Quick Start
Jupyter Notebook on Noctua 1 (1h runtime, gpu partition - 1x A40)  Quick Start

Server Options

Simple

Pre-set environments with predefined settings for starting the Jupyter Notebook.

Default and self-created Singularity containers can be used.

Advanced (Slurm)

An advanced view with options for configuring how a Slurm job should be started on the cluster.

Expert (Slurm)

An expert view with a free text field where you can specify additional Slurm flags or load custom environments, as sketched below.
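For example, assuming the free text field accepts standard Slurm batch directives, entries like the following could be used; all values are illustrative and depend on your project and cluster:

#SBATCH --time=02:00:00        # requested wall time
#SBATCH --partition=normal     # target partition
#SBATCH --cpus-per-task=4      # CPU cores for the notebook server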

Singularity Container

In JupyterHub it is possible to launch Jupyter Notebook instances inside a Singularity container. This has the advantage that you can use your own pre-built environment. When starting a container, arbitrary directories can be mounted into the container environment.
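On the command line, the same bind-mount mechanism looks like this; the paths and container name are examples only, not a prescribed layout:

$ singularity exec --bind /scratch/my-project:/data jupyter_container.sif ls /data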

We provide a set of default Singularity containers:

Container name              Kernels available   Installed software
jupyter_scientific_python   Python
jupyter_datascience         Julia, Python, R    • All from "jupyter_scientific_python"
                                                • rpy2 package
                                                • The Julia compiler and base environment
                                                • IJulia to support Julia code in Jupyter notebooks
                                                • HDF5, Gadfly, RDatasets packages

To learn more about Singularity, see here: Singularity-Introduction

If you want to build your own Singularity container for JupyterHub, see here: Create my own Singularity container

Remote Desktop (Graphical Environment via noVNC)

To create a remote desktop environment, click on "Desktop Environment" in the JupyterLab interface.

The Remote Desktop feature is available for locally running notebooks as well as for Noctua 1 (Slurm job) and Noctua 2 (Slurm job) instances.

How-To

Loading software modules using JupyterLab

To load software modules inside JupyterLab, click on the Lmod extension tab. There you can search for, load, and unload modules.

If you are using the Classic Notebook view, click on the "Softwares" tab to load software modules.
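The same Lmod operations are also available from a terminal inside JupyterLab; the module name below is a placeholder for whatever your cluster provides:

$ module avail                 # list available modules
$ module load lang/Python      # load a module (example name)
$ module list                  # show currently loaded modules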

Create my own Singularity container

Installing Jupyter tools

You do not need to install the Jupyter client tools inside your Singularity container.

If the file /opt/conda/bin/jupyterhub-singleuser does not exist inside your container, JupyterHub bind-mounts its own tools into your container at run time.

If you want to manage your own Jupyter tools/extensions, please make sure that /opt/conda/bin/jupyterhub-singleuser exists inside your Singularity container.
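A minimal Singularity definition file that satisfies this requirement could look as follows; the base image and package selection are assumptions, not a prescribed setup:

Bootstrap: docker
From: condaforge/miniforge3

%post
    # Miniforge installs conda to /opt/conda, so this places
    # /opt/conda/bin/jupyterhub-singleuser inside the container.
    /opt/conda/bin/conda install -y jupyterhub jupyterlab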

Using Docker stacks

It is possible to build Singularity containers from the official Jupyter Docker stacks:

https://jupyter-docker-stacks.readthedocs.io/en/latest/

More information on how to build a Singularity container from Docker Hub can be found here:

https://sylabs.io/guides/3.7/user-guide/build_a_container.html
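For example, building a container from one of the Docker stacks comes down to a single command; the image tag is an example, pick the stack you need:

$ singularity build jupyter_datascience.sif docker://jupyter/datascience-notebook:latest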

Container Location

Your newly created container must be placed in your $HOME directory under: $HOME/.jupyter/pc2-jupyterhub/

Alternatively, you can create a symbolic link from your $PC2PFS to your $HOME directory:

$ ls -l /scratch/pc2-mitarbeiter/mawi/jupyter_container.sif
  -rw-r--r--. 1 mawi pc2-mitarbeiter 0 Dec 17 07:53 /scratch/pc2-mitarbeiter/mawi/jupyter_container.sif
$ ln -s /scratch/pc2-mitarbeiter/mawi/jupyter_container.sif $HOME/.jupyter/pc2-jupyterhub/

All containers with the .sif extension are detected automatically in $HOME/.jupyter/pc2-jupyterhub/.

Troubleshooting

View Slurm job logs

If the path of the Slurm job output has not been changed explicitly, the log can be found at the following default locations:

Noctua 1: $HOME/.jupyter/last_jh_noctua1.log

Noctua 2: $HOME/.jupyter/last_jh_noctua2.log
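To inspect a failing spawn, you can follow the log while the job starts; the Noctua 2 path is used as an example:

$ tail -f $HOME/.jupyter/last_jh_noctua2.log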

"HubAuth._api_request" was never awaited

This is a known version conflict caused by a feature change within JupyterHub.

For more information, see: https://github.com/jupyterhub/batchspawner/pull/247

We are waiting for the pull request to be merged.

“Terminals unavailable”

If you have terminado installed in your $HOME directory (pip3 install --user), please make sure that the terminado version is at least 0.8.3.
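You can check the installed version and upgrade it in place; both commands only touch the user-level installation:

$ pip3 show terminado                                  # prints the installed version
$ pip3 install --user --upgrade 'terminado>=0.8.3'     # upgrades if necessary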

PC² Support

If you have any other problems that cannot be solved, please contact pc2-support@uni-paderborn.de.
