Auto-loading the JuliaHPC module

On Noctua, the julia executable is only available after loading a Julia module (see Julia: Getting Started). To use the Julia VS Code extension within a VS Code SSH remote session, you must ensure that a Julia module is loaded automatically when the Julia Language Server starts (i.e. when opening or creating a Julia file) or when you open the Julia REPL (via Julia: Start REPL). This is done by pointing the extension to a wrapper script that loads the module and then starts julia.

We provide a default wrapper script under /opt/software/pc2/julia/julia_vscode on Noctua 2 (/cm/shared/apps/pc2/julia/julia_vscode on Noctua 1), and you should point the Julia VS Code extension to this wrapper by specifying the Julia: Executable Path (julia.executablePath) setting.
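In your (remote) VS Code settings.json, this might look as follows (a minimal sketch using the Noctua 2 path from above):

Code Block
languagejson
{
    // Point the Julia extension to the default PC2 wrapper (Noctua 2 path)
    "julia.executablePath": "/opt/software/pc2/julia/julia_vscode"
}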

Note that the default wrapper above automatically loads the latest version of the JuliaHPC module and, correspondingly, the latest Julia version. If you want to use a specific version, you may point the Julia VS Code extension to the version-specific wrapper scripts that we provide in the root directory of the module (given by the environment variable $EBROOTJULIAHPC after you’ve loaded the module). Example: /opt/software/pc2/EB-SW/software/JuliaHPC/1.8.2-foss-2022a-CUDA-11.7.0/julia_vscode on Noctua 2.
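For instance, to locate the version-specific wrapper of a particular module (module version taken from the example above):

Code Block
languagebash
# Load a specific JuliaHPC module and print the path of its wrapper script,
# which can then be used as the julia.executablePath setting
module load lang/JuliaHPC/1.8.2-foss-2022a-CUDA-11.7.0
echo "$EBROOTJULIAHPC/julia_vscode"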

Jupyter notebooks (in VS Code)

  • Open a terminal and ssh into Noctua 2.

  • Load the Julia or JuliaHPC module that you want to use, e.g. module load lang/JuliaHPC/1.8.5-foss-2022a-CUDA-11.7.0.

  • Start julia and install the IJulia package into the default, global environment (i.e. v1.8 in this example); see the condensed example after this list.

  • Afterwards, run using IJulia followed by IJulia.installkernel("Noctua 2 JuliaHPC").

  • Finally, open VS Code, open a remote session on Noctua 2, and select your custom IJulia kernel when running Jupyter notebook files.
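Condensed into a single shell snippet, the steps above might look like this (module version as in the example; the kernel name is freely choosable):

Code Block
languagebash
# On a Noctua 2 login node: load the module, install IJulia into the
# default global environment, and register a custom Jupyter kernel
module load lang/JuliaHPC/1.8.5-foss-2022a-CUDA-11.7.0
julia -e 'using Pkg; Pkg.add("IJulia"); using IJulia; IJulia.installkernel("Noctua 2 JuliaHPC")'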

Note: If you plan to use your custom IJulia kernel in our centrally hosted JupyterHub service, use IJulia.installkernel("Noctua 2 JuliaHPC"; env=Dict("JULIA_DEPOT_PATH"=>ENV["JULIA_DEPOT_PATH"])).

Julia wrapper: manual approach (not recommended!)

Specifically, create a file julia_wrapper.sh with the following content (an example julia wrapper script for Noctua 2):

Code Block
languagebash
#!/bin/bash
# --- Make the `module` command available ----------------------
export MODULEPATH=/etc/modulefiles:/usr/share/modulefiles
source /usr/share/lmod/lmod/init/profile
if [ -f "/opt/software/pc2/lmod/modules/DefaultModules.lua" ]; then
    export MODULEPATH="$MODULEPATH:/opt/software/pc2/lmod/modules"
    export LMOD_SYSTEM_DEFAULT_MODULES="DefaultModules"
else
    if [ -f "/usr/share/modulefiles/StdEnv.lua" ]; then
        export LMOD_SYSTEM_DEFAULT_MODULES="StdEnv"
    fi
fi
module --initial_load restore
# ---------------------------------------------------------------

# Load the latest JuliaHPC module
module load lang
module load JuliaHPC # or: module load Julia

# Replace this shell with the actual julia process
exec julia "${@}"

Afterwards, make the wrapper executable (e.g. via chmod u+x julia_wrapper.sh) and make the "Executable Path" setting of the Julia extension (julia.executablePath) point to this file. (Note: The first block of the script makes the module command available.)

Using a direnv environment with the integrated Julia REPL

Modify the script above to the following:

...

This will load the direnv environment when starting the integrated Julia REPL (and only the JuliaHPC module when starting the Julia Language Server).
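Since the concrete script isn’t reproduced here, the following is only a rough, hypothetical sketch of how such a case distinction might look. In particular, checking the arguments for a languageserver component is an assumption about how the extension invokes julia, and direnv export is used to load the workspace environment; verify both against your setup:

Code Block
languagebash
# Hypothetical sketch: load direnv only for interactive REPL sessions
if [[ "$*" == *languageserver* ]]; then
    # Julia Language Server: only the JuliaHPC module
    module load lang JuliaHPC
else
    # Integrated Julia REPL: JuliaHPC module plus the direnv environment
    module load lang JuliaHPC
    eval "$(direnv export bash)"
fi
exec julia "${@}"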

Noctua 1

You can use the same approach as above, but the module-related paths are different. Specifically, on Noctua 1 the module part should be:

Code Block
languagebash
# ------------------------------------------------------------
export MODULEPATH=/cm/shared/apps/pc2/lmod/modules:/cm/shared/apps/pc2/EB-SW/modules/all
source /usr/share/lmod/lmod/init/profile
if [ -f "/cm/shared/apps/pc2/lmod/modules/DefaultModules.lua" ]; then
    export LMOD_SYSTEM_DEFAULT_MODULES="DefaultModules"
else
    if [ -f "/usr/share/modulefiles/StdEnv.lua" ]; then
        export LMOD_SYSTEM_DEFAULT_MODULES="StdEnv"
    fi
fi
module --initial_load restore
# ------------------------------------------------------------

VS Code on compute nodes

We recommend the following two-step process:

...

First, open a terminal, log in to the cluster, and request an interactive session on one of the compute nodes (a sample command follows after this list).

  • Remember the name of the compute node that was assigned to you, e.g. n2cn1234.

  • Keep the terminal open until you’re done with your work.
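For example, with Slurm this could look as follows (partition defaults, time, and resources are placeholders; adjust them to your needs):

Code Block
languagebash
# Request an interactive session on a compute node
srun --nodes=1 --ntasks=1 --cpus-per-task=16 --time=02:00:00 --pty bash
# The shell prompt then shows the assigned node, e.g. n2cn1234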

Second, use VS Code’s remote extension to connect to the compute node via SSH.

...

For this to work, you need to be able to reach the compute node directly via ssh n2cn1234. To avoid many entries in your ~/.ssh/config (one for each compute node), you can use the following wildcard-based entries for Noctua 1 and 2 (the jump hosts are defined here):

...

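The concrete entries are elided above, but as a rough, hypothetical sketch (host pattern derived from the node name n2cn1234 mentioned earlier; user name and jump host are placeholders, see the linked jump host definitions for the real values), such a wildcard entry could look like:

Code Block
languagebash
# Hypothetical ~/.ssh/config sketch for Noctua 2 compute nodes:
# match all nodes via a wildcard and reach them through a jump host
Host n2cn*
    User <your-username>
    ProxyJump <noctua2-jump-host>
# An analogous entry (with the respective node-name pattern and
# jump host) covers Noctua 1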