# Partitions
On the PC2 cluster systems, partitions are used to distinguish compute nodes with different hardware. No partition has a higher priority than another; priorities are handled via Quality-of-Service (QOS) instead.
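Since partition choice only selects hardware, a quick way to see what is available is the standard Slurm tooling. A minimal sketch, assuming the usual Slurm client commands are on your `PATH` (output will differ per cluster):

```shell
# List all partitions with their time limits and node counts
sinfo --summarize

# Priorities come from the QOS attached to your account, not the partition.
# Show the associations (account, partition, QOS) available to you:
sacctmgr show associations user=$USER format=Account,Partition,QOS
```

These commands are read-only and safe to run on any login node.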
## Partitions on Noctua 1
The Noctua 1 cluster is composed of:
Node Spec | Partition | Job Time Limit | Naming | Count | CPU | Highest Instruction Set | Sockets | Cores | SMT | Main Memory | Interconnect
---|---|---|---|---|---|---|---|---|---|---|---
login nodes | | | ln-000[1-2] | 2 | | AVX-512 | 2 | 2x20 | off | 192 GB DDR4 | Intel Omni-Path 100 Gbps, 1:1.4 blocking factor
normal nodes | normal | 21 days | cn-[0001-0256] | 256 | | | | | | |
GPU nodes | gpu | 7 days | gpu-[0001-0018] | 36 (2 GPUs per node) | | | | | | |
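A job is placed on one of these partitions via the `--partition` option of `sbatch`. A minimal sketch for the `normal` partition on Noctua 1, using the core count and time limit from the table above (`./my_application` is a hypothetical executable):

```shell
#!/bin/bash
#SBATCH --partition=normal      # CPU partition from the table above
#SBATCH --time=1-00:00:00       # 1 day; must stay below the 21-day partition limit
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=40    # 2x20 cores per node, SMT off
srun ./my_application           # hypothetical executable
```

Submit with `sbatch jobscript.sh`; jobs exceeding the partition's time limit are rejected at submission.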
## Partitions on Noctua 2
The Noctua 2 cluster is composed of:
Node Spec | Partition | Job Time Limit | Naming | Node Count | CPU | Highest Instruction Set | Sockets | Cores | SMT | Main Memory in GB of DDR4 | Accelerators | Node-local Storage | Interconnect
---|---|---|---|---|---|---|---|---|---|---|---|---|---
login nodes | | | n2login[1-6] | 6 | | AVX2 | 1 | 64 | on | 512 | | | Infiniband HDR 100
normal nodes | normal | 21 days | n2cn[01-11][01-96] | 990 | | | 2 | 2x64 | off | 256 (240 usable) | | |
large-memory nodes | largemem | 21 days | n2lcn01[01-66] | 66 | | | | | | 1024 (950 usable) | | |
 | hugemem | 21 days | n2hcn01[01-05] | 5 | | | | | | 2048 (1900 usable) | | 12x 3 TB NVMe SSDs |
GPU nodes | gpu | 7 days | n2gpu12[01-32] | 32 | | | | | | 512 (485 usable) | | | 2x Infiniband HDR 200
DGX A100 | dgx | | n2dgx01 | 1 | AMD Rome 7742 | | | | | 1024 (950 usable) | | 4x 3.84 TB NVMe SSDs | 4x Infiniband HDR 200
FPGA nodes with Xilinx FPGAs | fpga | 7 days | n2fpga[01-16] | 16 | | | | | | 512 (485 usable) | 3x Xilinx Alveo U280 cards | | Infiniband HDR 100
FPGA nodes with Intel FPGAs | | 7 days | n2fpga[18-34] | 16 | | | | | | | 2x Bittware 520N cards | |
FPGA nodes with custom configurations | | 7 days | n2fpga17, n2fpga[35,36] | 3 | | | | | | | | |
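Because the partitions differ only in hardware, selecting a special node type is again just a matter of naming the partition and requesting the matching resources. A minimal sketch for the `largemem` partition on Noctua 2, based on the memory figures in the table above (`./memory_hungry_app` is a hypothetical executable):

```shell
#!/bin/bash
#SBATCH --partition=largemem    # 1024 GB nodes, 950 GB usable
#SBATCH --time=2-00:00:00       # below the 21-day limit
#SBATCH --nodes=1
#SBATCH --mem=900G              # request most of the usable memory per node
srun ./memory_hungry_app        # hypothetical executable
```

Requesting more memory than a partition's usable amount (e.g. `--mem=1000G` here) would leave the job pending forever, since no node can ever satisfy it.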