# Dark Energy Center (DEC)

Hosted at CPPM, the DEC (Dark Energy Center) cluster is a high-performance computing (HPC) infrastructure integrated within the DANCE platform. It is designed to support data-intensive and compute-intensive research activities across a range of scientific domains.
## Overview

The DEC cluster consists of 30 hyper-threaded machines interconnected via three types of networks:

- 1 Gb/s Ethernet
- 10 Gb/s Ethernet
- 40 Gb/s InfiniBand

Key resources:

- 812 physical cores (1,624 threads)
- 15 TB RAM
- 300 TB disk storage
## Cluster Nodes

### mardec — Master Node
- Cores: 20 (40 threads)
- Memory: 250 GB
- Use: Entry point for users — submit jobs, compile code, manage data.
- ⚠️ Do not run jobs directly on this node.
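The typical workflow is to log in to `mardec` and hand work off to the batch system rather than running it in place. This page does not name the scheduler, so the batch script below is a hypothetical sketch assuming SLURM; the resource numbers mirror the compute-node specs listed below, and the job name is illustrative.

```shell
#!/bin/bash
# Hypothetical SLURM batch script (the scheduler is an assumption, not
# stated on this page). Submit from mardec with: sbatch job.sh
#SBATCH --job-name=demo
#SBATCH --nodes=1
#SBATCH --ntasks=56       # one task per hardware thread on a compute node
#SBATCH --mem=500G        # stay under the 512 GB available per compute node

# The actual work runs on a compute node (mardec01..mardec28), not on mardec.
hostname
```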
### mardec00 — High-Memory Node
- Cores: 28 (56 threads)
- Memory: 1.5 TB
- Note: Slightly slower (~10%) than other nodes. Best used standalone.
### mardec01 to mardec28 — Compute Nodes

- Each node:
  - 28 cores (56 threads)
  - 512 GB RAM
  - 300 GB local disk (`/data`)
## Storage Layout

| Mount Point | Size | Use Case | Notes |
|---|---|---|---|
| `/softdec` | 100 GB | Shared software/tools | Part of main 300 TB RAID-4 disk |
| `/datadec` | 300 TB | Active project data | Not backed up |
| `/roofdec` | 100 TB | Collaborative data storage | For large, active datasets |
| `/loftdec` | 100 TB | Archiving old/completed project data | Long-term, infrequent access |
| `/data` | 860 GB | Local to compute nodes (not RAIDed) | Temp job-specific files |
| `/scratch` | 7 TB | Shared GlusterFS across nodes | For collaborative/temp use |
Note: `/scratch` is distributed over `/data` on several nodes and replicated across 3 hosts. It is ideal for intermediate data exchange between jobs, but not for persistent storage.
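A common pattern for temporary use of `/scratch` is to give each job its own subdirectory and remove it when the job finishes. A minimal sketch (the `SCRATCH_BASE` fallback and the `job.XXXXXX` naming are illustrative, not a site convention; the `/tmp` fallback only lets the snippet run outside the cluster):

```shell
#!/bin/sh
# Per-job scratch directory sketch. On the cluster, SCRATCH_BASE would
# be /scratch; the /tmp fallback is only so this runs anywhere.
SCRATCH_BASE="${SCRATCH_BASE:-/tmp}"
JOB_DIR="$(mktemp -d "${SCRATCH_BASE}/job.XXXXXX")"

echo "intermediate files go in ${JOB_DIR}"
# ... run the job, exchanging intermediates with other jobs via ${JOB_DIR} ...

rm -rf "${JOB_DIR}"   # /scratch is not for persistent storage: clean up
```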
## Network Configuration

### 1G Ethernet

- Node names: `mardecXX`
- Uses: SSH access, job management, system administration

### 10G Ethernet

- Node names: `gigadecXX`
- IPs: `10.0.0.XX`
- Uses: Fast file access, shared storage communication

### InfiniBand 40G

- Node names: `infinidecXX`
- IPs: `10.0.1.XX`
- Uses: Ultra-fast communication for parallel tasks, simulations, memory-bound jobs
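Since every node exposes one hostname per network, it can be handy to derive the fast-network names from the 1G name when scripting transfers or MPI host lists. A small sketch using the naming scheme above (the helper function names are illustrative):

```shell
#!/bin/sh
# Map a node's 1G Ethernet name to its 10G and InfiniBand names
# (helper names are illustrative; the patterns come from this page).
to_giga()   { echo "$1" | sed 's/^mardec/gigadec/'; }     # 10G Ethernet
to_infini() { echo "$1" | sed 's/^mardec/infinidec/'; }   # 40G InfiniBand

to_giga mardec07     # prints gigadec07
to_infini mardec07   # prints infinidec07
```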
## More Information

For detailed documentation on the DEC cluster, visit the DEC user documentation.