Euler Cheat Sheet
November 2025, Nik Zemp, GDC
🔑 Key resources to consider
- 📁 File System
- 🔋 Memory (RAM): --mem-per-cpu
- ⚙️ CPUs: --cpus-per-task
- ⏱️ Runtime: --time
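These limits are set when submitting a job; a minimal example (values are illustrative):
# Request 2 CPUs, 2 GB per CPU and 4 h of run time
sbatch --cpus-per-task=2 --mem-per-cpu=2G --time=4:00:00 submit.tool.slurm.sh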
🔐 Login
ssh <USER>@euler.ethz.ch
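If key-based login is not set up yet, a typical sketch (run locally; assumes OpenSSH):
# Create a key pair and copy the public key to Euler
ssh-keygen -t ed25519
ssh-copy-id <USER>@euler.ethz.ch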
📦 Software Stack
# Source GDC stack
source /cluster/project/gdc/shared/stack/GDCstack.sh
# Show all available tools
module avail
# Load a module
module load <name>
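To inspect or reset the loaded modules (standard Lmod commands):
# List currently loaded modules
module list
# Unload all modules
module purge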
📂 File System
- Use your $HOME for scripts and your own installations.
- Always use $SCRATCH or $TMPDIR for processing your data.
- Copy only final data to the GDC home and GDC projects for safe keeping.
- Archive many small files using tar or zip.
- Compress large outputs.
- Storage is charged annually.
# Show folder size of mapping
du -sh --si mapping
# Count files in raw_data
find raw_data | wc -l
# Archive folder data1
tar cvzf data1.tar.gz data1
# List archive contents
tar ztvf data1.tar.gz
# Extract all
tar xvf data1.tar.gz
# Extract specific folder
tar xvf data1.tar.gz data1/raw
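To compress large outputs (file name is illustrative; pigz, a parallel gzip, may need to be loaded as a module):
# Compress a single large file
gzip big_output.txt
# Parallel compression with 4 threads
pigz -p 4 big_output.txt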
# Data transfer
scp -r data <USER>@euler.ethz.ch:/cluster/scratch/<USER>
scp -r <USER>@euler.ethz.ch:/cluster/scratch/<USER>/data ./
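As an alternative to scp, rsync can resume interrupted transfers (assuming rsync is available on both ends):
# Sync data to your scratch with progress output
rsync -avP data <USER>@euler.ethz.ch:/cluster/scratch/<USER>/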
# Generate checksums
md5sum *fq.gz > md5sums.txt
# Verify checksums
md5sum --check md5sums.txt
🚀 Job Submission
### Submit a job
sbatch < submit.tool.slurm.sh
### Overview of the submitted jobs
jview
### Kill specific job
scancel <Job-ID>/<Array-ID>
### Kill all running jobs
scancel --user=$USER
Interactive job
srun --pty bash
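Resources for interactive sessions can be requested explicitly (illustrative values):
srun --cpus-per-task=2 --mem-per-cpu=2G --time=1:00:00 --pty bash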
Submission script
#!/bin/bash
#SBATCH --job-name=tool        # Job name
#SBATCH --array=1-10%4         # Array tasks 1-10, at most 4 at a time
#SBATCH --ntasks=1             # Number of tasks (always 1 here)
#SBATCH --cpus-per-task=2      # 2 CPUs per task
#SBATCH --mem-per-cpu=2G       # Memory per CPU; 4 GB in total
#SBATCH --time=4:00:00         # Run time limit
#SBATCH --output=tool_%a.log   # Log file per array task
source /cluster/project/gdc/shared/stack/GDCstack.sh
module load <name>
## Get the array task index (1-10) and extract the sample name for this task
IDX=${SLURM_ARRAY_TASK_ID}
SAMPLE=$(sed -n ${IDX}p sample.lst)
## Run the tool on this sample (one array task per sample)
tool -in ${SAMPLE}.in -out ${SAMPLE}.out
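The script assumes sample.lst holds one sample name per line; one way to create it (file naming is illustrative):
# One sample name per line, derived from the raw FASTQ files
for f in raw_data/*.fq.gz; do basename $f .fq.gz; done > sample.lst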
👀 Job Monitoring
Running jobs
jeffrun -r
jeffrun -j <Job-ID>/<Array-ID>
Finished jobs
## Efficiency of a specific job
jeff <Job-ID>/<Array-ID>
## Get an overview of the last 24 hours
jeff24
## Get an overview of the last week
jefflow
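The standard SLURM commands also work (not GDC-specific):
# Your queued and running jobs
squeue -u $USER
# Accounting details for a finished job
sacct -j <Job-ID>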
📊 WebGUI