- Current System Load - CPU, GPU
- Service Alerts
- Maintenance Sessions
- Previous Service Alerts
- System Status Mailings
- FAQ
- Usage statistics
Current System Load - CPU
The plot below shows the status of nodes on the current ARCHER2 Full System service. A description of each of the status types is provided below the plot.
- alloc: Nodes running user jobs
- idle: Nodes available for user jobs
- resv: Nodes in reservation and not available for standard user jobs
- plnd: Nodes planned for use by a future job. If pending jobs can fit in the gap before the future job is due to start, they can run on these nodes (often referred to as backfilling; see the sketch below).
- down, drain, maint, drng, comp, boot: Nodes unavailable for user jobs
- mix: Nodes in multiple states
Note: the long-running reservation visible in the plot corresponds to the short QoS, which is used to support small, short jobs with fast turnaround times.
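To illustrate the backfill check described in the plnd entry above, here is a minimal Python sketch. The `fits_before_reservation` helper and the example times are hypothetical; the real Slurm backfill scheduler considers many more factors (priorities, node counts, topology, multiple reservations).

```python
from datetime import datetime, timedelta

def fits_before_reservation(pending_walltime: timedelta,
                            now: datetime,
                            reservation_start: datetime) -> bool:
    """Hypothetical backfill check: a pending job may run on 'plnd' nodes
    only if it is guaranteed to finish before the planned job starts."""
    return now + pending_walltime <= reservation_start

# Example: a planned job is due to start in 3 hours.
now = datetime(2025, 10, 7, 9, 0)
reservation_start = now + timedelta(hours=3)
print(fits_before_reservation(timedelta(hours=2), now, reservation_start))  # True: fits the gap
print(fits_before_reservation(timedelta(hours=4), now, reservation_start))  # False: would overrun
```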
Current System Load - GPU
- alloc: Nodes running user jobs
- idle: Nodes available for user jobs
- resv: Nodes in reservation and not available for standard user jobs
- plnd: Nodes planned for use by a future job. If pending jobs can fit in the gap before the future job is due to start, they can run on these nodes (often referred to as backfilling).
- down, drain, maint, drng, comp, boot: Nodes unavailable for user jobs
- mix: Nodes in multiple states
Service Alerts
The ARCHER2 documentation also covers some Known Issues which users may encounter when using the system.
Status | Type | Start | End | Scope | User Impact | Reason |
---|---|---|---|---|---|---|
Planned | Service Alert | 2025-10-16 00:00 | | Login node availability | There will be an interruption to login node availability of 5-10 minutes; time TBC. | Login node OS update |
Planned | Service Alert | 2025-10-16 00:00 | | License server availability | There will be an interruption to license server availability of up to two hours; time TBC. | License server work |
Ongoing | Service Alert | 2025-10-07 09:00 | 2025-10-09 14:00 | ARCHER2 queues | Users may observe slightly longer queue times for other work while some nodes are reserved for the Capability QoS. | ARCHER2 Capability Days: the fifth ARCHER2 Capability Days session will run from 7-9 October 2025. |
Ongoing | Service Alert | 2025-10-03 17:30 | 2025-10-06 08:00 | ARCHER2 service | Service at higher risk of disruption than usual. If issues arise, service may take longer to restore. | Yellow weather warning for high winds in the Edinburgh area, leading to travel disruption and a higher than usual risk of power or building damage. |
Ongoing | Issue | 2025-06-25 13:00 | | /work1 file system | An issue with the /work1 file system is currently being investigated. This is only impacting some jobs; examples of errors seen include "Cannot read/write checkpoint; corrupt file, or maybe you are out of disk space". | Under investigation |
Maintenance Sessions
This section lists recent and upcoming maintenance sessions. A full list of past maintenance sessions is available.
No scheduled or recent maintenance sessions
Previous Service Alerts
This section lists the five most recent resolved service alerts from the past 30 days. A full list of historical resolved service alerts is available.
Status | Type | Start | End | Scope | User Impact | Reason |
---|---|---|---|---|---|---|
Resolved | Issue | 2025-09-30 14:00 | 2025-09-30 14:30 | Slurm controller restart | Testing a new configuration with power monitoring. Testing should take around 15-30 minutes to complete; while this is happening, users will be unable to submit jobs or query job status. | Add power monitoring functionality |
Resolved | Service Alert | 2025-09-25 08:30 | 2025-09-26 18:00 | Cooling system | Small risk of service interruption | Essential work on the cooling infrastructure that supports ARCHER2 |
Resolved | Service Alert | 2025-09-23 08:30 | 2025-09-27 08:30 | All parallel jobs launched using srun | All parallel jobs launched using `srun` will have their IO profile captured by the Darshan IO profiling tool. In rare cases this may cause jobs to fail or impact performance. Users can disable Darshan by adding the line `module remove darshan` before they use `srun` in their job submission scripts. | Capturing data on the IO use on ARCHER2 to improve the service. |
Resolved | Service Alert | 2025-09-17 14:00 | 2025-09-18 15:20 | RDFaaS | RDFaaS file systems unavailable (/epsrc, /general) | Underlying storage systems are being updated |
Resolved | Service Alert | 2025-09-17 14:00 | 2025-10-01 12:20 | GPU nodes | GPU nodes are unavailable | HPE are working to bring the GPU nodes back into service |
System Status Mailings
If you would like to receive email notifications about system issues and outages, please subscribe to the System Status Notifications mailing list via SAFE.
FAQ
Usage statistics
This section contains data on ARCHER2 usage for Sep 2025. Access to historical usage data is available at the end of the section.
Usage by job size and length
Queue length data
The colour indicates the scheduling coefficient, which is computed as [run time] divided by [run time + queue time]. A scheduling coefficient of 1 indicates that the job spent no time queuing; a scheduling coefficient of 0.5 means that the job spent as long queuing as it did running.
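As a concrete illustration of the formula above, here is a minimal Python sketch; the `scheduling_coefficient` function name and the example run and queue times are illustrative only.

```python
def scheduling_coefficient(run_time_s: float, queue_time_s: float) -> float:
    """Scheduling coefficient = run time / (run time + queue time)."""
    return run_time_s / (run_time_s + queue_time_s)

# A job that ran for 4 hours after queuing for 1 hour:
print(scheduling_coefficient(4 * 3600, 1 * 3600))  # 0.8

# A job that queued as long as it ran scores 0.5:
print(scheduling_coefficient(3600, 3600))  # 0.5
```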
Software usage data
Plot and table of % use and job step size statistics for different software on ARCHER2 for Sep 2025. This data is also available as a CSV file.
This table shows job step size statistics (in cores, weighted by usage), the total number of job steps, and the percentage of total usage, broken down by software for Sep 2025. A sketch of the usage-weighted calculation is given after the table.
Software | Min (cores) | Q1 (cores) | Median (cores) | Q3 (cores) | Max (cores) | Job steps | Node hours | % use | Users | Projects |
---|---|---|---|---|---|---|---|---|---|---|
Overall | 0 | 512.0 | 2048.0 | 8192.0 | 259200 | 1407188 | 1885539.9 | 100.0 | 700 | 103 |
Unknown | 1 | 256.0 | 1000.0 | 3072.0 | 41472 | 866328 | 326105.9 | 17.3 | 369 | 75 |
Nektar++ | 1 | 8192.0 | 8192.0 | 23040.0 | 65536 | 263 | 318903.3 | 16.9 | 7 | 3 |
VASP | 1 | 384.0 | 768.0 | 2048.0 | 10240 | 46359 | 259874.7 | 13.8 | 110 | 13 |
OpenFOAM | 1 | 1008.0 | 4096.0 | 25600.0 | 32768 | 1992 | 141131.1 | 7.5 | 42 | 19 |
Met Office UM | 1 | 576.0 | 576.0 | 1296.0 | 6840 | 27750 | 109342.2 | 5.8 | 33 | 6 |
SENGA | 2500 | 8192.0 | 8192.0 | 33500.0 | 33500 | 50 | 89029.0 | 4.7 | 4 | 2 |
GROMACS | 1 | 256.0 | 512.0 | 768.0 | 6400 | 13852 | 81433.6 | 4.3 | 19 | 5 |
Python | 1 | 2304.0 | 2304.0 | 9216.0 | 32768 | 6363 | 71221.0 | 3.8 | 22 | 13 |
LAMMPS | 1 | 128.0 | 128.0 | 512.0 | 131072 | 15067 | 65629.4 | 3.5 | 28 | 12 |
No srun | 0 | 512.0 | 23040.0 | 62592.0 | 259200 | 79621 | 57187.1 | 3.0 | 459 | 82 |
GENE | 1 | 4096.0 | 8192.0 | 8192.0 | 12288 | 3244 | 44922.5 | 2.4 | 8 | 3 |
CASTEP | 10 | 512.0 | 576.0 | 2048.0 | 4096 | 10795 | 39706.3 | 2.1 | 25 | 6 |
CP2K | 1 | 256.0 | 576.0 | 1024.0 | 4096 | 13233 | 38488.1 | 2.0 | 32 | 9 |
FHI aims | 16 | 256.0 | 768.0 | 1280.0 | 8192 | 6778 | 35819.4 | 1.9 | 20 | 5 |
Nek5000 | 512 | 65536.0 | 65536.0 | 65536.0 | 65536 | 35 | 32218.8 | 1.7 | 4 | 2 |
Hydro3D | 48 | 36040.0 | 36040.0 | 36040.0 | 36040 | 70 | 24592.5 | 1.3 | 2 | 1 |
OpenSBLI | 1536 | 64000.0 | 64000.0 | 64000.0 | 131072 | 18 | 20679.2 | 1.1 | 6 | 3 |
NEMO | 1 | 1808.0 | 2924.0 | 2924.0 | 5504 | 1526 | 19690.8 | 1.0 | 14 | 3 |
MITgcm | 18 | 126.0 | 624.0 | 624.0 | 624 | 3572 | 17405.9 | 0.9 | 11 | 2 |
Code_Saturne | 4 | 768.0 | 2048.0 | 2048.0 | 4096 | 128 | 14529.6 | 0.8 | 4 | 2 |
ChemShell | 32 | 1024.0 | 1024.0 | 1024.0 | 2048 | 329 | 10966.7 | 0.6 | 11 | 4 |
a.out | 1 | 256.0 | 1152.0 | 1152.0 | 4096 | 350 | 10789.7 | 0.6 | 11 | 5 |
GS2 | 1280 | 2064.0 | 2560.0 | 2560.0 | 8256 | 4611 | 8944.8 | 0.5 | 5 | 2 |
Quantum Espresso | 16 | 512.0 | 1024.0 | 1536.0 | 1536 | 5328 | 7478.1 | 0.4 | 15 | 5 |
GAMESS | 128 | 128.0 | 128.0 | 128.0 | 128 | 117 | 6137.4 | 0.3 | 2 | 1 |
PeleLMeX | 128 | 1024.0 | 1152.0 | 2304.0 | 4608 | 111 | 5189.8 | 0.3 | 4 | 1 |
TPLS | 96 | 4096.0 | 4096.0 | 4096.0 | 4096 | 100 | 3727.8 | 0.2 | 2 | 1 |
RMT | 352 | 1664.0 | 1664.0 | 1664.0 | 2304 | 41 | 3548.9 | 0.2 | 3 | 1 |
ONETEP | 1 | 128.0 | 128.0 | 128.0 | 128 | 999 | 3299.1 | 0.2 | 2 | 1 |
Xcompact3d | 1920 | 1920.0 | 1920.0 | 2048.0 | 131072 | 9 | 3175.9 | 0.2 | 3 | 3 |
NWChem | 16 | 128.0 | 128.0 | 128.0 | 512 | 296474 | 2420.9 | 0.1 | 5 | 4 |
PDNS3D | 1024 | 1024.0 | 1024.0 | 16384.0 | 16384 | 33 | 2093.8 | 0.1 | 2 | 1 |
EPOCH | 256 | 4096.0 | 4096.0 | 4096.0 | 4096 | 81 | 1954.3 | 0.1 | 4 | 1 |
HemeLB | 256 | 1024.0 | 1024.0 | 1024.0 | 1280 | 10 | 1648.1 | 0.1 | 2 | 1 |
CRYSTAL | 64 | 128.0 | 128.0 | 512.0 | 512 | 607 | 1285.1 | 0.1 | 3 | 1 |
VAMPIRE | 128 | 1024.0 | 1024.0 | 1024.0 | 1024 | 82 | 1121.0 | 0.1 | 3 | 1 |
CESM | 1 | 128.0 | 128.0 | 1152.0 | 1920 | 146 | 1007.7 | 0.1 | 5 | 1 |
SBLI | 1024 | 1024.0 | 1024.0 | 1024.0 | 1024 | 127 | 780.8 | 0.0 | 1 | 1 |
NAMD | 32 | 256.0 | 256.0 | 512.0 | 512 | 37 | 631.9 | 0.0 | 3 | 3 |
ptau3d | 8 | 160.0 | 160.0 | 160.0 | 160 | 42 | 456.4 | 0.0 | 2 | 1 |
iIMB | 128 | 128.0 | 128.0 | 128.0 | 128 | 4 | 337.0 | 0.0 | 1 | 1 |
ABINIT | 64 | 128.0 | 128.0 | 128.0 | 256 | 40 | 226.9 | 0.0 | 1 | 1 |
DL_MESO | 128 | 256.0 | 256.0 | 256.0 | 256 | 14 | 206.3 | 0.0 | 1 | 1 |
SIESTA | 128 | 128.0 | 128.0 | 128.0 | 128 | 163 | 97.1 | 0.0 | 1 | 1 |
Smilei | 8 | 256.0 | 256.0 | 256.0 | 256 | 20 | 49.7 | 0.0 | 3 | 1 |
DL_POLY | 128 | 256.0 | 256.0 | 256.0 | 256 | 11 | 49.2 | 0.0 | 1 | 1 |
Arm Forge | 1 | 128.0 | 128.0 | 128.0 | 128 | 64 | 2.5 | 0.0 | 4 | 4 |
WRF | 2 | 2.0 | 2.0 | 2.0 | 36 | 17 | 1.6 | 0.0 | 1 | 1 |
ludwig | 2 | 8.0 | 16.0 | 16.0 | 16 | 168 | 1.2 | 0.0 | 2 | 2 |
FVCOM | 256 | 256.0 | 256.0 | 256.0 | 256 | 5 | 0.0 | 0.0 | 1 | 1 |
HYDRA | 1 | 1.0 | 1.0 | 1.0 | 1 | 4 | 0.0 | 0.0 | 2 | 2 |
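As a rough illustration of the "weighted by usage" statistics in the table above, the sketch below computes usage-weighted quantiles of job step size with NumPy. The weighting by node hours and the example data are assumptions for illustration, not a description of the actual ARCHER2 reporting pipeline.

```python
import numpy as np

def weighted_quantile(values, weights, q):
    """Quantile of `values` where each value is weighted by `weights`
    (e.g. job step sizes in cores weighted by node hours consumed)."""
    values = np.asarray(values, dtype=float)
    weights = np.asarray(weights, dtype=float)
    order = np.argsort(values)
    values, weights = values[order], weights[order]
    cum = np.cumsum(weights)
    cutoff = q * cum[-1]
    # First value whose cumulative weight reaches the requested quantile.
    return values[np.searchsorted(cum, cutoff)]

# Hypothetical job steps: (size in cores, node hours consumed).
sizes = [128, 512, 2048, 8192]
node_hours = [10.0, 200.0, 500.0, 300.0]
for q in (0.25, 0.5, 0.75):
    print(q, weighted_quantile(sizes, node_hours, q))
```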