
ARCHER2 Calendar March 2024

Dance with Fire
Dr Jian Fang, Scientific Computing Department, STFC Daresbury Laboratory
This video illustrates the interaction between turbulence and a hydrogen flame. Turbulence fluctuations disrupt the combustion by stretching and bending the flame surface and altering the chemical reaction inside the reaction zone. Meanwhile, the high temperature...

ARCHER2 Calendar February 2024

Unsteady Flow Over Random Urban-like Obstacles: A Large Eddy Simulation using uDALES 2.0 Codebase
Dr Dipanjan Majumdar, Imperial College London, Civil and Environmental Engineering Department
uDALES is an open-source large-eddy simulation framework, specifically designed to tackle unsteady flows within the built environment. It is capable of simulating airflow, sensible...

ARCHER2 Calendar January 2024

Wishing everyone in the ARCHER2 community a wonderful 2024
Modelling proton tunnelling in DNA replication
Max Winokan, University of Surrey, Quantum Biology DTC
* * * Winning Image and overall competition winning entry, ARCHER2 Image and Video Competition 2023 * * *
Proton transfer between the DNA bases can lead...

ARCHER2 Festive Period

The ARCHER2 Service will observe the UK Public holidays and will be closed for Christmas Day, Boxing Day and New Year’s Day. We fully expect the ARCHER2 service to be available during these holidays but we will not have staff on the Service Desk to deal with any queries or...

ARCHER2 Calendar December 2023

Axial velocity contours showing wakes from stator and rotor blades of a DLR research compressor
Dr Arun Prabhakar, University of Warwick, Department of Computer Science
The visualization is produced from a simulation of a research compressor, RIG250 from the German Aerospace Centre (DLR), using a mesh consisting of 4.58 billion elements....

HPC Birds of a Feather at RSECon23

After the successful HPC and RSE workshop “Excalibur RSEs meet HPC Champions” which ran as a satellite event on the Friday after RSECon22 in Newcastle, the organisers decided to run a similar event this year. Read the blog post from Marion, Ed and Andy

Ten Tips for using HPC

Working with HPC can be exciting, innovative and rewarding. But it can also be complex, frustrating and challenging. Andy Turner is EPCC’s CSE Architect; he works with RSEs, UK HPC-SIG, DiRAC, The Carpentries and various user communities to share and improve HPC expertise. We asked Andy for his Top 10...

ARCHER2 Calendar November 2023

Magnetic field filaments generated by the current filamentation instability
Dr Elisabetta Boella, Lancaster University, Physics Department
One of the long-standing questions in plasma physics, the science that studies ionized gases called plasmas, concerns the origin and evolution of magnetic fields in plasmas where such fields are initially absent. The question...

ARCHER2 Calendar October 2023

Scattering of waves by turbulent vortices in geophysical fluids
Dr Hossein Kafiabad, School of Mathematics, University of Edinburgh
When waves travel in the atmosphere and ocean interior, they are scattered by vortices, leading to changes in their direction and wavelength. These changes are not random and can be...

BCDR Test 2023

What happens once every two years, though no one quite knows when (despite a lot of speculation), and no one knows what form it will take? The EPCC ARCHER2 business continuity and disaster recovery (BCDR) test. This tests how well the ARCHER2 team recognises a major scenario, responds to...

The un-secret diary of a first-time RSECon attendee (aged 59¼)

RSECon23 was held this year in sunny Swansea, 5th-7th September. Whilst my work at EPCC is not actually that of an RSE, it is certainly RSE-adjacent; I support the work of RSEs both here in EPCC and also around the country (sometimes even around the world). My roles include ARCHER2, Cirrus...

ARCHER2 Calendar September 2023

One year in the southwestern Indian Ocean
Noam Vogt-Vincent, Department of Earth Sciences, University of Oxford
* * * Winning Video, ARCHER2 Image and Video Competition 2022 * * *
This video shows one year of sea-surface temperatures from WINDS-C, a 1/50° (c. 2km) resolution ocean model that covers almost...

Nuffield Research Placements at EPCC

The Nuffield Research Placements (NRP) programme provides hands-on research projects for 16- and 17-year-old school students. EPCC is currently supervising three student visitors. Read the full post on the EPCC blog page

ARCHER2 Calendar August 2023

Maternal blood flow through the intervillous space of human placenta
Dr Qi Zhou, The University of Edinburgh, School of Engineering, Institute for Multiscale Thermofluids
A snapshot of the maternal blood flow simulated using the immersed-boundary-lattice-Boltzmann method as a suspension of deformable red blood cells through the extravascular intervillous space of...

HPC Summer School 2023

EPCC’s Ben Morse introduces the new HPC Summer School for undergraduate students. Read the full post on the EPCC blog page

The Big Bang Fair 2023: the bigger bang strikes back

EPCC’s Ben Morse reviews EPCC’s attendance at this year’s Big Bang Fair, one of the UK’s largest science and technology outreach events. Read the full post on the EPCC blog page

ARCHER2 Calendar July 2023

Cloud Development - Cross-section View
Domantas Dilys, University of Leeds, School of Earth and Environment
In the animation, an idealised cloud is shown, which is produced by a rising warm and moist air mass. There are seven timesteps, showing cloud development over time. The simulation was produced using a revolutionary parcel-based...

Service Desk Operators' blog

The Service Desk is often the first point of contact between the Service team and the outside world. We asked a couple of recent recruits to the Service Desk team to share their experience and insights into this very important role. There are two shifts per day, 8am-1pm and 1pm-6pm,...

Simulating Quantum Circuits on GPUs

The field of quantum computing is developing at a rapid pace, with exciting developments in every sub-field including hardware development and algorithm design. Read the full post on the Cirrus blog page

ARCHER2 Calendar June 2023

Expiratory particle dispersion by turbulent exhalation jet during speaking
Aleksandra Monka, University of Birmingham, Department of Civil Engineering
* * * Winning Image and overall competition winning entry, ARCHER2 Image and Video Competition 2022 * * *
The image reveals the extent of expiratory particle dispersion by the turbulent exhalation...

ARCHER2 Calendar May 2023

Simulation of 3D calving dynamics at Jakobshavn Isbrae
Iain Wheel, University of St Andrews, School of Geography and Sustainable Development
Calving is the breaking off of icebergs at the front of tidewater glaciers (those that flow into the sea). It accounts for around half of the ice mass loss from...

ARCHER2 Calendar April 2023

Proton tunnelling during DNA strand separation
Max Winokan, University of Surrey, Quantum Biology DTC
Proton transfer between the DNA bases can lead to mutagenic Guanine-Cytosine tautomers. Over the past several decades, a heated debate has emerged over the biological impact of tautomeric forms. In our work, we determine that the...

ARCHER2 Calendar March 2023

Flow within and around a large wind farm
Dr Nikolaos Bempedelis, Imperial College London, Department of Aeronautics
* * * Winning Early Career entry, ARCHER2 Image and Video Competition 2022 * * *
Modern large-scale wind farms consist of multiple turbines clustered together in wind-rich sites. Turbine clustering suffers some...

Rust in HPC

This technical report by Laura Moran and Mark Bull from EPCC investigates how appropriate the Rust programming language is for HPC systems by using a simple computational fluid dynamics (CFD) code to compare it with other languages. Rust is a relatively new programming language and is growing rapidly in popularity. Scientific programming...

Hosting and operating the ARCHER2 service

Many of the articles about ARCHER2 contain a statement such as “ARCHER2 is hosted and operated by EPCC at the University of Edinburgh”. But what does this actually mean? This recent article takes a look at what is involved.

ARCHER2 Calendar February 2023

The self-amplification and the structured chaotic sea
Juan Carlos Bilbao-Ludena, Imperial College London
The picture shows the instantaneous generation of strain (light blue) by the mechanism of self-amplification of strain (purple contour) in a turbulent flow generated by a wing, a nonlinear process distinctive to turbulence. The contour slices show...

eCSE Early Career Observer experience

The ARCHER2 eCSE programme provides funding to carry out development of software used by the ARCHER2 user community. As part of our commitment to encouraging Early Career Researchers and to developing their skills and experience, we periodically offer the opportunity for a small number of such researchers to attend the...

ARCHER2 CSE team at SC22

In November 2022, William Lucas and I attended SC22 in Dallas to present an invited talk on behalf of the ARCHER2 CSE team at the ‘RSEs in HPC’ technical program workshop. SC, or the International Conference for High Performance Computing, Networking, Storage and Analysis, is an annual international conference which...

Ensuring continuity of service at the ACF

EPCC’s Advanced Computing Facility (ACF) delivers a world-class environment to support the many computing and data services which we provide, including ARCHER2. This recent article takes a behind-the-scenes look at some of the activities the ACF team undertakes to provide the stable services our users expect.

ARCHER2 Calendar January 2023

Acetylene Molecule subject to intense ultra-fast laser pulse, time evolution of electron probability, sound
Dale Hughes, Queen's University Belfast, Physics
The movie is a straightforward plot of electron density, derived from substantial Time-Dependent Density Functional Theory calculations carried out in recent months on ARCHER2. The sound is derived from the...

ARCHER2 Festive Period

The ARCHER2 Service will observe the UK Public holidays and will be closed for Christmas Day, Boxing Day and New Year’s Day. Because Christmas Day and New Year’s Day fall on Sundays, we will take the official Substitute Day holidays on the 27th and 2nd. We fully expect the ARCHER2...

Impact of CPU frequency on application performance on ARCHER2

On 12 December 2022 the default CPU frequency on ARCHER2 compute nodes was set to 2.0 GHz. Previously, the CPU frequency was unset, which meant that the base frequency of the CPU, 2.25 GHz, was almost always used by jobs running on the compute nodes. In this post,...
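
For context, the sketch below shows one way a job script could request a different CPU frequency through Slurm's standard --cpu-freq mechanism. It is only an illustration: the partition, QoS, account and executable names are placeholders, and the full post and the ARCHER2 documentation describe the actual recommended settings.

    #!/bin/bash
    # Illustrative Slurm job script only; partition/QoS/account values are placeholders.
    #SBATCH --job-name=cpufreq-test
    #SBATCH --nodes=1
    #SBATCH --time=00:20:00
    #SBATCH --partition=standard
    #SBATCH --qos=standard
    #SBATCH --account=t01

    # Ask Slurm to run this job's tasks at the CPU base frequency (2.25 GHz)
    # rather than the 2.0 GHz default described above; values are given in kHz.
    export SLURM_CPU_FREQ_REQ=2250000

    srun --ntasks-per-node=128 --hint=nomultithread ./my_application  # hypothetical executable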

Computing Insight UK 2022

Computing Insight UK 2022 was the first conference I have attended since I began my journey in HPC. It took place in Manchester, with the main theme being “Sustainable HPC”. During my two days there I was able to speak to some great professionals and like-minded people. I have learned...

ISO Success

We are pleased to announce that we have recently passed a flurry of ISO certifications. We had a combined external audit for ISO 9001 Quality and 27001 Information Security from our certification body DNV in September. This looked at how we run the ARCHER2 and Cirrus services on the ISO...

Upcoming training courses in 2022: step up your expertise on ARCHER2!

Training is one of the elements that the ARCHER2 service provides to users. Our training programme comprises face-to-face and online courses that range from introductory to advanced levels. We have scheduled several courses for the remainder of 2022. The list is as follows: ARCHER2 for Package...

ARCHER2: Meet the team

We are very pleased to launch the new ARCHER2 support staff page where you can find pictures and a short bio of all the staff helping to keep the ARCHER2 service running. Whether you are trying to find who has been helping you on the Service Desk, or trying to remember...

Software power draw on ARCHER2 - the benefits of being an RSE

One of the big benefits of being a research software engineer (RSE) providing general support for a service, compared to being a researcher or RSE focussed on a specific project, is the ability to take a step back from the specific aspects of particular software and look at the more global view...

ARCHER2 eCSE Projects Final Reports

We have now started to publish the Final Reports from completed Embedded CSE (eCSE) projects. You can read the reports on the eCSE Final Project page.

ARCHER2 Calendar October 2022

Particles surf on plasma waves excited by high-power lasers
Nitin Shukla, Instituto Superior Técnico
* * * Winning Image and overall winner, ARCHER2 Image and Video Competition 2020 * * *
Particle accelerators are fundamental tools for science and society, enabling scientific discoveries and providing unprecedented insights into fundamental properties...

ARCHER2 Image Competition 2021 overall winner

The ARCHER2 Image competition is currently open for 2022. Last years overall winning entry was the Early Career video entry: Sea surface temperature (colour) and ice concentration (grey shades) in the Greenland-Scotland Ridge region. Dr Mattia Almansi, Marine Systems Modelling - National Oceoanography Centre The video shows the evolution of...

ARCHER2 Image Competition 2021 winning video

The ARCHER2 Image competition is currently open for 2022. Last year’s winning video was: Shock wave interaction with cavitation bubble. Dr Panagiotis Tsoutsanis, Centre for Computational Engineering Sciences - Cranfield University. Interaction of a shock wave moving at Mach 2.4 with a gas-filled water bubble. These types of cavitation bubbles can be...

ARCHER2 Image Competition 2021 winning image

The ARCHER2 Image competition is currently open for 2022. Last year’s winning image was: Acoustic field (yellow and blue) with overlapped velocity isosurface (red), obtained with a coupled LES-high-order acoustic solver of an installed jet case. Dr Miguel Moratilla-Vega, Loughborough University/Aeronautical and Automotive Engineering Department. The image reveals...

How to prepare a successful Technical Assessment

There are a number of ways to get access to ARCHER2 (see https://www.archer2.ac.uk/support-access/access.html), and all of the ways that provide a significant amount of compute time require the completion of a Technical Assessment (TA). TAs are meant to ensure that it is technically feasible to run the applications users have...

ARCHER2 Training course report: GROMACS for ARCHER2 users

EPCC provides a variety of training courses as part of the ARCHER2 national supercomputing service. These include introductory, advanced, and domain-specific options. Stephen Farr reports on a recent GROMACS course delivered as part of the ARCHER2 Training. A course which we have recently delivered is “Introduction to GROMACS”. GROMACS is...

Getting started using ARCHER2 and HPC

The Service Desk team have helped more than a few people get from ‘zero’ to ‘HPC whizz’, so here we summarise all the useful information we have compiled over the years. This isn’t meant as a how-to guide, but a signpost to all the resources and materials which we think...

Business continuity management for ARCHER2

If you run a data centre building full of expensive kit used to run a variety of services including ARCHER2, how do you minimise the chances of something negatively impacting the services you run, and how do you decide what to work on first if something does happen that impacts...

Commercial access to ARCHER2 and Cirrus

EPCC provides world-class supercomputing and data facilities and services for science and business. We are a leading centre in the field, renowned globally for research, innovation and teaching excellence. EPCC has three key foundations: the hosting, provision and management of high performance computing (HPC) and data facilities for academia and...

Investigating the performance of NAMD and NEMO on ARCHER2

In this blog, I will show results for the optimisation and tuning of NAMD and NEMO based on the ARCHER2 architecture. We looked at the placement of parallel processes and varying the runtime configurations to generalise optimisations for different architectures to guide users in their own performance tests. We found...

Computing Insight UK: Lots of computing, insights, and nice to be back meeting in person!

In December EPCC attended and contributed to Computing Insight UK (CIUK), which was held in Manchester over two days. With participants from across UK academia and industry, this annual conference focusses on the UK’s contribution to HPC and is a great opportunity to hear about new trends,...

Software usage data on ARCHER2

In this blog, I will describe how we collect software usage data from Slurm on ARCHER2, introduce the sharing of this data on the ARCHER2 public website and then take a look at differences (or not!) in software usage on ARCHER2 between two months: December 2021: the initial access period...
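
As a rough sketch of the kind of Slurm query involved (not the service's actual collection pipeline, and with an example date range), job records can be pulled from the accounting database with sacct and tallied by job name:

    # Count the most common job names for all users over an example month.
    # This is only an illustration of the idea, not the production tooling.
    sacct --allusers --starttime=2021-12-01 --endtime=2021-12-31 \
          --state=COMPLETED --noheader --parsable2 \
          --format=JobID,JobName,NNodes,Elapsed \
      | awk -F'|' '{print $2}' | sort | uniq -c | sort -rn | head -20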

ARCHER2 Driving Test launched

We are delighted to announce that the ARCHER2 Driving Test is now available. The ARCHER2 Driving Test is an online assessment tool which allows those new to ARCHER2 to demonstrate that they are sufficiently familiar with ARCHER2 and HPC to start making use of it. It is suitable for anyone...

Disaster recovery testing to improve resilience

To keep the ARCHER2 National HPC Service running around the clock requires specialised staff, covering everything from the Service Desk and science support, through to hardware maintenance and data-centre hosting, alongside third-party suppliers for power, networking, accommodation, and so on. Coordinating these elements is a complex task, even in normal...

Introducing the ARCHER2 full system

The ARCHER2 full system was opened to users on the morning of Monday 22 November 2021. In this blog post I introduce the full system and its capabilities, and look forward to additional developments on the service in the future. TL;DR For those who just want to get stuck in...

ISO External Audit Success

EPCC are delighted to announce that we have passed our ISO 9001 Quality and 27001 Information Security external audits with flying colours. We put the highest importance on service delivery and secure handling of customer data throughout the year. Despite this, it is still a nerve-racking...

DEA Audits – a world of pain

In spring 2020 EPCC became one of the very few organisations in the UK to be accredited as a data processor under the Digital Economy Act (DEA): one of exactly 8 accredited data processors in the UK! This was for the hosting and technical management of the National Safe Haven. EPCC is...

Using Containers to Install GUI-based Tools on ARCHER2

We have all become accustomed to having a wide range of pre- and post-processing tools available to us on our laptops, which can make working on the login nodes of a large HPC system such as ARCHER2 rather inconvenient if your favourite tools aren’t available. On something fairly standard like...

A Container Factory for HPC

This blog post follows on from “HPC Containers?”, which showed how to run a containerized GROMACS application across multiple compute nodes. The purpose of this post is to explain how that container was made. We turn now to the container factory, the environment within which containers are first created and...
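
As a hedged outline of the general pattern (not the exact recipe from the post, and with placeholder image names, username and paths), an image built with Docker on a machine where you have root access can be converted to a Singularity image file and copied across to the HPC system:

    # On the build machine ("the container factory"), where Docker is available:
    docker build -t my-gromacs-image:latest .        # Dockerfile not shown; placeholder name

    # Convert the local Docker image into a Singularity image file (SIF).
    singularity build gromacs.sif docker-daemon://my-gromacs-image:latest

    # Copy the image to ARCHER2 (placeholder username and work directory).
    scp gromacs.sif username@login.archer2.ac.uk:/work/t01/t01/username/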

HPC Containers?

Containers are a convenient means of encapsulating complex software environments, but can this convenience be realised for parallel research codes? Running such codes costs money, which means that code performance is often tuned to specific supercomputer platforms. Therefore, for containers to be useful in the world of HPC, it must...
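
To give a flavour of what running a containerized MPI application across nodes can look like, here is a minimal sketch: it assumes a Singularity image called gromacs.sif, uses placeholder partition/QoS/account values, and glosses over the MPI and interconnect details the post discusses.

    #!/bin/bash
    #SBATCH --nodes=2
    #SBATCH --ntasks-per-node=128
    #SBATCH --time=00:20:00
    #SBATCH --partition=standard
    #SBATCH --qos=standard
    #SBATCH --account=t01

    # Launch one containerized rank per task with the host's srun, so the
    # processes inside the container are wired together by the host MPI layer.
    srun --hint=nomultithread singularity exec gromacs.sif \
         gmx_mpi mdrun -s benchmark.tpr   # hypothetical input file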

ARCHER2 final cabinets arrive

This week the final cabinets to complete the full ARCHER2 system have arrived onsite in Edinburgh on multiple trucks. The boxes were so large that doors had to be removed. All cabinets are now safely in place, with water cooling enabled. Further work to integrate them into the system is...

ARCHER2 Job Priority

An HPC service such as ARCHER2 manages thousands of user-submitted jobs per day. A scheduler is used to accept, prioritise and run this work. In order to control how jobs are scheduled, all schedulers have features for defining the manner in which work is prioritised. On the ARCHER and ARCHER2...

CSE Team gain access to the main ARCHER2 system

Today saw a significant milestone in our move towards our full 23-cabinet ARCHER2 system. A small group of CSE staff gained access to the main HPE Cray EX supercomputing system and are currently putting it through its paces. Andy, William, David, Adrian, Julien and Kevin will be testing...

Training on ARCHER2, what's next?

The first year of the ARCHER2 service has been very challenging, mainly due to the COVID-19 pandemic. Nonetheless, we have successfully delivered a fully online training programme. Since April 2020, a total of 66 days of training were delivered under the ARCHER2 service. We have used Blackboard Collaborate software for...

Meet the ARCHER2 team – UKRI

ARCHER2 is funded by UKRI through two partner councils, EPSRC and NERC, who started building the case for the system back in 2016. Rebecca How from EPSRC’s Research Infrastructure team joined the ARCHER2 project in January 2019, taking over the Project Manager role for the upcoming service, while also acting...

ARCHER2 MPI with OpenMP mini-app benchmarks

Quite a few application codes running on ARCHER2 are implemented using both MPI and OpenMP. This introduces an extra parameter that determines performance on a given number of nodes - the number of OpenMP threads per MPI process. The optimum value depends on the application, but is also influenced by...
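
To make the extra parameter concrete, a hybrid job script typically chooses the number of MPI processes per node and OpenMP threads per process so that their product matches the 128 cores of an ARCHER2 node. The sketch below (placeholder partition/QoS/account and executable names) runs 16 processes of 8 threads each per node.

    #!/bin/bash
    #SBATCH --nodes=4
    #SBATCH --ntasks-per-node=16   # MPI processes per node
    #SBATCH --cpus-per-task=8      # OpenMP threads per MPI process (16 x 8 = 128 cores)
    #SBATCH --time=00:30:00
    #SBATCH --partition=standard
    #SBATCH --qos=standard
    #SBATCH --account=t01

    export OMP_NUM_THREADS=${SLURM_CPUS_PER_TASK}
    export OMP_PLACES=cores

    srun --hint=nomultithread --distribution=block:block ./hybrid_miniapp  # hypothetical executable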

Software use on ARCHER2 - an initial look

Back in February, I reviewed the usage of different research software on ARCHER over a large period of its life. Now we have just come to the end of the first month of charged use on the ARCHER2 4-cabinet system I thought it would be interesting to have an initial...

Urgent atmospheric modelling of an active volcano using ARCHER2

On the evening of 19 March 2021 a volcanic eruption started in Fagradalsfjall, Iceland. Although the episode posed no threat to aviation (no ash was produced), significant amounts of volcanic gases were (and still are being) released. Such gases can cause respiratory problems, and if the concentrations are high enough,...

Attending the “Efficient use of the HPE Cray EX Supercomputer ARCHER2” course

With the end of the ARCHER service in January 2021, the ARCHER2 4-cabinet pilot system has now been operating as the national service for three months. A new architecture, programming environment, tools and scheduling system present a new challenge for users to experiment with and discover techniques to achieve optimal...

Meet the ARCHER2 team - HPC Systems

The HPC Systems Team provides the System Development and System Operations functions for ARCHER2 - but who are we and what do we do? We are a team of 15 System Administrators and Developers who work to deploy, manage and maintain the services and systems offered by EPCC, as well...

Hello ARCHER2 - Install video

We recently received the main ARCHER2 hardware at the ACF and our team recorded the process of installation as this exciting new system was deployed. You’ll see large “Mountain” cabinets being deployed, each of which holds 256 compute nodes with 128 CPU cores each, as well as a number of...

Future investment in Science

It seems strange to be thinking about how to justify investment in ARCHER3 and beyond, with ARCHER2 not fully in service yet, but it is never too early to start planning this. ARCHER generated a significant amount of world-leading science and we fully anticipate ARCHER2 will as well. It is...

Hello ARCHER2

It’s been a busy week at the Advanced Computing Facility (ACF) with the arrival of the remaining ARCHER2 cabinets. The long journey started from the HPE Cray factory in Chippewa Falls, Wisconsin (birthplace of Seymour Cray) before arriving at Prestwick airport. It took three separate journeys of four lorries to...

ARCHER Quiz

Having said Farewell to ARCHER, we invite you to try our short quiz, to see how much you can remember about the service.

eCSE Panel Meeting - Behind the scenes

The ARCHER2 eCSE programme provides funding to carry out development of software used by the ARCHER2 user community. As part of our commitment to encouraging and developing Early Career Researchers, we offer a small number of early career researchers the opportunity to attend the eCSE Panel Meeting as observers. The...

From the Service Desk

Last week marked the end of the ARCHER Service after seven years. You may have heard some statistics about ARCHER over the last week, but I wanted to tell you some from the helpdesk. During the lifetime of ARCHER, the User Support and Systems teams (SP) have resolved 57,489 contractual...

Research Software Use on ARCHER

Now that the ARCHER service has finished, I thought it would be interesting to take a brief look at the use of different research software on the service over its lifetime. ARCHER was in service for over 7 years: from late 2013 until its shutdown in early 2021. Over the...

ARCHER Driving Test retrospective

Back in February 2015, a little over a year after the ARCHER service began, EPCC launched an entirely new and innovative access route for users to get time on ARCHER via a “Driving Test”. The idea behind this was to help those who had never used ARCHER, or possibly any...

ARCHER's final moments

As the ARCHER service was switched off, our colleagues at the ACF photographed and recorded ARCHER’s final moments. With a particular thank you to Connor, Aaron, Jordan, Craig and Paul at the ACF and to Spyro for putting this video together, we give you ARCHER’s final moment. It is best...

Farewell to ARCHER

At 8am this morning the ARCHER service was switched off. This sees the end of a remarkable service, funded by EPSRC and NERC, that has acted as the UK’s National HPC Service for the last 7 years. Entering service at the end of 2013, just over 5.6 million...

ARCHER2 Early Access - communication and community

We are currently in (or have recently finished, depending on when you read this) the Early Access period for the ARCHER2 service. During this period, users nominated by their peers and the ARCHER2 teams at EPCC and HPE Cray have had access to the ARCHER2 4-cabinet system. The Early User...

External audits and cake in a time of plague

With ARCHER2 part arrived and in early access, and with all the preparation to move from ARCHER to ARCHER2, with staff working remotely from home and all the other work ongoing, why on earth were we preparing for our annual ISO external audit and how could we make it work?...

Testing GROMACS on ARCHER2 4-cab

One of the upsides of working on the ARCHER2 CSE team is that, sometimes, one finds oneself in the interesting position of being the only user on the ARCHER2 four-cabinet system (at least the only apparent user – I’m sure that AdrianJ is lurking on one of the compute nodes...

ARCHER2 – Build Timelapse

On 14 July 2020 the first four cabinets of ARCHER2 arrived at the ACF, and over the following few days they were unpacked, moved into position, connected up and ultimately powered up, ready to go. Gregor Muir was on hand, capturing the whole process in still images and timelapse...

ARCHER2 Build Intern log

A full week on from the 4-cabinet ARCHER2 installation (affectionately dubbed Mini-ARCHER2, due to only having 18% of the full Shasta system’s 5848 compute nodes), life is beginning to quiet down again for the temporary Summer Team at the ACF. This is now the second year that the...

ARCHER2 Build Final Day

Final build day! The final day of the build of our 4-cabinet system. Yesterday saw the system booted and lots of testing carried out. The first job of the day was to locate some erroneous cables that were causing power not to be read properly. Troubleshooting then identified the need...

Day 4 of the ARCHER2 Build

Day 4 and the vast majority of the work has been done, with the power commissioned and the management and storage systems configured and tested. As of last night, remote access was also enabled. As we near completion a number of our American colleagues have now left to return to...

Day 3 of the ARCHER2 Build

Day 3 and significant progress has been made. Yesterday saw the fitting of the cooling infrastructure between the mountain cabinets and the CDU. The site’s water supply has now been connected, bled and made live. All the power connections have been made and the CDU has been powered up as...

ARCHER2 – Build Day 2

Yesterday saw the first phase of ARCHER2 arrive on site, with all four Shasta Mountain cabinets moved in to their correct position. The images below show pictures of the back and the front of these cabinets. The red and blue cables are colour coded water pipes for the cooling system,...

ARCHER2 – 4 cabinets arrive

The first phase of ARCHER2 is in Edinburgh! The 4 cabinet Shasta Mountain system, the first phase of the 23 cabinet system, has completed its journey from Chippewa Falls in Wisconsin, making its way from Prestwick airport to Edinburgh this morning. The arrival of these large crates has, I admit,...

ARCHER2 – 4 cabinets ship from the US

Covid-19 has created significant challenges for the delivery of the new ARCHER2 system. It is therefore really exciting to see the first 4 cabinets of ARCHER2 leave Cray/HPE’s factory in Chippewa Falls, Wisconsin to begin their journey to Edinburgh. ARCHER2 will replace the current ARCHER system, a Cray XC30, as...

UKRI ARCHER2 update

UKRI are having weekly meetings with the ARCHER2 providers, EPCC and Cray/HPE. We are making good progress and are still on track to start installation of some compute capacity (a pilot system) in mid-July. We are also looking at further ARCHER extensions due to the COVID-19 delay. We now expect...

UKRI Announce ARCHER2 services contracts awarded

UK Research and Innovation (UKRI) has awarded contracts to run elements of the next national supercomputer, ARCHER2, which will represent a significant step forward in capability for the UK’s science community. EPCC has been awarded contracts to run the Service Provision (SP) and Computational Science and Engineering (CSE) services for...

Welcome to the ARCHER2 website!

We are pleased to welcome you to the new ARCHER2 website. Here you will find all the information about the service including updates on progress with the installation and setup of the new machine. Some sections of the site are still under development but we are actively working to ensure...

UKRI ARCHER2 Hardware Announcement

Details of the ARCHER2 hardware which will be provided by Cray (an HPE company). Following a procurement exercise, UK Research and Innovation (UKRI) are pleased to announce that Cray have been awarded the contract to supply the hardware for the next national supercomputer, ARCHER2. ARCHER2 should be capable on average...