01 May 2023
ARCHER2 Calendar
Simulation of 3D calving dynamics at Jakobshavn Isbrae Iain Wheel, University of St Andrews, School of Geography and Sustainable Development Calving is the breaking off of icebergs at the front of tidewater glaciers (those that flow into the sea). It accounts for around half of the ice mass loss from...
01 April 2023
ARCHER2 Calendar
Proton tunnelling during DNA strand separation Max Winokan, University of Surrey, Quantum Biology DTC Proton transfer between the DNA bases can lead to mutagenic Guanine-Cytosine tautomers. Over the past several decades, a heated debate has emerged over the biological impact of tautomeric forms. In our work, we determine that the...
01 March 2023
ARCHER2 Calendar
Flow within and around a large wind farm Dr Nikolaos Bempedelis, Imperial College London, Department of Aeronautics * * * Winning Early Career entry, ARCHER2 Image and Video Competition 2022 * * * Modern large-scale wind farms consist of multiple turbines clustered together in wind-rich sites. Turbine clustering suffers some...
13 February 2023
Laura Moran EPCC
This technical report by Laura Moran and Mark Bull from EPCC investigates how appropriate the Rust programming language is for HPC systems by using a simple computational fluid dynamics (CFD) code to compare between languages. Rust is a relatively new programming language and is growing rapidly in popularity. Scientific programming...
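Comparisons of this kind typically rest on a small stencil-style kernel. Purely as an illustrative sketch of the sort of CFD kernel such language comparisons use (in Python here, not the code from the report), a single Jacobi update over a 2D grid looks like:

```python
# Illustrative sketch only: a Jacobi-style smoothing step, the kind of
# simple CFD kernel often used to compare language performance.

def jacobi_step(psi):
    """One Jacobi iteration on a 2D grid (list of lists); boundaries fixed."""
    rows, cols = len(psi), len(psi[0])
    new = [row[:] for row in psi]  # copy so boundary values are preserved
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            # Each interior point becomes the average of its four neighbours.
            new[i][j] = 0.25 * (psi[i - 1][j] + psi[i + 1][j]
                                + psi[i][j - 1] + psi[i][j + 1])
    return new
```

The same loop nest, written in Rust or C with a flat array, is the usual basis for the kind of performance comparison the report describes.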
01 February 2023
Lorna Smith
Many of the articles about ARCHER2 contain a statement such as “ARCHER2 is hosted and operated by EPCC at the University of Edinburgh”. But what does this actually mean? This recent article takes a look at what is involved.
01 February 2023
ARCHER2 Calendar
The self-amplification and the structured chaotic sea Juan Carlos Bilbao-Ludena, Imperial College London The picture shows the instantaneous generation of strain (light blue) by the mechanism of self-amplification of strain (purple contour) in a turbulent flow generated by a wing, a nonlinear process distinctive to turbulence. The contour slices show...
18 January 2023
Thomas Hicken, and eCSE team
The ARCHER2 eCSE programme provides funding to carry out development of software used by the ARCHER2 user community. As part of our commitment to encouraging Early Career Researchers and to developing their skills and experience, we periodically offer the opportunity for a small number of such researchers to attend the...
16 January 2023
Eleanor Broadway
In November 2022, William Lucas and I attended SC22 in Dallas to present an invited talk on behalf of the ARCHER2 CSE team at the ‘RSEs in HPC’ technical program workshop. SC, or the International Conference for High Performance Computing, Networking, Storage and Analysis, is an annual international conference which...
10 January 2023
Calum Muir, ACF team
EPCC’s Advanced Computing Facility (ACF) delivers a world-class environment to support the many computing and data services which we provide, including ARCHER2. This recent article takes a behind-the-scenes look at some of the activities the ACF team undertakes to provide the stable services our users expect.
04 January 2023
ARCHER2 Calendar
Acetylene Molecule subject to intense ultra-fast laser pulse, time evolution of electron probability, sound Dale Hughes, Queen's University Belfast, Physics The movie is a straightforward plot of electron density, derived from substantial Time-Dependent Density Functional Theory calculations carried out in recent months on ARCHER2. The sound is derived from the...
13 December 2022
ARCHER2 Support team
The ARCHER2 Service will observe the UK public holidays and will be closed for Christmas Day, Boxing Day and New Year’s Day. Because Christmas Day and New Year’s Day fall on Sundays, we will take the official substitute day holidays on the 27th and 2nd. We fully expect the ARCHER2...
12 December 2022
Andy Turner (EPCC)
On 12 December 2022, the default CPU frequency on ARCHER2 compute nodes was set to 2.0 GHz. Previously, the CPU frequency was unset, which meant that the base frequency of the CPU, 2.25 GHz, was almost always used by jobs running on the compute nodes. In this post,...
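For jobs that benefit from the higher base frequency, Slurm lets users request one explicitly per job. A minimal job-script sketch, assuming Slurm's `--cpu-freq` option (which takes a value in kHz); `my_application` is a placeholder name:

```bash
#!/bin/bash
# Hypothetical fragment: request the CPU base frequency
# (2.25 GHz = 2250000 kHz), overriding the 2.0 GHz default.
#SBATCH --cpu-freq=2250000

srun --cpu-freq=2250000 ./my_application
```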
08 December 2022
Kris Tanev (EPCC)
Computing Insights UK 2022 was the first conference I have attended since beginning my journey in HPC. It took place in Manchester with the main theme being “Sustainable HPC”. During my two days there I was able to speak to some great professionals and like-minded people. I have learned...
15 November 2022
Anne Whiting EPCC
We are pleased to announce that we have recently passed a flurry of ISO certifications. We had a combined external audit for ISO 9001 Quality and 27001 Information Security from our certification body DNV in September. This looked at how we run the ARCHER2 and Cirrus services on the ISO...
08 November 2022
Juan Rodriguez Herrera
Training is one of the elements that the ARCHER2 service provides to users. Our training programme comprises face-to-face and online courses, ranging from introductory to advanced levels. We have scheduled several courses for the remainder of 2022. The list is as follows: ARCHER2 for Package...
25 October 2022
ARCHER2 Support team
We are very pleased to launch the new ARCHER2 support staff page where you can find pictures and a short bio of all the staff helping to keep the ARCHER2 service running. Whether you are trying to find who has been helping you on the Servicedesk, or trying to remember...
10 October 2022
Andy Turner (EPCC)
One of the big benefits of being a research software engineer (RSE) providing general support for a service, compared to being a researcher focussed on a specific project, is the ability to take a step back from the specific aspects of particular software and look at the more global view...
04 October 2022
ARCHER2 eCSE team
We have now started to publish the Final Reports from completed Embedded CSE (eCSE) projects. You can read the reports on the eCSE Final Project page.
03 October 2022
ARCHER2 Calendar
Particles surf on plasma waves excited by high-power lasers Nitin Shukla, Instituto Superior Técnico * * * Winning Image and overall winner, ARCHER2 Image and Video Competition 2020 * * * Particle accelerators are fundamental tools for science and society, enabling scientific discoveries and providing unprecedented insights into fundamental properties...
08 September 2022
Clair Barrass EPCC
The ARCHER2 Image competition is currently open for 2022. Last year’s overall winning entry was the Early Career video entry: Sea surface temperature (colour) and ice concentration (grey shades) in the Greenland-Scotland Ridge region. Dr Mattia Almansi, Marine Systems Modelling - National Oceanography Centre The video shows the evolution of...
30 August 2022
Clair Barrass EPCC
The ARCHER2 Image competition is currently open for 2022. Last year’s winning video was: Shock wave interaction with cavitation bubble. Dr Panagiotis Tsoutsanis, Centre for Computational Engineering Sciences - Cranfield University Interaction of a shock wave moving with Mach=2.4 and a gas-filled water bubble. These types of cavitation-bubbles can be...
22 August 2022
Clair Barrass EPCC
The ARCHER2 Image competition is currently open for 2022. Last year’s winning image was: Acoustic field (yellow and blue) with overlapped velocity isosurface (red) obtained with a coupled LES-High order Acoustic coupled solver of an installed jet case Dr Miguel Moratilla-Vega, Loughborough University/Aeronautical and Automotive Engineering Department The image reveals...
03 August 2022
Alexei Borissov EPCC
There are a number of ways to get access to ARCHER2 (see https://www.archer2.ac.uk/support-access/access.html), and all of the ways that provide a significant amount of compute time require the completion of a Technical Assessment (TA). TAs are meant to ensure that it is technically feasible to run the applications users have...
02 August 2022
Stephen Farr EPCC
EPCC provides a variety of training courses as part of the ARCHER2 national supercomputing service. These include introductory, advanced, and domain-specific options. Stephen Farr reports on “Introduction to GROMACS”, a course recently delivered as part of the ARCHER2 training programme. GROMACS is...
18 May 2022
EPCC
The Service Desk team have helped more than a few people get from ‘zero’ to ‘HPC whizz’, so here we summarise all the useful information we have compiled over the years. This isn’t meant as a how-to guide, but a signpost to all the resources and materials which we think...
02 May 2022
Anne Whiting EPCC
If you run a data centre building full of expensive kit used to run a variety of services including ARCHER2, how do you minimise the chances of something negatively impacting the services you run, and how do you decide what to work on first if something does happen that impacts...
10 March 2022
Thomas Blyth EPCC
EPCC provides world-class supercomputing and data facilities and services for science and business. We are a leading centre in the field, renowned globally for research, innovation and teaching excellence. EPCC has three key foundations: the hosting, provision and management of high performance computing (HPC) and data facilities for academia and...
03 March 2022
Eleanor Broadway EPCC
In this blog, I will show results for the optimisation and tuning of NAMD and NEMO on the ARCHER2 architecture. We looked at the placement of parallel processes and at varying the runtime configuration, aiming to generalise optimisations for different architectures and guide users in their own performance tests. We found...
14 February 2022
Nick Brown EPCC
In December EPCC attended and contributed to Computing Insight UK (CIUK), held in Manchester over two days. With players from across UK academia and industry, this annual conference focusses on the UK’s contribution to HPC and is a great opportunity to hear about new trends,...
07 February 2022
Andy Turner (EPCC)
In this blog, I will introduce how we collect software usage data from Slurm on ARCHER2, introduce the sharing of this data on the ARCHER2 public website and then have a look at differences (or not!) in software usage on ARCHER2 between two months: December 2021: the initial access period...
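The core of this kind of collection is a mapping from executable names (as recorded in the Slurm accounting database) to recognised applications. A minimal sketch of that idea, using hypothetical patterns rather than the actual ARCHER2 pattern set:

```python
import re

# Hypothetical mapping from executable-name patterns to application names;
# the real ARCHER2 analysis uses its own (much larger) pattern set.
APP_PATTERNS = {
    "VASP":    re.compile(r"vasp", re.IGNORECASE),
    "GROMACS": re.compile(r"mdrun|gmx", re.IGNORECASE),
    "LAMMPS":  re.compile(r"lmp", re.IGNORECASE),
}

def classify(executable):
    """Return the application name for an executable, or 'Unknown'."""
    for app, pattern in APP_PATTERNS.items():
        if pattern.search(executable):
            return app
    return "Unknown"
```

Aggregating node-hours per classified application over a month then gives the kind of usage comparison the post describes.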
24 January 2022
ARCHER2 Training Team
We are delighted to announce that the ARCHER2 Driving Test is now available. The ARCHER2 Driving Test is an online assessment tool which allows those new to ARCHER2 to demonstrate that they are sufficiently familiar with ARCHER2 and HPC to start making use of it. It is suitable for anyone...
14 January 2022
George Beckett EPCC
To keep the ARCHER2 National HPC Service running around the clock requires specialised staff, covering everything from the Service Desk and science support, through to hardware maintenance and data-centre hosting, alongside third-party suppliers for power, networking, accommodation, and so on. Coordinating these elements is a complex task, even in normal...
22 November 2021
Andy Turner (EPCC)
The ARCHER2 full system was opened to users on the morning of Monday 22 November 2021. In this blog post I introduce the full system and its capabilities, and look forward to additional developments on the service in the future. TL;DR For those who just want to get stuck in...
30 September 2021
Anne Whiting EPCC
EPCC are delighted to announce that we have passed our ISO 9001 Quality and 27001 Information Security external audits with flying colours. We put the highest importance on service delivery and secure handling of customer data throughout the year. Despite this, it is still a nerve-racking...
29 September 2021
Anne Whiting EPCC
In spring 2020 EPCC became one of the very few organisations in the UK to be accredited as a data processor under the Digital Economy Act (DEA), one of exactly eight in the UK! This was for the hosting and technical management of the National Safe Haven. EPCC is...
21 September 2021
David Henty EPCC
We have all become accustomed to having a wide range of pre- and post-processing tools available to us on our laptops, which can make working on the login nodes of a large HPC system such as ARCHER2 rather inconvenient if your favourite tools aren’t available. On something fairly standard like...
25 August 2021
Michael Bareford (EPCC)
This blog post follows on from “HPC Containers?”, which showed how to run a containerized GROMACS application across multiple compute nodes. The purpose of this post is to explain how that container was made. We turn now to the container factory, the environment within which containers are first created and...
06 August 2021
Michael Bareford (EPCC)
Containers are a convenient means of encapsulating complex software environments, but can this convenience be realised for parallel research codes? Running such codes costs money, which means that code performance is often tuned to specific supercomputer platforms. Therefore, for containers to be useful in the world of HPC, it must...
28 July 2021
Anne Whiting EPCC
This week the final cabinets to complete the full ARCHER2 system have arrived onsite in Edinburgh on multiple trucks. The boxes were so large that doors had to be removed. All cabinets are now safely in place, with water cooling enabled. Further work to integrate them into the system is...
15 July 2021
Kieran Leach (EPCC)
An HPC service such as ARCHER2 manages thousands of user-submitted jobs per day. A scheduler is used to accept, prioritise and run this work. To control how jobs are scheduled, all schedulers have features for defining how work is prioritised. On the ARCHER and ARCHER2...
24 June 2021
Lorna Smith
Today saw a significant milestone in our move towards our full 23 cabinet ARCHER2 system. A small group of CSE staff gained access to the main HPE Cray EX supercomputing system today and are currently putting it through its paces. Andy, William, David, Adrian, Julien and Kevin will be testing...
08 June 2021
Juan Herrera EPCC
The first year of the ARCHER2 service has been very challenging, mainly due to the COVID-19 pandemic. Nonetheless, we have successfully delivered a fully online training programme. Since April 2020, a total of 66 days of training were delivered under the ARCHER2 service. We have used Blackboard Collaborate software for...
01 June 2021
Rebecca How EPSRC
ARCHER2 is funded by UKRI through two partner councils, EPSRC and NERC, who started building the case for the system back in 2016. Rebecca How from EPSRC’s Research Infrastructure team joined the ARCHER2 project in January 2019, taking over the Project Manager role for the upcoming service, while also acting...
01 June 2021
Mark Bull (EPCC)
Quite a few application codes running on ARCHER2 are implemented using both MPI and OpenMP. This introduces an extra parameter that determines performance on a given number of nodes - the number of OpenMP threads per MPI process. The optimum value depends on the application, but is also influenced by...
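Since an ARCHER2 node has 128 cores, the candidate combinations of processes and threads that exactly fill a node are just the factor pairs of 128. A small illustrative helper (not from the post) that enumerates them:

```python
def hybrid_options(cores_per_node):
    """List (mpi_processes, omp_threads) pairs that exactly fill a node."""
    return [(p, cores_per_node // p)
            for p in range(1, cores_per_node + 1)
            if cores_per_node % p == 0]
```

Sweeping a benchmark over these pairs (e.g. 16 MPI processes with 8 OpenMP threads each, versus 128 processes with 1 thread) is the usual way to find the optimum for a given application.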
19 May 2021
Andrew Turner (EPCC)
Back in February, I reviewed the usage of different research software on ARCHER over a large part of its life. Now that we have come to the end of the first month of charged use on the ARCHER2 4-cabinet system, I thought it would be interesting to have an initial...
11 May 2021
Ralph Burton, Stephen Mobbs, Barbara Brooks, James Groves.
National Centre for Atmospheric Science (NCAS), UK
On the evening of 19 March 2021, a volcanic eruption started at Fagradalsfjall, Iceland. Although the episode posed no threat to aviation (no ash was produced), significant amounts of volcanic gases were (and still are being) released. Such gases can cause respiratory problems, and if the concentrations are high enough,...
06 May 2021
Eleanor Broadway (EPCC)
With the end of the ARCHER service in January 2021, the ARCHER2 4-cabinet pilot system has now been operating as the national service for three months. The new architecture, programming environment, tools and scheduling system present a fresh challenge: users must experiment and discover techniques to achieve optimal...
12 April 2021
Kieran Leach EPCC
The HPC Systems Team provides the System Development and System Operations functions for ARCHER2 - but who are we and what do we do? We are a team of 15 System Administrators and Developers who work to deploy, manage and maintain the services and systems offered by EPCC, as well...
11 March 2021
Kieran Leach (EPCC)
We recently received the main ARCHER2 hardware at the ACF and our team recorded the process of installation as this exciting new system was deployed. You’ll see large “Mountain” cabinets being deployed, each of which holds 256 compute nodes with 128 CPU cores each, as well as a number of...
04 March 2021
Anne Whiting (EPCC)
It seems strange to be thinking about how to justify investment in ARCHER3 and beyond, with ARCHER2 not fully in service yet, but it is never too early to start planning this. ARCHER generated a significant amount of world-leading science and we fully anticipate ARCHER2 will as well. It is...
19 February 2021
Josephine Beech-Brandt (EPCC)
It’s been a busy week at the Advanced Computing Facility (ACF) with the arrival of the remaining ARCHER2 cabinets. The long journey started from the HPE Cray factory in Chippewa Falls, Wisconsin (birthplace of Seymour Cray) before arriving at Prestwick Airport. It took three separate journeys of four lorries to...
11 February 2021
Clair Barrass (EPCC)
Having said Farewell to ARCHER, we invite you to try our short quiz, to see how much you can remember about the service.
09 February 2021
Catherine Inglis (EPCC)
The ARCHER2 eCSE programme provides funding to carry out development of software used by the ARCHER2 user community. As part of our commitment to encouraging and developing Early Career Researchers, we offer a small number of early career researchers the opportunity to attend the eCSE Panel Meeting as observers. The...
05 February 2021
Josephine Beech-Brandt (EPCC)
Last week marked the end of the ARCHER Service after seven years. You may have heard some statistics over the last week about ARCHER but I wanted to tell you some from the helpdesk. During the lifetime of ARCHER, the User Support and Systems teams (SP) have resolved 57,489 contractual...
04 February 2021
Andy Turner (EPCC)
Now that the ARCHER service has finished, I thought it would be interesting to take a brief look at the use of different research software on the service over its lifetime. ARCHER was in service for over 7 years: from late 2013 until its shutdown in early 2021. Over the...
03 February 2021
Clair Barrass (EPCC)
Back in February 2015, a little over a year after the ARCHER service began, EPCC launched an entirely new and innovative access route for users to get time on ARCHER via a “Driving Test”. The idea behind this was to help those who had never used ARCHER, or possibly any...
02 February 2021
Lorna Smith
As the ARCHER service was switched off, our colleagues at the ACF photographed and recorded ARCHER’s final moments. With a particular thank you to Connor, Aaron, Jordan, Craig and Paul at the ACF and to Spyro for putting this video together, we give you ARCHER’s final moment. It is best...
27 January 2021
Lorna Smith (EPCC)
At 8am this morning the ARCHER service was switched off. Funded by EPSRC and NERC, ARCHER was a remarkable service, one that acted as the UK’s national HPC service for the last 7 years. Entering service at the end of 2013, just over 5.6 million...
16 December 2020
Andy Turner
We are currently in (or have recently finished, depending on when you read this) the Early Access period for the ARCHER2 service. During this period, users nominated by their peers and the ARCHER2 teams at EPCC and HPE Cray have had access to the ARCHER2 4-cabinet system. The Early User...
19 November 2020
Anne Whiting EPCC
With part of ARCHER2 arrived and in early access, with all the preparation to move from ARCHER to ARCHER2, with staff working remotely from home and all the other work ongoing, why on earth were we preparing for our annual ISO external audit, and how could we make it work?...
27 October 2020
Julien Sindt EPCC
One of the upsides of working on the ARCHER2 CSE team is that, sometimes, one finds oneself in the interesting position of being the only user on the ARCHER2 four-cabinet system (at least the only apparent user – I’m sure that AdrianJ is lurking on one of the compute nodes...
29 July 2020
Clair Barrass
On 14 July 2020 the first four cabinets of ARCHER2 arrived at the ACF, and over the following few days they were unpacked, moved into position, connected up and ultimately powered up, ready to go. Gregor Muir was on hand, capturing the whole process in still images and timelapse...
27 July 2020
Gregor Muir
A full week on from the 4-cabinet ARCHER2 installation (affectionately dubbed Mini-ARCHER2, due to having only 18% of the full Shasta system’s 5848 compute nodes), life is beginning to quieten down again for the temporary Summer Team at the ACF. This is now the second year that the...
17 July 2020
Lorna Smith
Final build day! The final day of the build of our 4 cabinet system. Yesterday saw the system booted and lots of testing carried out. First job of the day was to locate some erroneous cables that were causing power to not be read properly. Troubleshooting then identified the need...
16 July 2020
Lorna Smith
Day 4 and the vast majority of the work has been done, with the power commissioned and the management and storage systems configured and tested. As of last night, remote access was also enabled. As we near completion, a number of our American colleagues have now left to return to...
15 July 2020
Lorna Smith
Day 3 and significant progress has been made. Yesterday saw the fitting of the cooling infrastructure between the mountain cabinets and the CDU. The site’s water supply has now been connected, bled and made live. All the power connections have been made and the CDU has been powered up as...
14 July 2020
Lorna Smith
Yesterday saw the first phase of ARCHER2 arrive on site, with all four Shasta Mountain cabinets moved in to their correct position. The images below show pictures of the back and the front of these cabinets. The red and blue cables are colour coded water pipes for the cooling system,...
13 July 2020
Lorna Smith
The first phase of ARCHER2 is in Edinburgh! The 4 cabinet Shasta Mountain system, the first phase of the 23 cabinet system, has completed its journey from Chippewa Falls in Wisconsin, making its way from Prestwick airport to Edinburgh this morning. The arrival of these large crates has, I admit,...
03 July 2020
Lorna Smith
Covid-19 has created significant challenges for the delivery of the new ARCHER2 system. It is therefore really exciting to see the first 4 cabinets of ARCHER2 leave Cray/HPE’s factory in Chippewa Falls, Wisconsin to begin their journey to Edinburgh. ARCHER2 will replace the current ARCHER system, a Cray XC30, as...
19 June 2020
UKRI
UKRI are having weekly meetings with the ARCHER2 providers, EPCC and Cray/HPE. We are making good progress and are still on track to start installation of some compute capacity (a pilot system) in mid-July. We are also looking at further ARCHER extensions due to the COVID-19 delay. We now expect...
26 March 2020
EPCC
UK Research and Innovation (UKRI) has awarded contracts to run elements of the next national supercomputer, ARCHER2, which will represent a significant step forward in capability for the UK’s science community. EPCC has been awarded contracts to run the Service Provision (SP) and Computational Science and Engineering (CSE) services for...
13 March 2020
EPCC
We are pleased to welcome you to the new ARCHER2 website. Here you will find all the information about the service including updates on progress with the installation and setup of the new machine. Some sections of the site are still under development but we are actively working to ensure...
14 October 2019
UKRI
Details of the ARCHER2 hardware, which will be provided by Cray (an HPE company). Following a procurement exercise, UK Research and Innovation (UKRI) are pleased to announce that Cray have been awarded the contract to supply the hardware for the next national supercomputer, ARCHER2. ARCHER2 should be capable on average...