Coupling Across the Continuum-Particle Divide with Code_Saturne and GROMACS
ARCHER2-eCSE08-11
PI: Dr James Gebbie-Rayet (STFC Daresbury Laboratory)
Co-I(s): Dr Wendi Liu (STFC Daresbury Laboratory)
Technical staff: Dr Charles Moulinec (STFC Daresbury Laboratory)
Published: 2025-07-31
Simulation is becoming ever more important in the life sciences as the complexity of both research problems and information from lab-based experiments increases. This importance was underlined by the award of the 2013 Nobel Prize in Chemistry for the development of multi-scale simulation methodologies for complex chemical systems. To date, multi-scale modelling in the biomolecular simulation community has largely focused on loosely coupling two particle-based methodologies. However, these couplings are limited in the range of scales and applications that they can address. There are several methodologies capable of running simulations at larger length scales, each of which has a slightly different approach to fluidics and neighbour interactions.
Each of these methodologies either requires the atomic positions in detail, which can then be coarsened into larger particle components (beads), or requires the system to be partitioned so that it is composed of particulate repeat units. A novel mesoscale methodology developed at the University of Leeds instead takes a continuum approach to modelling the thermal dynamics of such biological systems, using Fluctuating Finite Element Analysis (FFEA), which relies on meshes to describe the biological system. This approach allows increases of orders of magnitude in time and length scale over traditional particle-based methodologies. It is thus possible to sample states of mesoscale systems very rapidly, on time scales that are simply unattainable using particle-based methods without unfeasibly long access to very large high-performance computing resources.
This project focuses on the implementation of a multi-scale workflow, rather than on the physics itself, to help understand how large systems containing many proteins either bind together or repel each other at the macro-scale when situated within a crowded cellular environment, as parameterised by micro-scale simulations. To achieve this, information from the micro-scale, solved by molecular dynamics (MD), is required by the macro-scale, solved by a mesh-based method. Computationally, this means that the protein evolution is simulated at the macro-scale using a computational structural mechanics (CSM) approach (code_saturne), while potential binding/repulsion is handled at the micro-scale using MD (GROMACS). A coupling between CSM and MD is required to exchange information between the scales and decide which action follows (binding/sticking or repulsion). Generalising this investigation to many proteins (at least several hundred thousand within a single cell) is only possible using High Performance Computing (HPC).
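To illustrate the control flow of such a workflow (this is a hedged sketch, not the project's actual code: the function names, the distance-based trigger, and the random placeholder decisions are all assumptions made for illustration), the macro-scale solve and the micro-scale bind/repel evaluation might interact as follows:

```python
import math
import random

CONTACT_CUTOFF = 5.0  # hypothetical macro-scale contact distance


def macro_step(positions):
    """Macro-scale (CSM) step: evolve protein centroids.

    Placeholder for a mesh-based code_saturne solve; here a
    small random displacement stands in for the real dynamics.
    """
    return [(x + random.uniform(-0.5, 0.5),
             y + random.uniform(-0.5, 0.5)) for x, y in positions]


def micro_evaluate(pair):
    """Micro-scale (MD) evaluation for one close protein pair.

    Placeholder for an independent GROMACS simulation that would
    decide whether the pair binds or repels.
    """
    return random.choice(["bind", "repel"])


def run_workflow(positions, n_steps=10):
    """Alternate macro-scale evolution with micro-scale decisions."""
    bound_pairs = set()
    for _ in range(n_steps):
        positions = macro_step(positions)
        # Hand over to the micro-scale only for close, unbound contacts.
        for i in range(len(positions)):
            for j in range(i + 1, len(positions)):
                if (i, j) in bound_pairs:
                    continue
                if math.dist(positions[i], positions[j]) < CONTACT_CUTOFF:
                    if micro_evaluate((i, j)) == "bind":
                        bound_pairs.add((i, j))
    return bound_pairs
```

In the real workflow each `micro_evaluate` call corresponds to a full, independent GROMACS run, which is what makes many such evaluations amenable to being farmed out across HPC nodes.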
The coupling between the macro-scale (code_saturne) and micro-scale (GROMACS) has been carried out using the PLE library. When code_saturne runs, GROMACS is idle, and vice versa, which makes the coupling staggered. To make the best use of the available computational resources, the DLB library has also been integrated, coordinating access to resources while either GROMACS or code_saturne is idle.
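The staggered hand-off described above can be sketched schematically as follows. This is a minimal illustration of the control flow only, under the assumption that each solver holds all cores during its own phase; the class and method names are invented, and the real data exchange goes through the PLE library over MPI, with DLB lending the idle solver's cores to the active one:

```python
class StaggeredCoupler:
    """Schematic staggered coupling of two solvers over shared cores.

    While the macro-scale solver runs, the micro-scale solver is
    idle, and vice versa; a DLB-style lender moves all cores to
    whichever solver is active. Illustrative only - not the PLE
    or DLB API.
    """

    def __init__(self, total_cores):
        self.total_cores = total_cores
        self.log = []  # records (solver, cores held) per phase

    def _run(self, name, payload, cores):
        # Record which solver holds the cores during its phase;
        # placeholder for a real solve plus PLE data exchange.
        self.log.append((name, cores))
        return payload

    def step(self, macro_state):
        # Phase 1: code_saturne runs; GROMACS is idle, so the
        # lender assigns all cores to code_saturne.
        fields = self._run("code_saturne", macro_state, self.total_cores)
        # Phase 2: GROMACS runs on the exchanged data; code_saturne
        # is idle, so all cores move to GROMACS.
        decision = self._run("gromacs", fields, self.total_cores)
        return decision
```

A single coupled step, e.g. `StaggeredCoupler(128).step(state)`, thus occupies the full allocation in both phases, which is the behaviour the DLB integration is there to achieve.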
Results obtained on up to 64 nodes of ARCHER2, corresponding to one instance of code_saturne coupled with 64 independent GROMACS simulations, show that the coupling is very efficient.
Information about the code
EDF will be creating a GitHub/GitLab repository for external contributors to Code_Saturne. Until that has been set up, all the developments, together with the test case, are available in the /work/c01/shared folder for anyone who would like to test what has been implemented in the eCSE projects.