Centres of Excellence

ENCCS also collaborates closely with the European Centres of Excellence (CoEs) and can help users find information on the software they support, as well as get in touch with the key people who can provide the support they need.
The CoEs promote the use of upcoming exascale and extreme-performance computing capabilities and work to scale existing parallel codes towards exascale performance. Below you can find each CoE and the HPC software it supports across multiple disciplines.

BioExcel develops applications for biomolecular modelling and simulation, working together with the core developers of widely used tools in the field:

  • GROMACS, CP2K, PMX

The main objective of ChEESE is to establish a Centre of Excellence in the domain of Solid Earth (SE), targeting the preparation of 10 flagship European community codes for the upcoming pre-exascale (2020) and exascale (2022) supercomputers.

  • Computational Seismology: ExaHyPE, Salvus, SeisSol, SPECFEM3D
  • MHD: PARODY_PDAF, XSHELLS
  • Physical Volcanology: ASHEE, FALL3D
  • Tsunami Modelling: T-HySEA, L-HySEA

CompBioMed is a European Commission H2020-funded Centre of Excellence focused on the use and development of computational methods for biomedical applications. More specifically, it addresses the needs of the computational biomedicine research community within cardiovascular, molecularly-based, and neuro-musculoskeletal medicine.

  • Cardiovascular: Alya, HemeLB, HemoCell, openBF, Palabos, PolNet, InSilicoMRI, Living Heart Human Model
  • Molecular Medicine: BAC, HTMD, PlayMolecule, Visual GEC, HTBAC, Virtual Assay
  • Neuro-musculoskeletal Medicine: CT2S, Insigneo Bone Tissue Suite, PalabosCodes

The activity of E-CAM focuses on the need for new and improved algorithms and code modules in simulation and modelling:

  • Molecular dynamics: LAMMPS, GROMACS, OPS, NAMD
  • Electronic Structure: ESL, ELSI, Wannier90, QMCPACK, Quantum ESPRESSO, SIESTA, AiiDA
  • Quantum Dynamics: PaPIM, Quantics, CP2K, Q-Chem, CPMD, ElVibRot
  • Meso- and multi-scale modelling: MP2C, ESPResSo++, DL_MESO_DPD, GC-AdResS

EoCoE focuses its efforts on five scientific exascale challenges in the low-carbon energy sectors: meteorology, materials, water, wind, and fusion. This multidisciplinary effort harnesses innovations in computer science and mathematical algorithms within a tightly integrated co-design approach to overcome performance bottlenecks and to anticipate future HPC hardware developments.

  • Wind4energy: Alya, waLBerla
  • Meteo4energy: ESIAS-Chem, ESIAS-Meteo, EURAD-IM
  • Materials4energy: Metalwalls, QMCPACK, gDFTB, libNEGF, KMC/DMC
  • Water4energy: ParFlow, SHEMAT-Suite, ExaTerr

The Centre of Excellence in Simulation of Weather and Climate in Europe (ESiWACE) enables global storm- and eddy-resolving weather and climate simulations on the upcoming (pre-)exascale supercomputers.

  • NEMO, OASIS3-MCT, Cylc, XIOS, ICON, IFS, DYNAMICO

Application codes are at the core of Excellerat, since they make it possible to achieve cutting-edge engineering results. A number of codes are officially supported within the services provided by the Excellerat Centre of Excellence.

  • Large scale computational mechanics: Alya
  • Combustion instabilities and emission prediction; explosion in confined spaces: AVBP
  • Design process and simulation of fully equipped aeroplanes; CFD coupling with computational structural mechanics, including elastic effects: CODA
  • Computing platform for solving PDEs: FEniCS (see the sketch after this list)
  • Computational fluid dynamics: Nek5000
  • Simulation of two-phase flows: TPLS
  • In-situ analysis tool: PAAKAT
  • Python toolbox for uncertainty quantification: UQit
  • Simulations on supercomputers, post-processing, and parallel interactive visualization in immersive virtual environments: Vistle
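
To give a concrete flavour of one of these tools, the following is a minimal Poisson solve in FEniCS. It is an illustrative sketch using the legacy dolfin API, not an official Excellerat example; the mesh resolution, source term, and boundary condition are arbitrary choices for demonstration.

    # Minimal Poisson solve with FEniCS (legacy dolfin API).
    from dolfin import (Constant, DirichletBC, Function, FunctionSpace,
                        TestFunction, TrialFunction, UnitSquareMesh,
                        dx, grad, inner, solve)

    mesh = UnitSquareMesh(32, 32)            # unit square, 32x32 cells (arbitrary)
    V = FunctionSpace(mesh, "Lagrange", 1)   # piecewise-linear elements

    u = TrialFunction(V)
    v = TestFunction(V)
    f = Constant(1.0)                        # arbitrary source term

    a = inner(grad(u), grad(v)) * dx         # bilinear form of -laplace(u) = f
    rhs = f * v * dx                         # linear form
    bc = DirichletBC(V, Constant(0.0), "on_boundary")  # homogeneous Dirichlet BC

    u_h = Function(V)
    solve(a == rhs, u_h, bc)                 # assemble and solve the linear system
    print("max of solution:", u_h.vector().max())

The same weak form carries over, with a different API, to the newer FEniCSx (dolfinx) generation of the platform.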

HiDALGO (HPC and Big Data Technologies for Global Systems) develops novel methods, algorithms, and software for HPC and high-performance data analytics (HPDA) to accurately model and simulate the complex processes that arise in connection with major global challenges.

  • Agent-based Modelling: AMOS, RepastHPC, MASON, SUMO, FLEE, MUSCLE2
  • Data Analytics: Apache Spark, Apache Flink, Apache Storm
  • Machine Learning: TensorFlow, Torch (a minimal TensorFlow sketch follows this list)
  • Visualization: COVISE, VISTLE
  • Computational Fluid Dynamics (CFD): FEniCS
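
As an illustration of the machine-learning layer, here is a tiny TensorFlow/Keras model fitted to synthetic data. This is a generic sketch only: the linear toy problem and all hyperparameters are invented for demonstration, and HiDALGO's actual pipelines, datasets, and model architectures are not shown.

    # Tiny TensorFlow/Keras regression on synthetic data (toy example).
    import numpy as np
    import tensorflow as tf

    # Synthetic samples of y = 2x + 1 with a little noise (invented data).
    x = np.linspace(-1.0, 1.0, 256).reshape(-1, 1).astype("float32")
    y = 2.0 * x + 1.0 + 0.05 * np.random.randn(*x.shape).astype("float32")

    # One dense layer is enough to recover a linear relationship.
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(1,)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="sgd", loss="mse")
    model.fit(x, y, epochs=50, verbose=0)

    print(model.predict(np.array([[0.5]], dtype="float32")))  # close to 2.0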

The mission of MaX (MAterials design at the eXascale) is to develop exascale-ready technologies and make them available to a large and growing base of researchers in the materials domain.

  • Codes: Quantum ESPRESSO, SIESTA, YAMBO, FLEUR, CP2K, BigDFT, AiiDA (see the sketch after this list)
  • Libraries: CheSS, LAXlib, FFTXLib, SIRIUS, COSMA, SpFFT
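
As a taste of the workflow side of this stack, here is a minimal AiiDA example: a calcfunction whose inputs, outputs, and call are automatically recorded in AiiDA's provenance graph. It assumes an already configured AiiDA profile and database, and the add function is a toy stand-in for a real materials-science calculation.

    # Minimal AiiDA provenance example (assumes a configured profile).
    from aiida import load_profile, orm
    from aiida.engine import calcfunction

    load_profile()  # load the default AiiDA profile

    @calcfunction
    def add(x, y):
        """Add two integers; the call is stored in the provenance graph."""
        return orm.Int(x.value + y.value)

    result = add(orm.Int(3), orm.Int(4))
    print(result.value)  # 7, stored as a database node with full provenance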

NOMAD creates, collects, stores, and cleanses computational materials-science data computed by the most important materials-science codes available today. Furthermore, the NOMAD Laboratory develops tools for mining these data in order to find structure, correlations, and novel information that could not be discovered by studying smaller data sets. By making all of its data freely accessible, NOMAD leads the Open Science movement in materials science, supported by the global community.

The Performance Optimisation and Productivity (POP) Centre of Excellence in HPC provides performance optimisation and productivity services for academic and industrial codes in all domains.

  • Computational Fluid Dynamics: SOWFA, DROPS, Ateles, Musubi, OpenFOAM Solver, Nekbone
  • Electronic Structure Calculations: ONETEP, FHI-aims, Quantum ESPRESSO, SIESTA, ADF, BAND, DFTB
  • Plasma Turbulence: GS2, iPIC3D
  • Materials Science: QUIP, VAMPIRE, DPM, GraGLeS2D, FIDIMAG, GBmolDD, k-Wave, EPW
  • Earth Sciences: NEMO, SHEMAT-Suite, UKCA, GITM
  • Neuroscience: OpenNN, NEST5g
  • Biomedicine: HemeLB

The TREX Centre of Excellence federates European scientists, High Performance Computing (HPC) stakeholders, and SMEs to develop and apply high-performance software solutions for quantum mechanical simulations at the exascale. Its main objectives are:

  • Co-design of computational kernels of flagship QMC codes, with efficient algorithms that are scalable for HPC applications, flexible enough to adapt to future architectures, and able to cater to a large base of HPC users and players in synergy with existing CoEs.
  • Rational design of an ecosystem of highly scalable, optimized, and interoperable QMC codes for exascale applications of ultimate accuracy in quantum chemistry and computational materials design, improving the codes by adopting parallel paradigms able to fully exploit the potential of exascale architectures.
  • Robust management of complex, scalable QMC workflows in high-throughput calculations for materials simulations, to further leverage exascale performance and thereby ensure the convergence of HPC, HTC, and HPDA.
  • Fostering wider access, usage, and uptake of knowledge in HPC, via the direct involvement of present and potential user communities through demonstrators, in the development of a scalable ecosystem of QMC codes within an integrated HPC, HTC, and HPDA framework.