Grids in grids: hierarchical grid generation and decomposition for a massively parallel blood flow simulator

eCSE03-13

Key Personnel

PI/Co-I: Dr Rupert Nash (EPCC), Dr Derek Groen (Brunel University London), Prof Peter Coveney (UCL)

Technical: Dr Rupert Nash (EPCC), Dr Derek Groen (Brunel University London)

Relevant documents

eCSE Technical Report: Grids in grids: hierarchical grid generation and decomposition for a massively parallel blood flow simulator

Project summary

Implementing an accurate, consistent, parallel and efficient meshing tool is a very difficult software engineering task. After considerable effort we have created a tool that is accurate and consistent. Key parts of the process are also parallel and efficient: for realistic problems we can now produce voxelisations a factor of ten faster on a 36-core machine.

We have implemented tools for producing and analysing decompositions and added checkpoint-restart capabilities to HemeLB.
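
The report does not give details of the checkpoint-restart additions, so the C++ sketch below merely illustrates the general pattern of dumping and reloading solver state. The structure, field layout and file format shown are illustrative assumptions, not HemeLB's actual classes or on-disk format.

```cpp
#include <cstdint>
#include <fstream>
#include <string>
#include <vector>

// Hypothetical solver state: the current time step plus one flattened array
// of distribution functions for all lattice sites owned by this rank.
struct CheckpointState {
    std::uint64_t timeStep = 0;
    std::vector<double> distributions;
};

// Write the state so that a later run can resume from timeStep + 1.
void WriteCheckpoint(const CheckpointState& s, const std::string& path) {
    std::ofstream out(path, std::ios::binary);
    const std::uint64_t n = s.distributions.size();
    out.write(reinterpret_cast<const char*>(&s.timeStep), sizeof s.timeStep);
    out.write(reinterpret_cast<const char*>(&n), sizeof n);
    out.write(reinterpret_cast<const char*>(s.distributions.data()),
              static_cast<std::streamsize>(n * sizeof(double)));
}

// Read the state back at start-up; the main loop resumes where it left off.
CheckpointState ReadCheckpoint(const std::string& path) {
    std::ifstream in(path, std::ios::binary);
    CheckpointState s;
    std::uint64_t n = 0;
    in.read(reinterpret_cast<char*>(&s.timeStep), sizeof s.timeStep);
    in.read(reinterpret_cast<char*>(&n), sizeof n);
    s.distributions.resize(n);
    in.read(reinterpret_cast<char*>(s.distributions.data()),
            static_cast<std::streamsize>(n * sizeof(double)));
    return s;
}
```

In a real parallel run each rank would write its own portion of the state (for example via MPI-IO), but the read/write symmetry above is the essential point.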

Achievement of objectives

The objectives of the work were to:

  1. Reimplement HemeLB setup using hierarchical, geometric data structures.
    This has been achieved. The implementation turned out to be extremely challenging and overran significantly. The key difficulty is reliably determining intersections between the links joining lattice sites and the triangles of the surface mesh. Since both are represented with floating-point numbers, intersections are very occasionally missed (or spurious ones created). This can be disastrous for the process, as the fluid can then "leak" out of the domain (a minimal illustration of this robustness issue follows the list). Adapting the main program to use the new file format was not completed; instead, a simple post-processing step converts from the new format to the old one.
  2. Extract HemeLB's domain decomposition into a library (usable from outside the main application) to make the decompositions reusable and the algorithms used configurable.
    This has been achieved: we now have a standalone library (Protopart+PPStee) that can apply and analyse a range of decomposition algorithms (a sketch of a typical decomposition-quality check also follows the list). Although not explicitly described in the proposal, we also modified the data structures in the code so that HemeLB's built-in domain decomposition can be bypassed. However, this proved much more complex than expected, and our initial implementation led to errors for very large problems. In collaboration with researchers at the CCS we continued this work after GiG was completed and incorporated the Zoltan library directly into HemeLB. In this way, Protopart+PPStee can be used for quick diagnostics and testing, while the best-performing algorithms are applied within HemeLB itself via the Zoltan library (which supports the same partitioning schemes).
  3. Implement accelerated initialisation of simulations by bootstrapping through one or more coarser discretisations of the domain (using the new hierarchical information).
    This has been partially completed: checkpoint-restart capabilities were added, but because we did not finish adapting HemeLB to use the new octree-based format, the accelerated bootstrapping could not easily be completed.
  4. Extend the parallelisation of HemeLB setup across multiple nodes using MPI parallelism.
    Not attempted due to time constraints.
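
The robustness problem described under objective 1 can be seen in any standard floating-point segment-triangle test. The sketch below is not HemeLB code: it is a Möller-Trumbore-style test with an arbitrary tolerance, included only to show where the yes/no answer becomes sensitive to rounding for links that graze a triangle edge or lie nearly parallel to a triangle.

```cpp
#include <array>

using Vec3 = std::array<double, 3>;

static Vec3 Sub(const Vec3& a, const Vec3& b) {
    return {a[0] - b[0], a[1] - b[1], a[2] - b[2]};
}
static Vec3 Cross(const Vec3& a, const Vec3& b) {
    return {a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]};
}
static double Dot(const Vec3& a, const Vec3& b) {
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

// Möller-Trumbore-style test: does the segment p -> q cross the triangle
// (v0, v1, v2)?  Near each of the thresholds below the outcome depends on
// rounding and on the arbitrary tolerance eps.
bool SegmentHitsTriangle(const Vec3& p, const Vec3& q,
                         const Vec3& v0, const Vec3& v1, const Vec3& v2,
                         double eps = 1e-12) {
    const Vec3 dir = Sub(q, p);
    const Vec3 e1 = Sub(v1, v0), e2 = Sub(v2, v0);
    const Vec3 h = Cross(dir, e2);
    const double det = Dot(e1, h);
    if (det > -eps && det < eps) return false;  // treated as parallel
    const double inv = 1.0 / det;
    const Vec3 s = Sub(p, v0);
    const double u = inv * Dot(s, h);
    if (u < 0.0 || u > 1.0) return false;       // misses on the v0-v1 side
    const Vec3 qvec = Cross(s, e1);
    const double v = inv * Dot(dir, qvec);
    if (v < 0.0 || u + v > 1.0) return false;   // misses the triangle
    const double t = inv * Dot(e2, qvec);
    return t >= 0.0 && t <= 1.0;                // crossing lies within p -> q
}
```

When u, v or t land within a rounding error of the thresholds tested above, two triangles sharing an edge can give contradictory answers for the same link; this is exactly how missed or spurious intersections arise and how fluid sites come to "leak" through the surface.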

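The interface of Protopart+PPStee is not described in this report; the sketch below only shows the kind of decomposition-quality metrics (edge cut and load imbalance) such an analysis typically reports, using hypothetical names and a plain adjacency-list representation of the lattice.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Two standard quality metrics for a decomposition that assigns each lattice
// site (graph vertex) to an MPI rank:
//   - edge cut: number of site-to-site links crossing rank boundaries,
//     a proxy for communication volume;
//   - load imbalance: the heaviest rank's site count divided by the average.
struct DecompositionQuality {
    std::size_t edgeCut = 0;
    double loadImbalance = 1.0;
};

// neighbours[i] lists the sites linked to site i (symmetric adjacency);
// rankOfSite[i] is the rank that owns site i.
DecompositionQuality Analyse(const std::vector<std::vector<std::size_t>>& neighbours,
                             const std::vector<int>& rankOfSite,
                             int numRanks) {
    DecompositionQuality q;
    std::vector<std::size_t> load(numRanks, 0);
    std::size_t crossings = 0;
    for (std::size_t site = 0; site < neighbours.size(); ++site) {
        ++load[rankOfSite[site]];
        for (std::size_t other : neighbours[site])
            if (rankOfSite[site] != rankOfSite[other]) ++crossings;
    }
    q.edgeCut = crossings / 2;  // each crossing link is seen from both ends
    const double average = static_cast<double>(neighbours.size()) / numRanks;
    const std::size_t heaviest = *std::max_element(load.begin(), load.end());
    q.loadImbalance = static_cast<double>(heaviest) / average;
    return q;
}
```

Metrics like these make it possible to compare HemeLB's default decomposition with the alternatives available through Zoltan before committing to one for a large run.
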
Summary of the Software

HemeLB is open source (LGPL licence) and available from https://www.github.com/UCL/hemelb. The software can be easily compiled on ARCHER.

Scientific Benefits

Because the project was only partially successful, the tool as it stands will have only moderate impact: it improves setup times for our users.
