HPC scaling at a cellular and synaptic resolution within the Human Brain Project


A 3D model of a neuron, reconstructed from lab data. The "sprouting" protuberances are "pre-synaptic terminals" – the points where the neuron will form connections (synapses) with other neurons. © EPFL/Blue Brain Project
18 Jun 2013 Leipzig - The ISC'13 event in Leipzig hosted a satellite event on supercomputing and the EC-funded Human Brain Project, which will run for 10 years. Felix Schürmann from EPFL described the challenges and opportunities within this project from the simulation viewpoint. The project's mission is to pursue a radically new strategy for understanding the human brain and its diseases, and to develop brain-like technologies. The ramp-up phase involves 87 European and non-European institutions. In addition, the Human Brain Project is launching a competitive calls programme to enlarge the consortium. The focus is on future neuroscience, future medicine and future computing.

The Human Brain Project team will create the following 6 ICT platforms:

1. Neuroinformatics: a gateway to all data, knowledge and publications on the brain

2. Brain simulation: the capability to reconstruct and simulate the human brain from partial and comparative data using fundamental principles of biology

3. HPC: remotely accessible, multi-scale, interactive exascale supercomputing

4. Medical informatics: services for biologically based, personalised disease diagnosis, treatment and drug development

5. Neuromorphic computing: biologically grounded pipeline for implementing the brain's circuits, mechanisms and principles in computing systems

6. Neurorobotics: biologically grounded pipeline for implementing robotic tools

The Human Brain Project is also seeking active integration with other initiatives, including the HPC projects PRACE, DEEP and Mont-Blanc, science museums, and initiatives such as ADNI and NeuGrid.

The project activities will involve experimental data gathering, unifying brain models, model building, simulation, supercomputing, analysis and visualization, model validation, refinement of models and experiments, and the worldwide publication of data and models.

Felix Schürmann told the audience that the project will try to build the best model possible, while not claiming it will be THE model.

In reconstructing and validating unifying brain models, brain atlases act as the data source, biological parameter constraints define the configurations, and multi-constraint algorithms make up the brain reconstruction workflows. The project will be using biological meta-databases.

The proof of concept is the Blue Brain Project, which started in 2005 and was founded, and is directed, by Henry Markram.

Felix Schürmann described how the relevant scales range from nanometers to centimeters, spanning about nine orders of magnitude in space. The project has to deal with both weak scaling and strong scaling.

The speaker provided some fascinating numbers about the brain. The mouse brain has about 70 million nerve cells, whereas the human brain contains about 90 billion. Each cell is a universe.

Weak scaling serves to integrate data into unifying models. The partners will need several generations of tailored yet general-purpose supercomputers to arrive at valuable results.

Detailed brain simulations require capability computing and have a very large memory footprint. Plasticity studies require strong scaling. Computers are used as scientific instruments, and what needs to be computed depends on the scientific question.
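
To make the distinction concrete, here is a minimal sketch (not from the talk) of the two classic scaling laws: Amdahl's law bounds strong scaling, where the problem size stays fixed, while Gustafson's law describes weak scaling, where the problem grows with the machine. All figures are illustrative.

    # Classic scaling laws; illustrative only, not Human Brain Project code.
    # s: serial (non-parallelizable) fraction of the work, p: processor count.

    def amdahl_speedup(s, p):
        """Strong scaling: fixed problem size; speedup saturates at 1/s."""
        return 1.0 / (s + (1.0 - s) / p)

    def gustafson_speedup(s, p):
        """Weak scaling: problem grows with p; speedup stays near-linear."""
        return s + (1.0 - s) * p

    for p in (16, 256, 4096):
        print(f"p={p:5d}  strong: {amdahl_speedup(0.01, p):7.1f}"
              f"  weak: {gustafson_speedup(0.01, p):7.1f}")

With a mere 1% serial fraction, strong scaling saturates below a speedup of 100, while weak scaling keeps growing with the processor count.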

For scaling in the Human Brain Project, different numerical methods apply: NEST and ODE solvers for the loose, global coupling between neurons, and linear algebra (LinAlg) for the tight, specific coupling within detailed neuron models.
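
NEST itself is scriptable from Python. As a hedged illustration of the point-neuron, ODE-driven style of simulation referred to here, the following minimal sketch uses the NEST 2.x Python interface; the model names, rates and weights are arbitrary choices, and some names differ in later NEST versions.

    import nest  # PyNEST, the Python interface to the NEST simulator

    nest.ResetKernel()

    # 100 leaky integrate-and-fire point neurons, each governed by simple ODEs
    neurons = nest.Create("iaf_psc_alpha", 100)

    # Poisson background input and a spike recorder
    # ("spike_detector" in NEST 2.x; renamed "spike_recorder" in NEST 3.x)
    noise = nest.Create("poisson_generator", params={"rate": 8000.0})
    spikes = nest.Create("spike_detector")

    nest.Connect(noise, neurons, syn_spec={"weight": 10.0})
    nest.Connect(neurons, spikes)

    nest.Simulate(1000.0)  # simulate 1000 ms of biological time
    print(nest.GetStatus(spikes, "n_events"))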

Capability computing is needed to evolve from a single-cell model to a cellular model of the whole human brain. Adding subcellular detail imposes severe memory requirements.

Interactive supercomputing is used for visualization, involving disks, persistent memory, and a cluster for model building.

To reach a 1:1 biological real-time simulation, weak scaling works perfectly, but strong scaling has its limits: at some point there is still memory available but no spare CPUs left. Conversely, when the system runs out of memory, fewer processors have to be used so that each one gets more memory.
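
A minimal sketch of that trade-off, assuming a hypothetical 16-core node with 16 GB of RAM (roughly Blue Gene/Q-like figures, not numbers quoted in the talk): running fewer ranks per node buys each rank more memory at the price of idle cores.

    # Memory-versus-cores trade-off on a hypothetical 16-core, 16 GB node.
    node_ram_gb = 16
    cores_per_node = 16

    for ranks_per_node in (16, 8, 4, 2, 1):
        mem_per_rank = node_ram_gb / ranks_per_node
        idle_cores = cores_per_node - ranks_per_node
        print(f"{ranks_per_node:2d} ranks/node -> {mem_per_rank:4.1f} GB per rank, "
              f"{idle_cores:2d} cores idle")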

Thomas Lippert from the Juelich Supercomputing Centre talked about the supercomputing infrastructure roadmap for the Human Brain Project.

The available supercomputers are the JuQUEEN BlueGene/Q, the BBP/CSCS research HPC system and the CADMOS 4-rack BlueGene/P.

Considered from the neuromorphic perspective, the Human Brain Project uses the 4th-generation simulation kernel of NEST for scaling, with the aim of generating full-scale models at a cellular and synaptic resolution via maximum-filling benchmarks. One percent of the human brain will be run on petascale computers.
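
A back-of-envelope estimate (the per-neuron synapse count and per-synapse state size below are assumptions, not figures from the talk) illustrates why the full brain at this resolution points to exascale memory, while one percent fits on a petascale machine:

    # Rough memory estimate for synaptic state; figures are assumptions.
    neurons = 90e9              # ~90 billion neurons, as quoted above
    synapses_per_neuron = 1e4   # commonly cited order of magnitude
    bytes_per_synapse = 50      # assumed state size per synapse

    full_brain_bytes = neurons * synapses_per_neuron * bytes_per_synapse
    print(f"full brain : {full_brain_bytes / 1e15:.0f} PB")       # ~45 PB
    print(f"1% of brain: {full_brain_bytes / 100 / 1e12:.0f} TB") # ~450 TB

Under these assumptions, the full-brain figure is of the same order as the 50 PB foreseen for the Phase I machine at Juelich.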

The evolution of NEST covers 15 years of development. Innovation is still required, since the Human Brain Project needs a hierarchical memory concept.

Thomas Lippert described the Human Brain Project platform architecture as follows:

  • CINECA capacity with massive data analytics
  • Cloud storage from KIT
  • Capacity for development at CSCS
  • Capacity at the Juelich Supercomputing Centre
  • Capacity at BSC for molecular dynamics

The project needs a global parallel file system and a high-speed network.

During the ramp-up phase between 2013 and 2016, systems are available at Juelich, CSCS, CINECA and BSC. Negotiations for access to and via PRACE are being started.

The operational phase covers 7.5 years within the Horizon 2020 Framework Programme. The first period of the operational phase runs from 2016/17 to 2020/21.

The following architecture is needed in Phase I:

  • A 50 PFlop/s, 50 PB Human Brain Project supercomputer at Juelich
  • Human Brain Project development system at CSCS
  • Systems available at BSC, CINECA, GCS and CEA
  • Access to the PRACE ecosystem

Phase II stretches from 2021 onward, in which the partners plan the following HPC capacities:

  • Human Brain Project exascale supercomputer at Juelich
  • Human Brain Project development system at CSCS
  • Systems available at BSC, CINECA, GCS and CEA
  • Access to the PRACE ecosystem

The partners are weighing the Human Brain Project's innovation requirements against their possible realization through pre-commercial procurement (PCP), covering solution exploration, prototyping, test series and commercial development over 42 months in total.

Mathematical models involve parallel programming models for interactive brain modelling and brain simulation (BSC), as well as workflow and distributed programming.

The Human Brain Project will also make use of interactive visualization, as Thomas Lippert pointed out. The visualization system has the following requirements:

  • visualization and analysis component execution framework at EPFL
  • neuroscience-specific visualization and interfaces at URJC
  • hardware technology, benchmarking and optimization for visualization and rendering

The exascale data management will involve scalable querying of peta- to exascale datasets at EPFL.

More information is available at the Human Brain Project website.

Leslie Versweyveld