Primeur weekly 2015-10-26

Special

DZ Bank believes in Cloud computing but has to move with caution ...

Peter Coveney's views on the Virtual Human, the Human Brain Project, the medical practice and curriculum, and the interrelation of supercomputing and data storage ...

Focus

Exascale: not simply a matter of re-writing the applications and more parallelism ...

Exascale supercomputing

New report on energy-efficient computing ...

Photons open the gateway for quantum networks ...

Quantum computing

Loophole-free Bell test at TU Delft crowns 80-year-old debate on nature of reality: Einstein's "spooky action" is real ...

Focus on Europe

IBM, UK STFC Hartree Centre, NVIDIA and Mellanox launch high performance computing POWER Acceleration and Design Center for business ...

Middleware

Bright Computing's Panos Labropoulos to speak at Amsterdam Spark Summit ...

Rogue Wave Software acquires enterprise PHP expert Zend ...

SGI exceeds 200 Terabyte milestone in total systems running SAP HANA ...

Hardware

Nor-Tech innovates ruggedized portable custom supercomputer for CAE and other applications ...

Comet: A supercomputer for the 'long tail' of science ...

U.S. Army unveils Excalibur, one of the world's top 20 supercomputers ...

Supermicro's new X11 UP solutions with highest performance and efficiency support latest Intel Xeon processor E3-1200 v5 family ...

New custom-designed storage system by DDN boosts Intel Omni-Path fabric speed in HPC, web and Cloud workflows by 100 percent ...

Department of Energy’s National Nuclear Security Administration selects Penguin Computing's Tundra extreme scale solution for National Labs ...

Applications

IBM expands data discovery and Q&A power of Watson Analytics ...

Cray's high-density GPU system sets new performance standard for ECHELON reservoir simulation software ...

New SDSC award provides easy path to supercomputing for neuroscientists ...

UC San Diego/SDSC study uncovers mechanism to block a cancer pathway ...

Utah researchers develop software to better understand brain's network of neurons ...

DrugDiscovery@TACC portal helps researchers worldwide with drug therapeutics ...

Supercomputing helps ecologists overturn popular theory ...

The Cloud

EMC and VMware reveal new Cloud services business ...

Dell introduces industry-first hybrid Cloud system with bold payment structure to simplify and minimize risks of Cloud adoption ...

Peter Coveney's views on the Virtual Human, the Human Brain Project, the medical practice and curriculum, and the interrelation of supercomputing and data storage

29 Sep 2015 Frankfurt - At the recent ISC Cloud and Big Data Conference in Frankfurt, Primeur Magazine had the opportunity to interview Peter Coveney from University College London about the Virtual Human and the technologies involved in simulating human physiological processes. The main message of Peter Coveney's keynote presentation at the conference was the need to perform very high-fidelity, reliable predictions from modelling and simulation. These methodologies require high performance computing or they can't even get out of the starting blocks.

The initiative from the European Union in Framework Programme 7 was called the Virtual Physiological Human. Its main purpose was to introduce and demonstrate the power of modelling and simulation technologies that come out of physical science and engineering, and how they can be applied in the medical context to support clinical decision making.

If you are going to do that kind of work, it obviously involves three dimensions in space plus time. Anyone who performs modelling and simulation realizes that, for the most part, such goals require high performance computing in any domain. The additional problems you face in supporting biomedical research relate on the one hand to the fact that we are dealing with patient data, which comes with the privacy and confidentiality issues common to many areas. On the other hand, you need to produce outcomes or predictions on time scales that are relevant to a clinical decision. These can be very short by comparison with what people normally work with. This is a big computational science challenge in its own right.

Primeur Magazine: So you need to have almost real-time computing to get the right decision or the right modelling results?

Peter Coveney: There are so many ways and areas in which computational methods can support decision making in a medical context. Some of them are in the category you just described. In my talk, I spoke about supporting interventions that are being made in brain surgery - it could apply to cardiac surgery as well - where somebody is faced with a life or death situation. Usually, the clinicians have to trust their judgment based on the experience they've already got. The purpose of these technologies is to support that decision making and enhance the outcome. In some cases that might be a matter of life or death; in others it means that the outcome is much better than it would have been previously.

There are also other applications where it is certainly very important to get results quickly. This has to do with patient drug targeting and selection. We need to get outcomes there in the space of one or two days. It is not as real time, so to speak, as the other application, but the turnaround is certainly much shorter than what people are usually familiar with in these areas.

Primeur Magazine: Are there also things you can do with Cloud computing?

Peter Coveney: The Cloud itself is a moving concept in terms of what resources it has available at any given time, because computing power increases even as we stand around here. The main point, though, is that in order to get these results - these very detailed, high-fidelity models - in a very short space of time, you typically need what we call the class of high performance computing applications. It means that I am going to be running maybe a single job, or a large number of jobs, that may require tens of thousands of cores. These cores need to be tightly coupled together. This is not what a conventional Cloud today can provide. You really need high performance computing.

If it is going to be used in a real context in a hospital in the future - I don't know of any hospital that really has a high performance computer today - it is a question of whether they will end up buying their own capabilities, or pooling resources and sharing these behind some firewall - or should we say in a private Cloud - that would still support a very large HPC device. These things are all potentially possible.
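
To make concrete what "tightly coupled" means here, the following is a minimal, purely illustrative sketch of such a job in Python with mpi4py. The job and its numbers are assumptions for illustration, not taken from any project Peter Coveney mentions: every rank must synchronize with all the others at each simulation step.

    # A minimal sketch of a tightly coupled MPI job (Python, mpi4py).
    # The job is hypothetical; the communication pattern is the point.
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    # Each rank owns a slice of the overall simulation state.
    local_state = np.random.rand(1_000_000)

    for step in range(100):
        local_state *= 0.99                        # local compute phase
        # Global coupling phase: a collective reduction forces every
        # rank to exchange data with the rest before the next step.
        total = comm.allreduce(local_state.sum(), op=MPI.SUM)

    if rank == 0:
        print(f"{comm.Get_size()} tightly coupled ranks, final sum {total:.3e}")

On a machine with a fast interconnect this pattern scales to tens of thousands of ranks; on commodity Cloud networking the per-step synchronization, rather than the arithmetic, quickly comes to dominate the runtime.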

Primeur Magazine: Is there something going on today which is similar to the Virtual Human Project?

Peter Coveney: There is a Virtual Human activity of a sort in the USA, but probably the project which is most similar and has the highest visibility right now is the Human Brain Project, which is also EU-funded; it started at the tail end of Framework Programme 7 and is now entering Horizon 2020. This project in a sense is predicated on the idea that when you get an exascale machine - we do not have them yet, but we might see them in the early 2020s - it may be possible, according to some people, to build a sufficiently high-fidelity representation of the human brain, and that we will then be able to better understand things like thought. I don't want to say 'consciousness' because that itself is still a deep philosophical issue.

Primeur Magazine: The European Commission has just announced a preliminary version of the work programme for 2016-2017, in which it allocates about 25 million euro for a supercomputer dedicated to the study of the human brain. Probably that is because one supercomputer is not the same as another; it needs to be specialized.

Peter Coveney: Right. In the end, when you come down to any particular application - I mean this in the technical sense of a code or a programme you want to run repeatedly in the most optimal form possible, because you understand all of the physics underlying it - you have a strong argument for investing in a very specific technology, because you know it is going to deliver for you. If you are still doing feasibility studies, the absolute highest performance may not be so crucial. In weather forecasting, most of the meteorological offices own their own computers. They choose them because they are optimized for their own purposes.

Primeur Magazine: You also mentioned that you are participating in e-infrastructures like EUDAT. What is the goal of being there?

Peter Coveney: Because our work involves high performance computers, and those are distributed. Nationally, I come from the UK, but across Europe we have powerful machines in many countries and we need access to those. The infrastructure is distributed by nature, and the computers need data to be fed into them to start the simulations; more particularly, they generate a lot of data when you run jobs. That data has to be stored. At the scale of a very big machine those data sizes can easily reach many terabytes, getting up to petabytes. This is not easy to move around. You also want a facility somewhere, which could be local to those resources, that stores the data. You now have data infrastructures being set up, as well as supercomputing ones.
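
A rough back-of-the-envelope calculation, with assumed figures rather than ones from the interview, illustrates why data at that scale is easier to store next to the machine than to ship elsewhere:

    # Back-of-the-envelope: time to move 1 petabyte over a dedicated
    # 10 Gbit/s link at full utilisation (both figures are assumptions).
    petabyte_bits = 8 * 10**15           # 1 PB = 8 x 10^15 bits
    link_bits_per_second = 10 * 10**9    # 10 Gbit/s
    seconds = petabyte_bits / link_bits_per_second
    print(f"{seconds / 86400:.1f} days") # about 9.3 days

Over a week for a single petabyte, even on a dedicated link, is one reason the data facilities tend to sit local to the supercomputing resources they serve.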

Ultimately, I'd like to see those things interrelated much more closely than they currently are. From our point of view, we want them to be seamless, not separated with different goals and management structures. Then there is the question of the uptake of technologies of the sort we were describing for the Virtual Human. This is fundamentally meant to support medical and clinical decision making. There is still a large barrier there in terms of the education and training of the medics, who in the end need to champion these methods. That is another big issue.

Primeur Magazine: How good all these simulations are also depends on whether they are actually used in real life by doctors and other medical practitioners. What is your opinion about that?

Peter Coveney: The first part of the question requires that we do a lot of work - much more than we would normally do - on verification and validation, to be quite sure the predictions are robust and don't depend on the particular circumstances of the one person who ran a simulation. That is also an essential component. When these things have been achieved and you want to pass the technology on to the medical profession, they have to understand what's available. They need to know more than their current education provides. There is a growing view that the medical curriculum needs to pay more attention to the role of modelling, information technology and indeed high performance computing than it ever has before.

This is a big programme. You cannot address and solve the Virtual Human even in a 7-year initiative. It is a huge agenda for the 21st century. To be successful it really has to be invested in for the long term, and it is not clear how that is going to happen.

Primeur Magazine: How will this technology develop?

Peter Coveney: At the moment, we have reached the stage where certain medical procedures are being supplemented by this technology in a demonstration mode. We are really at the very beginning, at the threshold of a new way of doing biomedicine. It will take the rest of the century before we see all the fruits of these endeavours, because the human body is immensely complicated. In the short run, we can pick up some early results - low-hanging fruit, you could say. In the long run, you have to integrate all of these data and models. This is an activity that is going to go on for decades.

Ad Emmen
