Primeur weekly 2016-02-08

Focus

Reaching towards fully-integrated 3D tomography using a new generation of algorithms, high-performance scanning equipment and GPU clusters ...

Whole genome sequencing using HPC clusters in project MinE to find a cure for ALS ...

Exascale supercomputing

How to fit ten million computers into a single supercomputer? The ExaNeSt project paves the way. European consortium becomes a trailblazer in the development of the most challenging architectures ...

Focus on Europe

Europe and Brazil join forces to improve the efficiency of the energy sector with HPC ...

Upcoming e-IRG Workshop will focus on the Progress of the e-Infrastructure Commons ...

CCRT boosts industrial innovation with a petascale supercomputer from Bull ...

Middleware

Ellexus launches Mistral, a tool to load balance shared storage ...

Frame unveils powerful new visual supercomputer in the Cloud ...

Hardware

Poznan Supercomputing & Networking Center and Huawei launch HPC cluster, liquid cooled by CoolIT Systems ...

DDN WOS object storage software wins 2016 Storage Visions Award for secure, collaborative Cloud storage ...

Auburn University unveils $1 million supercomputer and initiates new technology acquisition plan ...

AMD reveals world's first hardware-virtualized GPU product line ...

High Performance Computing (HPC) market worth 36.62 billion USD by 2020 ...

Mellanox promotes Amit Katz to Vice President of Ethernet Switch Sales ...

Applications

Office of Naval Research-sponsored technology simulates how legs bleed ...

Anton 2 supercomputer at PSC will increase speed and size of molecular simulations ...

NCSA awarded grant to examine effective practices in industrial HPC ...

A stellar collaboration: Supercomputing and NASA's IRIS Observatory ...

Carnegie Mellon joins IARPA project to reverse-engineer brain algorithms ...

Bionik Laboratories to utilize machine learning and analytics to improve neurological rehabilitation ...

IBM Watson Ecosystem opens for business in India ...

American Heart Association, IBM Watson Health and Welltok team up to transform heart health ...

The Cloud

Nimbix addresses enterprise call for high-performance computing with executive sales hire ...

IBM launches Cloud data and analytics marketplace for developers ...

IBM closes deal to acquire The Weather Company's product and technology businesses ...

Reaching towards fully-integrated 3D tomography using a new generation of algorithms, high-performance scanning equipment and GPU clusters


15 Dec 2015 Amsterdam - During the SURFsara Super D Event, recently held at Felix Meritis in Amsterdam, The Netherlands, Primeur Magazine had the opportunity to talk with Joost Batenburg from the Centrum Wiskunde & Informatica - CWI - about 3D tomography. Joost Batenburg is a researcher at CWI in Amsterdam, the Netherlands' national research institute for mathematics and computer science, and a part-time professor at the Mathematical Institute of the University of Leiden. His research domain is tomography, a form of 3D image reconstruction from projections. The best-known example of tomography is the medical CT scanner, where X-ray images are recorded from a large number of angles all around the patient. Tomography takes this series of X-ray images as input and computes a 3D image of what the patient looks like on the inside. The technique is not limited to medical imaging: you can also use it to look inside nanostructures imaged with an electron microscope, for non-destructive testing of a variety of industrial objects, and for a broad range of other applications.

All that scanning instruments can observe directly are projection images: each image shows the object from just one angle and therefore gives only limited information about its 3D structure. To really see the object in 3D, you need to compute, from all these images taken from all the angles, what the object looks like. How to do this has been known since the 1960s and 1970s, when people first started developing CT scanners. The classical approach uses very basic algorithms based on the so-called Radon inversion formula. These are computationally efficient - which is why they could already be used at that time - but their main drawback is that they need a lot of information: you have to take images over the full range of angles, and they have to be of very high quality. That can be a big problem, for example because of dose limitations - X-rays are harmful and you don't want to use too much radiation.
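
The classical methods referred to here are typically implemented as filtered back-projection: ramp-filter each projection, then smear it back across the image. As a rough, self-contained illustration, here is a minimal parallel-beam sketch in Python/NumPy; the function and variable names are illustrative choices, not code from CWI.

    import numpy as np

    def filtered_back_projection(sinogram, angles):
        """sinogram: (n_angles, n_det) array of projections; angles in radians."""
        n_angles, n_det = sinogram.shape

        # 1. Ramp-filter each projection in the Fourier domain
        #    (the filtering step implied by the Radon inversion formula).
        ramp = np.abs(np.fft.fftfreq(n_det))
        filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))

        # 2. Back-project: smear each filtered projection across the image.
        recon = np.zeros((n_det, n_det))
        centre = n_det // 2
        ys, xs = np.mgrid[:n_det, :n_det] - centre
        for theta, proj in zip(angles, filtered):
            # Detector coordinate seen by every pixel at this viewing angle.
            t = xs * np.cos(theta) + ys * np.sin(theta) + centre
            idx = np.clip(np.round(t).astype(int), 0, n_det - 1)
            recon += proj[idx]
        return recon * np.pi / (2 * n_angles)

The cost per angle is one FFT plus one pass over the image, which is why such methods already ran on the hardware of the 1970s - provided enough high-quality projections from the full angular range are available.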

If you start taking fewer and fewer images to reduce the radiation, you no longer have sufficient information to compute an accurate 3D image purely from the data. The way to deal with this - and this is what researchers are doing at the moment - is to incorporate additional knowledge about the object you are imaging. In a medical case, for example, you know a patient is built up out of soft tissue, bone and a few other densities, but you also know that it does not have aluminium in it, for instance. By building this prior knowledge into the algorithm that computes the images, you can get high-quality images from just a few projections, from few measurements - but at the expense of a lot of computation time, because these new algorithms are far more computationally intensive than the classical ones.
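
One common way to build in such prior knowledge - a minimal sketch of the general idea, not the specific algorithms used at CWI - is to interleave an algebraic update step (here SIRT-style) with steps that enforce the assumptions, such as non-negativity and a gentle pull toward a few known material densities:

    import numpy as np

    def sirt_with_prior(A, b, n_iter=100, grey_levels=(0.0, 1.0)):
        """A: (n_rays, n_voxels) projection matrix; b: measured ray sums."""
        # SIRT normalisation factors: inverse row and column sums of A.
        row_sum = A.sum(axis=1)
        row_sum[row_sum == 0] = 1
        col_sum = A.sum(axis=0)
        col_sum[col_sum == 0] = 1
        x = np.zeros(A.shape[1])
        levels = np.asarray(grey_levels)
        for _ in range(n_iter):
            # Algebraic update: move x toward agreement with the measured data.
            x += (A.T @ ((b - A @ x) / row_sum)) / col_sum
            # Prior knowledge 1: densities cannot be negative.
            x = np.maximum(x, 0.0)
            # Prior knowledge 2 (assumed here): the object consists of a few
            # known materials, so pull each voxel softly toward the nearest one.
            nearest = levels[np.abs(x[:, None] - levels[None, :]).argmin(axis=1)]
            x = 0.9 * x + 0.1 * nearest
        return x

Each iteration costs a full forward and back projection (the products with A and A.T), and many iterations are needed - which is exactly the trade-off described above: fewer measurements, far more computation.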

At the moment, Joost Batenburg and his colleagues are making heavy use of GPU computing, that is, computing on the graphics processor. For medium-scale volumes of 500 x 500 x 500 voxels this is still sufficient: with a workstation holding one or two powerful GPUs you can compute these images in a matter of minutes. For very large datasets, for example those coming out of big scanning institutes, the researchers have to resort to cluster computing with many GPUs in order to do the computation in a reasonable time.
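
A back-of-the-envelope calculation - with assumed scan parameters, not figures from the interview - suggests why such volumes fit a GPU workstation while larger scans need a cluster:

    n = 500                       # voxels per side: the medium-scale case
    voxels = n ** 3               # ~1.25e8 voxels
    volume_gb = voxels * 4 / 1e9  # float32 volume: ~0.5 GB, fits in GPU memory

    n_angles = 1000               # assumed number of projection angles
    n_det = n * n                 # assumed detector pixels per projection
    ray_voxel_ops = n_angles * n_det * n    # one forward-projection sweep
    n_iter = 100                            # assumed iteration count
    total_ops = 2 * ray_voxel_ops * n_iter  # forward + back projection per iteration

    print(f"{volume_gb:.1f} GB volume, ~{total_ops:.1e} ray-voxel operations")
    # ~2.5e13 operations: minutes of work for one or two GPUs. At 2048 voxels
    # per side the volume alone is ~34 GB and the work grows roughly as n^4,
    # beyond a single GPU's memory and time budget - hence GPU clusters.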

The model the researchers are using at the moment is a form of post-processing: they first acquire the images and do the calculation afterwards. Joost Batenburg's vision for the future is that tomographic 3D scanning becomes an interactive science: while the object is being scanned, the 3D image is computed, visualized and analyzed immediately, so that the user can in fact interact with the scanning process. This is particularly important for in situ scanning applications, such as experiments on foam formation or bubble tracking inside the tomographic scanner: you have an evolving object and you want to constantly keep track of what is happening inside the scanner in full 3D.
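
Conceptually, this replaces "scan first, reconstruct afterwards" with an update applied as each projection arrives. A minimal sketch of that idea, using a Kaczmarz/ART-style row update - an illustration of the concept, not the new algorithms under development at CWI:

    import numpy as np

    def on_new_projection(x, a_rows, b_rows, relax=0.5):
        """Fold one freshly measured projection into the running volume x.

        a_rows: (n_rays, n_voxels) rows of the projection matrix for this angle;
        b_rows: the corresponding measured ray sums.
        """
        for a, b in zip(a_rows, b_rows):
            norm = a @ a
            if norm > 0:
                # Nudge x toward consistency with this single ray measurement.
                x += relax * (b - a @ x) / norm * a
        return x  # ready to be re-visualized immediately

In a real in situ experiment, each call would be triggered by the scanner's data stream and the updated volume pushed straight to the visualization - which is exactly why the scanner and the compute cluster need the tight coupling described below.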

To make this possible, it is first of all necessary to develop a new generation of algorithms, because the current algorithms are not sufficient. Secondly, researchers have to stop thinking of the scanner as an instrument separate from the computing. The two need to be integrated: the scanner on one side, a high performance cluster - perhaps consisting of a hundred nodes with lots of GPUs - on the other, fully connected by a high-speed link and all available in one facility.

At the moment, Joost Batenburg and his colleagues have a pretty good grip on the algorithms needed for this. He expects that he and his colleagues will need at least another five years of research to turn this into a practical proof of concept.

Ad Emmen
