
Primeur weekly 2017-02-20

Focus

HPC expert Genias Benelux to show its skillful expertise in brand-new website ...

Are billion Euro Flagships the right way to finance innovative areas like graphene, human brain research and quantum computing? ...

Exascale supercomputing

Advanced fusion code led by PPPL selected to participate in Early Science Programmes on three new DOE Office of Science pre-exascale supercomputers ...

Focus on Europe

From robotics to particle physics: Data analytics gets the spotlight in Distinguished Talk series at ISC 2017 ...

A new spin on electronics ...

Data mining tools for personalized cancer treatment ...

Why host HPC in Iceland to tackle Big Data for life sciences at Earlham Institute ...

Biological experiments become transparent - anywhere, any time ...

Middleware

IBM delivers new platform to help clients address storage challenges at massive scale ...

Hewlett Packard Enterprise unveils most significant 3PAR Flash storage innovations to date ...

Hardware

Tokyo Institute of Technology partners with DDN on Tsubame3.0 to build forward-looking AI and Big Data computing infrastructure ...

Mellanox demonstrates four times improvement in crypto performance with Innova IPsec 40G Ethernet network adapter ...

Supermicro launches BigTwin - the industry's highest performing Twin multi-node system supporting the full range of CPUs, maximum memory and all-flash NVMe ...

Applications

Researchers catch extreme waves with higher-resolution modelling ...

Researchers are creating software to 'clean' large datasets, making it easier for scientists and the public to use Big Data ...

Designing new materials from 'small' data ...

Success by deception ...

DNA computer brings 'intelligent drugs' a step closer ...

'Lossless' metamaterial could boost efficiency of lasers and other light-based devices ...

Perimeter Institute researchers apply machine learning to condensed matter physics ...

When treating brain aneurysms, two isn't always better than one ...

Real-time MRI analysis powered by supercomputers ...

Analyzing data for transportation systems using TACC's Rustler, XSEDE ECSS support ...

NCSA facilitates performance comparisons with China's nr. 1 supercomputer ...

IBM delivers Watson for cyber security to power cognitive security operations centres ...

The Cloud

Optimizing data centre placement and network design to strengthen Cloud computing ...

Dutch start-up solution impacts data centres ...

OpenFog Consortium releases landmark reference architecture for Fog computing ...

IBM brings machine learning to the private Cloud ...

IBM accelerates hybrid Cloud adoption by enabling channel partners to offer VMware solutions ...

Oracle launches Cloud service to help organisations integrate disparate data and drive real-time analytics ...

Optimizing data centre placement and network design to strengthen Cloud computing

14 Feb 2017 Los Angeles - Telecommunication experts estimate that the amount of data stored 'in the Cloud', or in remote data centres around the world, will quintuple in the next five years. Whether it is streaming video or business database content drawn from distant servers, all of this data is - and for the foreseeable future will continue to be - accessed and transmitted by lasers sending pulses of light along long bundles of flexible optical fibers.

Traditionally, the rate at which information is transmitted does not take into account the distance the data must travel, even though shorter distances can support higher rates. Yet as traffic grows in volume and consumes more of the available bandwidth - the capacity to transfer bits of data - researchers have become increasingly aware of the limitations of this mode of transmission.

New research from Nokia Bell Labs in Murray Hill, New Jersey, may offer a way to capitalize on this notion and improve data transfer rates for Cloud computing traffic. The results of this work will be presented at the Optical Fiber Communications Conference and Exhibition (OFC), held 19-23 March in Los Angeles, California, USA.

"The challenge for legacy systems that rely on fixed-rate transmission is that they lack flexibility", stated Dr. Kyle Guan, a research scientist at Nokia Bell Labs. "At shorter distances, it is possible to transmit data at much higher rates, but fixed-rate systems lack the capability to take advantage of that opportunity."

Dr. Guan worked with a recently developed transmission technology called "distance-adaptive transmission", in which the equipment that receives and transmits these light signals can change the rate of transmission depending on how far the data must travel. With this, he set about building a mathematical model to determine the optimal layout of network infrastructure for data transfer.
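To make the idea concrete, the short Python sketch below (not taken from Dr. Guan's work) maps a path length to the highest per-wavelength data rate whose optical reach still covers it, and counts the wavelengths a given demand would need. The reach and rate tiers are illustrative assumptions only.

    import math
    from typing import Optional

    # Illustrative (reach in km, rate in Gb/s) tiers: shorter paths can carry
    # higher rates per wavelength. These figures are assumptions for the
    # sketch, not values from the article.
    RATE_TIERS = [
        (600, 200),    # short links: high-order modulation, 200 Gb/s
        (1500, 150),   # medium links: 150 Gb/s
        (4000, 100),   # long-haul links: 100 Gb/s
    ]

    def adaptive_rate(distance_km: float) -> int:
        """Highest per-wavelength rate whose reach covers the path."""
        for reach_km, rate_gbps in RATE_TIERS:
            if distance_km <= reach_km:
                return rate_gbps
        raise ValueError("path exceeds maximum reach; regeneration needed")

    def wavelengths_needed(demand_gbps: float, distance_km: float,
                           fixed_rate_gbps: Optional[int] = None) -> int:
        """Wavelengths required for a demand, adaptively or at a fixed rate."""
        rate = fixed_rate_gbps or adaptive_rate(distance_km)
        return math.ceil(demand_gbps / rate)

    # A 400 Gb/s demand over 500 km: 2 wavelengths adaptively,
    # versus 4 at a fixed long-haul rate of 100 Gb/s.
    print(wavelengths_needed(400, 500))                       # -> 2
    print(wavelengths_needed(400, 500, fixed_rate_gbps=100))  # -> 4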

"The question that I wanted to answer was how to design a network that would allow for the most efficient flow of data traffic", stated Dr. Guan. "Specifically, in a continent-wide system, what would be the most effective [set of] locations for data centres and how should bandwidth be apportioned? It quickly became apparent that my model would have to reflect not just the flow of traffic between data centers and end users, but also the flow of traffic between data centres."

External industry research suggests that this second type of traffic, between the data centres, represents about one-third of total Cloud traffic. It includes activities such as data back-up and load balancing, whereby tasks are spread across multiple servers to maximize application performance.

After accounting for these factors, Dr. Guan ran simulations with his model to determine how data traffic would flow most effectively through a network.
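The article does not describe the model itself, but a toy version of the kind of placement study it implies might look like the following sketch: pick data-centre sites from a set of candidate nodes so that the wavelengths needed to carry both user-to-centre traffic and inter-centre traffic (taken here as one-third of the total, per the industry figure above) are minimized. The node list, coordinates, demands and rate tiers are all illustrative assumptions.

    import math
    from itertools import combinations

    # Candidate nodes: (x, y) coordinates in km on a continent-scale grid,
    # plus aggregate end-user demand in Gb/s. All values are made up.
    NODES = {
        "A": ((0, 0), 300),
        "B": ((900, 100), 500),
        "C": ((1800, 0), 400),
        "D": ((900, 1200), 200),
    }

    def dist(u, v):
        (x1, y1), (x2, y2) = NODES[u][0], NODES[v][0]
        return math.hypot(x1 - x2, y1 - y2)

    def wavelengths(demand_gbps, distance_km):
        # Distance-adaptive per-wavelength rate (same illustrative tiers as above).
        rate = 200 if distance_km <= 600 else 150 if distance_km <= 1500 else 100
        return math.ceil(demand_gbps / rate)

    def total_wavelengths(centres):
        total = 0
        # Traffic between end users and their nearest data centre.
        for node, (_, demand) in NODES.items():
            if node in centres:
                continue  # local demand needs no long-haul wavelengths
            nearest = min(centres, key=lambda c: dist(node, c))
            total += wavelengths(demand, dist(node, nearest))
        # Inter-centre traffic (back-up, load balancing): one-third of all
        # demand, spread evenly over the centre pairs.
        inter = sum(d for _, d in NODES.values()) / 3
        pairs = list(combinations(centres, 2))
        for a, b in pairs:
            total += wavelengths(inter / len(pairs), dist(a, b))
        return total

    # Brute-force search over every pair of candidate sites.
    best = min(combinations(NODES, 2), key=total_wavelengths)
    print(best, total_wavelengths(best))

A real study would optimize over a continental fibre topology rather than brute-forcing four points, but the trade-off explored is the same one described above: placement and bandwidth apportionment against the wavelength resources consumed.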

"My preliminary results showed that in a continental-scale network with optimized data centre placement and bandwidth allocation, distance-adaptive transmission can use 50 percent less wavelength resources or light transmission, and reception equipment, compared to fixed-rate rate transmission", stated Dr. Guan. "On a functional level, this could allow cloud service providers to significantly increase the volume of traffic supported on the existing fiber-optic network with the same wavelength resources."

Dr. Guan recognizes other important issues related to data centre placement. "Other important factors that have to be considered include the proximity of data centres to renewable sources of energy that can power them, and latency - the interval of time between when an end user or data centre initiates an action and when they receive a response", he stated.

Dr. Guan's future research will involve integrating these types of factors into his model so that he can run simulations that even more closely mirror the complexity of real-world conditions.

Source: The Optical Society
