
Primeur weekly 2016-12-12

Crowd computing

New TN-Grid platform is hosting gene@home ...

Quantum computing

Further improvement of qubit lifetime for quantum computers ...

Focus on Europe

Cray works with Microsoft and CSCS to reach new performance milestone for deep learning at scale ...

PHENOMEN project to lay the foundations for a new age of information processing ...

Research into the theoretical bases of future wireless communications ...

Mont-Blanc project Event ARM: On the road to HPC ...

Middleware

Allinea webinar targets I/O optimization ...

Hardware

Mellanox's EDR 100Gb/s InfiniBand accelerates the largest National Institutes of Health supercomputer ...

Mellanox announces record-breaking performance enabling stateful packet processing at 400Gb/s with the NPS-400 network processor ...

Atos achieves SAP HANA certification for its bullion server operating up to 16TB of data ...

NVIDIA delivers AI supercomputer to Berkeley ...

New tender for the SURFnet8 service layer published ...

High performance graphene photodetectors set speed record ...

Applications

Scientists take 'blue-action' to help society cope with the impacts of Arctic climate changes ...

Eight new eScience projects to start in 2017 ...

University of Wyoming Faculty supercomputer use deadline is December 23, 2016 ...

Collaborating on Big Data to unravel disease processes ...

Weather the storm: Improving Great Lakes modelling ...

What to do with the data? ...

Big Data approach to water quality applied at shale drilling sites ...

The Cloud

Amazon Web Services Cloud now available to customers from data centres in Canada ...

NVIDIA delivers AI supercomputer to Berkeley


6 Dec 2016 Berkeley - NVIDIA CEO Jen-Hsun Huang earlier this year delivered the NVIDIA DGX-1 AI supercomputer in a box to the University of California, Berkeley's Berkeley AI Research Lab (BAIR). BAIR's more than two dozen faculty members and more than 100 graduate students work at the cutting edge of multi-modal deep learning, human-compatible AI, and connecting AI with other scientific disciplines and the humanities.

"I'm delighted to deliver one of the first ones to you", Jen-Hsun Huang told a group of researchers at BAIR celebrating the arrival of their DGX-1.

The team at BAIR are working on a dazzling array of AI problems across a wide range of fields - and they're eager to experiment with as many different approaches as possible.

To do that, they need speed, explained Pieter Abbeel, an associate professor in UC Berkeley's Department of Electrical Engineering and Computer Sciences.

"More compute power directly translates into more ideas being investigated, tried out, tuned to actually get them to work", Pieter Abbeel stated. "So right now, an experiment might typically maybe take anywhere from a few hours to a couple of days, and so if we can get something like a 10-fold speed-up, that would narrow it down from that time to much shorter times - then we could right away try the next thing."

That speed - and the ability to manage huge quantities of data - is the key to new breakthroughs in deep learning, which, in turn, is key to helping computers navigate the environments that people deal with every day, such as public roads, explained John Canny, the Paul and Stacy Jacobs Distinguished Professor of Engineering in UC Berkeley's Department of Electrical Engineering and Computer Sciences.

"In driving, drivers continue to improve over many years and decades because of the experience that they gain", John Canny stated. "In machine learning, deep learning currently doesn't really manage data sets of that size - so our interest is in collecting, processing and leveraging those very large data sets."

Cars that could learn not just from their own experiences - but from those of millions of other vehicles - promise to dramatically improve safety, explained Trevor Darrell, a professor in UC Berkeley's Department of Electrical Engineering and Computer Sciences.

"But that's just the tip of the iceberg", Trevor Darnell stated. "There will be also revolutions in transportation and logistics, the process of just moving stuff around - if you'd like to get a small package from here to there. If we could have autonomous vehicles of all sorts of sizes moving all of our goods and services around, I can't even speculate the degree of productivity that will give us."

Giving machines the ability to learn from their experience is also the key to helping robots move from factory floors to less predictable environments, such as our homes, offices and hospitals, Pieter Abbeel said.

"It's going to be important these robots can adapt to new situations they've never seen before", Pieter Abbeel stated. "The big challenge here is how to build an artificial intelligence that allows these robots to understand situations they’ve never seen before and still do the right thing."

While deep learning is already part of commonly used web services that help machines categorize information - such as speech and image recognition - Pieter Abbeel and his colleagues are exploring ways to help machines make decisions on their own.

Called "reinforcement learning", this new approach promises to help machines understand and navigate complex environments, Pieter Abbeel explained.

Building machines that can not only learn from their environment but also judge the risks they're taking is key to building smarter robots, explained Sergey Levine, an assistant professor in the Department of Electrical Engineering and Computer Sciences at UC Berkeley.

Flying robots, for example, not only have to adapt to quickly changing environments, but have to be aware of the risks they're taking as they fly. "We use deep learning to build deep neural-network policies for flight that are aware of their own uncertainty so that they don't take actions for which they don't really understand the outcome", stated Sergey Levine.
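
As a rough illustration of that idea, the following Python sketch queries an ensemble of slightly perturbed toy policies and falls back to a safe action whenever they disagree too much - a stand-in for the uncertainty awareness described above. The linear "policies", the noise scale and the disagreement threshold are assumptions for illustration, not the deep-network flight controllers the lab actually uses.

    # An ensemble of perturbed policies stands in for a single network's
    # uncertainty estimate: high disagreement means "I don't know".
    import random
    import statistics

    def make_policy(weight):
        # A toy policy: maps an observation to a steering command.
        return lambda obs: weight * obs

    ensemble = [make_policy(1.0 + random.gauss(0.0, 0.05)) for _ in range(10)]

    def act(obs, max_disagreement=0.1):
        commands = [p(obs) for p in ensemble]
        if statistics.stdev(commands) > max_disagreement:
            return 0.0                    # uncertain: take a safe default action
        return statistics.mean(commands)  # confident: use the ensemble consensus

    for obs in (0.5, 5.0):
        print(obs, act(obs))  # larger observations amplify disagreement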

New approaches such as this promise to help researchers build machines that are, ultimately, more helpful. The speed of the DGX-1's GPUs and integrated software - and the connections between them - will help BAIR explore these new ideas faster than ever.

"There's somewhat of a linear connection between how much compute power one has and how many experiments one can run", Trevor Darnell stated. "And how many experiments one can run determines how much knowledge you can acquire or discover."

Source: NVIDIA
