Primeur weekly 2013-05-21

Exascale supercomputing

Fujitsu receives supercomputer order from Nagoya University ...

The Cloud

Bright Computing named finalist for the 2013 Red Herring Top 100 North America award ...

Fujitsu launches Fujitsu Cloud Initiative to systematize its Cloud products and services ...

HP speeds delivery of IT application services ...

New IBM Center in Beijing to speed Linux applications on Power Systems ...

University of Chicago launches Bionimbus Protected Data Cloud to analyze cancer data ...

EuroFlash

Research assessment should take data sharing into account - workshop report ...

ACE's latest CoSy compiler development system optimizes for extreme architectures ...

Dassault Systèmes acquires SIMPOE ...

Is it raining in Amsterdam? Use Big Data and HPC to analyse Dutch Twitter feeds ...

PRACE 5th Industrial Executive Seminar: HPC changing Europe's industrial landscape ...

Places2Be project to boost European leadership around FD-SOI - the faster, cooler, simpler chip technology ...

Durham University cluster boosts study of the stars ...

UCB and IBM collaborate to personalize care for epilepsy patients ...

USFlash

Small particles require big computers ...

GM's new enterprise data centre transforms global IT ...

SDSC assists in generating Clean Tech breakthrough ...

NYSERDA awards $1.8 million to Rensselaer Polytechnic Institute to boost efficiency of supercomputing centre ...

TurningPoint/EMW selected as DISA GSM ETI prime contract holder ...

XSEDE13 Conference to devote full day to biosciences ...

D-Wave Two Quantum Computer selected for new Quantum Artificial Intelligence Initiative, system to be installed at NASA's Ames Research Center, and operational in Q3 ...

DOST strengthens R&D capabilities with IBM supercomputer - IBM's Blue Gene to enable improved weather prediction ...

Mines' powerful new supercomputer to focus on energy research ...

NICE: the brain as a model for future supercomputers ...

NJIT computer scientist publishes new algorithm cluster to data mine health records ...

NICE: the brain as a model for future supercomputers

14 May 2013 Albuquerque - The brain's reputation took a big hit in 1997 when an IBM supercomputer defeated world chess champion Garry Kasparov in a match reported around the world. But in the second round, the brain is back. A Sandia National Laboratories-supported workshop in Albuquerque, called NICE for Neuro-Inspired Computational Elements, discussed ways to use the brain's superior ability to send electrical signals along massively parallel channels, with multiple intersections at downstream nodes, to handle rapidly changing, high-volume information.

The hope is that rather than using the limited "if this, then that" logic of conventional computer architectures to absorb steadily increasing yet often incomplete data, cognitive systems will be able - like the brain - to learn, adapt, hypothesize, and then suggest answers.
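
As a loose illustration of that contrast - a minimal sketch in Python, not anything presented at the workshop - the fragment below compares a hard-coded "if this, then that" rule with a tiny perceptron that learns an equivalent rule from labelled examples. The data and thresholds are invented:

    # Minimal sketch (invented example): a fixed rule vs. a rule learned
    # from data, as a crude stand-in for "learn, adapt, suggest answers".
    import random

    def rule_based(x):
        # Conventional logic: the threshold is fixed by the programmer
        # and never improves with experience.
        return 1 if x[0] + x[1] > 1.0 else 0

    def train_perceptron(samples, epochs=20, lr=0.1):
        # The perceptron adjusts its weights from examples instead of
        # having its decision rule written in by hand.
        w, b = [0.0, 0.0], 0.0
        for _ in range(epochs):
            for x, label in samples:
                pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
                err = label - pred
                w[0] += lr * err * x[0]
                w[1] += lr * err * x[1]
                b += lr * err
        return w, b

    random.seed(1)
    data = []
    for _ in range(200):
        x = (random.random(), random.random())
        data.append((x, rule_based(x)))   # labels come from the fixed rule
    w, b = train_perceptron(data)
    print("learned weights:", w, "bias:", b)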

As Julia Phillips, Sandia vice president and chief technology officer, put it in her opening talk: "Neuro-inspired computing is at the intersection of cognitive science and technology, nano devices, microsystems and computer and information sciences. It transcends our traditional approaches." It also happens to reside at the major crossroads of Sandia research areas, she pointed out.

Of course, conventional computer architectures still predominate and Moore's Law isn't dead yet - just "eroding", as Sandia director of computing research Rob Leland told the workshop. But when it becomes impossible to shrink circuits any further - as seems likely within the next 10 years - what's next? And as the von Neumann/Turing architecture of the last 60 years staggers beneath the weight of the uncertainties inherent in working with huge realms of fuzzy data, what then?

Workshop participants proposed using the configuration of the brain as a model. First, isolate the brain tissues that control aspects of behaviour. Then analyse - microscopically and in very small time steps - the shape and behaviour of the neurons sending the signals. Then duplicate that arrangement using conventional hardware and software or, most likely, a new solid-state substrate.
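
The "very small time steps" part of that recipe can be pictured with a standard textbook neuron model. The leaky integrate-and-fire sketch below, with invented constants, is an assumption for illustration, not the workshop's actual method:

    # Hedged sketch: a leaky integrate-and-fire neuron stepped in small
    # time increments. Model and constants are textbook-style assumptions,
    # not values reported at the workshop.
    def simulate_lif(current, dt=0.1, steps=1000, tau=10.0,
                     v_rest=-65.0, v_thresh=-50.0, v_reset=-65.0):
        v = v_rest
        spike_times = []
        for i in range(steps):
            # Membrane potential leaks toward rest and integrates input.
            dv = (-(v - v_rest) + current) / tau
            v += dv * dt
            if v >= v_thresh:               # threshold crossing -> spike
                spike_times.append(i * dt)  # record spike time (ms)
                v = v_reset                 # reset after firing
        return spike_times

    print(simulate_lif(current=20.0)[:5])   # first few spike times (ms)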

"National security challenges - Sandia's main interest - have historically been addressed in the physical domain, which remains vitally important", Rob Leland stated. "But these challenges today have intrinsically a cognitive aspect concerning the behaviour of the individual and group, so just the physical realm isn't going to be sufficient to address these issues. Our aspiration is to deepen our understanding of cognitive science so we can address these problems in the behavioural realms." He listed possible domain intersections that included tissue-based and in-vivo sensors, optical nanosensors for chemical analysis within cells, regulated nano-assembly of circuits, digital antibodies and virus-sized logic chips.

Jim Olds from the Krasnow Institute at George Mason University went further in not only predicting the end of Moore's Law but denying it ever had the importance the computing world assigned it. He presented what he called "the great stagnation argument: that Moore's Law is not like the industrial revolution or electricity" because it produced few jobs and lately, no real economic growth.

"There's been a slowed-down technological revolution, despite our feelings to the contrary", he stated. Because Facebook, "for all its enormous market capitalization", and Google have few employees compared with Ford Motor Co., "it's clear that technology from Moore's Law isn't translated into day-to-day lives. For some reason, we're not seeing opportunities for getting ahead by hard work. It's enabled us to enjoy leisure, and load movies onto iPads, but flying cars haven't come to pass."

On the contrary, he said, real median household income, which had increased dramatically since the beginning of the 20th century, stopped rising in the last 10 years. To solve this problem, so that "researchers are not sitting alone in their silos ... we need a new, brain-inspired industrial revolution", Jim Olds stated.

That might be found in the Obama administration's recently announced project to map the neurons and network functions of the human brain. The $100 million project, which received a mixed reception from neuroscientists, will launch in 2014 and may continue for 10 years.

"This is a transformation from letting a million flowers bloom - from single PIs (principal investigators) to a major strategic investment", Jim Olds stated.

"Brains are highly parallel, can reconfigure themselves dynamically in a few minutes and use molecular signal transduction (to pass messages)", he stated. "In message-passing they use little power and finesse around bottlenecks (that would slow silicon) parallel computing systems."

Apparently, though, the brain's advantage isn't speed. The brain uses wet-ware, Jim Olds said, and is therefore slow compared to the speed of silicon chips, though more complex and therefore more powerful in many other ways.

Slow signal speed didn't faze Christof Koch, chief scientific officer of Allen Institute for Brain Science. "I have a modest proposal", he told the group. "Imagine a 1-kilogram, three-dimensional block of silicon, or stacks of chips, all with 10 kilohertz clocks and each consuming microwatts of power. There's much more silicon, and therefore it's very expensive and heavy, like the brain. But, much less cost for heat sinks, much less air conditioning."
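
Christof Koch gave only orders of magnitude, so the back-of-envelope sketch below fills in assumed numbers - a chip size and a 10-microwatt figure per chip - purely for illustration:

    # Back-of-envelope sketch of the 1-kg silicon block. Chip volume and
    # per-chip power are assumptions; Koch only said "microwatts".
    silicon_density_g_cm3 = 2.33                          # crystalline silicon
    block_volume_cm3 = 1000.0 / silicon_density_g_cm3     # the 1-kg block

    chip_volume_cm3 = 0.05          # assume a 1 cm^2 die, 0.5 mm thick
    n_chips = int(block_volume_cm3 / chip_volume_cm3)

    power_per_chip_w = 10e-6        # assumed 10 microwatts per chip
    total_power_w = n_chips * power_per_chip_w

    print(f"~{n_chips} chips, ~{total_power_w:.2f} W total")
    print("for comparison, the brain runs on roughly 20 W")

Even with generous assumptions, the block's total power lands far below a conventional machine's budget, which is the point about heat sinks and air conditioning.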

The Allen Institute, he said, was founded in 2003 to support basic research in the brain sciences with a staff of 210, including 50 Ph.D.s.

"There are a thousand different cell types in the brain", Christof Koch stated. "Every time we look at the brain, we see more and more complexities, like astronomers looking at the universe every ten years."

The problems include science's inability to simultaneously record more than 0.0001 percent of firing neurons, and, before the Obama proposal, "no central unifying projects. There are 10,000 labs with different questions, methods, protocols and standards, heading off exuberantly in all directions. Universities are not set up for large-scale systematic efforts."
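
For scale - assuming the usual estimate of roughly 86 billion neurons in a human brain, a figure the article itself does not give - that tiny fraction still amounts to a sizeable absolute number:

    # Scale check; the 86-billion neuron count is an assumed human-brain
    # estimate, not a number from the article.
    total_neurons = 86e9
    fraction = 0.0001 / 100          # 0.0001 percent
    print(int(total_neurons * fraction), "neurons recordable at once")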

Jacob Vogelstein, a programme manager at Johns Hopkins' Applied Physics Laboratory, spoke about moving ideas into practical engineering. He described taking slices of mouse brain 2 to 3 millimeters on a side and 49 nanometers thick. "Line them up on top of each other and extract the (neuronal) network", he stated. Inputs and outputs can be simulated with Monte Carlo techniques that allow for randomness.
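
The article does not say how those Monte Carlo simulations were set up. The toy sketch below simply draws random input patterns, pushes them through an invented sparse network, and tallies the outputs, to convey the flavour of the approach:

    # Toy sketch of Monte Carlo input simulation: drive a small random
    # network with random inputs many times and tally output activity.
    # The network, sizes, and probabilities are invented for illustration.
    import random

    random.seed(42)
    n_neurons = 50
    # Random sparse connectivity standing in for an extracted network.
    weights = {(i, j): random.gauss(0, 1)
               for i in range(n_neurons) for j in range(n_neurons)
               if random.random() < 0.1}

    def trial(threshold=1.0):
        # Random binary input pattern: one Monte Carlo draw.
        state = [1 if random.random() < 0.2 else 0 for _ in range(n_neurons)]
        out = []
        for j in range(n_neurons):
            total = sum(weights.get((i, j), 0.0) * state[i]
                        for i in range(n_neurons))
            out.append(1 if total > threshold else 0)
        return sum(out)

    trials = [trial() for _ in range(1000)]
    print("mean active outputs per trial:", sum(trials) / len(trials))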

Again, the difficulties were not minimized. "In a tiny (brain) region, there are 25,000,000 synapses and cell bodies working through dendrites and axons", Jacob Vogelstein said of the difficulty of creating a copy that might serve as a computing template.

Of course, there is always the question of whether the brain provides the right model, cautioned Mike Vahle, Sandia's chief information officer. "Computer problems are taking on characteristics that the brain seems particularly well-suited to handle", he stated. "But is pattern-matching the right paradigm? Is the technology attainable, and are the ethical and cultural issues understood? Can we avoid the pitfalls that plague modern computers and networks: viruses, worms, hacking and computer security problems in general?"

Murat Okandan, who proposed and helped organize the workshop for Sandia, suggested the brain did indeed show the path for dealing with large, incomplete, noisy data sets. "First we'll work with conventional CMOS devices and tools, with simulations of conventional systems and architectures, and we'll cross-pollinate. The ultimate goal would be to learn from the motifs we see in neural computation and instantiate that capability in a massively interconnected, self-reconfigurable substrate that natively does the computation. The question will always be, how much fidelity do you need to get the functionality you want?"
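
One way to make that fidelity question concrete - a toy experiment, not Murat Okandan's own formulation - is to step the same simple neuron model at different time resolutions and watch the behaviour shift:

    # Toy illustration of fidelity vs. functionality (not from the talk):
    # the same leaky integrate-and-fire neuron stepped coarsely and finely.
    def count_spikes(dt, t_total=100.0, current=20.0, tau=10.0,
                     v_rest=-65.0, v_thresh=-50.0, v_reset=-65.0):
        v = v_rest
        spikes = 0
        for _ in range(int(t_total / dt)):
            v += ((-(v - v_rest) + current) / tau) * dt
            if v >= v_thresh:
                spikes += 1
                v = v_reset
        return spikes

    for dt in (5.0, 1.0, 0.05):      # coarse to fine time steps (ms)
        print(f"dt={dt} ms -> {count_spikes(dt)} spikes in 100 ms")

The coarse step distorts the spike count while the finer steps agree: the fidelity-versus-cost trade-off in miniature.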

"It's national reinvention: Time to lead again", stated Jim Olds. He prophesized that the brain's secrets, morphed into new computers, would "enhance the range of productivity to include retirement years; increase levels of safety and security so that normal decline of physical and mental abilities are lessened; improve method of wealth development leveraging Moore's law. And help develop enhanced modelling of societies to keep life meaningful."

"To do that, we need to prime the pipeline with the right kind of folks: a transdisciplinary scientist that enhances 'team science' approaches", he stated.
Source: Sandia National Laboratories
