
Primeur weekly 2014-02-24

Special

H2020: the long road to an integrated open and accessible European e-Infrastructure ...

PRACE, HPC applications and technological development: three ingredients for a top European strategy ...

Yannick Legré is the new director of EGI.eu ...

The Cloud

Red Hat Enterprise Linux OpenStack Platform leveraged by Alcatel-Lucent, CloudBand as part of its Network Functions Virtualization (NFV) Platform ...

AT&T and IBM join forces to deliver new innovations for the Internet of Things ...

Mellanox introduces CloudX Platform to enable companies to build the most efficient public, private and hybrid Clouds ...

EuroFlash

Powerful supercomputer to offer a glimpse of the early universe ...

From a distance: New technique for repair work ...

ECRIN-ERIC to host inauguration ceremony ...

Karlsruhe Institute of Technology to develop ultra-small and ultra-fast electro-optic modulator ...

SURFsara to host Data & Computing Infrastructure Event on 12-13 March 2014 ...

USFlash

SDSC team develops multi-scale simulation software for chemistry research ...

SDSC/UC San Diego researchers hone in on Alzheimer's disease ...

Intel advances next phase of Big Data intelligence: real time analytics ...

Supercomputer dramatically accelerates rapid genome analysis ...

Using computers to speed up drug discovery ...

Better cache management could improve chip performance and cut energy use ...

A step closer to a photonic future ...

HP delivers record-breaking performance and dramatic efficiencies with HP ProLiant servers ...

Researchers propose a better way to make sense of 'Big Data' ...

Mega-bucks from Russia seed development of 'Big Data' tools ...

A new laser for a faster Internet ...

C-DAC to organize Accelerating Biology 2014: Computing Life ...

NetApp introduces unified scale-out storage systems and virtualization software for the unbound Cloud era ...

Supermicro shipping 96 DIMM 4U 4-Way SuperServer featuring new Intel Xeon processor E7-8800/4800 v2 ...

Researchers propose a better way to make sense of 'Big Data'

Image credit: extradeda/Shutterstock
18 Feb 2014 Cold Spring Harbor - Big Data is everywhere, and we are constantly told that it holds the answers to almost any problem we want to solve. Companies collect information on how we shop, doctors and insurance companies gather our medical test results, and governments compile logs of our phone calls and e-mails. In each instance, the hope is that critical insights are hidden deep within massive amounts of information, just waiting to be discovered.

But simply having lots of data is not the same as understanding it. Increasingly, new mathematical tools are needed to extract meaning from enormous data sets. In work published on-line, two researchers at Cold Spring Harbor Laboratory (CSHL) now challenge the most recent advances in this field, using a classic mathematical concept to tackle the outstanding problems in Big Data analysis.

What does it mean to analyze Big Data? A major goal is to find patterns between seemingly unrelated quantities, such as income and cancer rates. Many of the most common statistical tools are only able to detect patterns if the researcher has some expectation about the relationship between the quantities. Part of the lure of Big Data is that it may reveal entirely new, unexpected patterns. Therefore, scientists and researchers have worked to develop statistical methods that will uncover these novel relationships.

In 2011, a distinguished group of researchers from Harvard University published a highly influential paper in the journal Science that advanced just such a tool. But in a paper published in Proceedings of the National Academy of Sciences, CSHL Quantitative Biology Fellow Justin Kinney and CSHL Assistant Professor Gurinder "Mickey" Atwal demonstrate that this new tool is critically flawed. "Their statistical tool does not have the mathematical properties that were claimed", stated Justin Kinney.

Justin Kinney and Gurinder Atwal show that the correct tool was hiding in plain sight all along. The solution, they say, is a well known mathematical measure called "mutual information", first described in 1948. It was initially used to quantify the amount of information that could be transmitted electronically through a telephone cable; the concept now underlies the design of the world's telecommunications infrastructure. "What we've found in our work is that this same concept can also be used to find patterns in data", Justin Kinney explained.
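For reference, the quantity Kinney and Atwal advocate has a standard textbook definition, though the formula itself does not appear in the press release: for two discrete variables X and Y with joint distribution p(x, y),

```latex
I(X;Y) \;=\; \sum_{x}\sum_{y} p(x,y)\,\log_2\!\frac{p(x,y)}{p(x)\,p(y)}
```

It equals zero exactly when X and Y are statistically independent, and is positive otherwise, which is why it can flag a dependence of any shape rather than only the linear patterns that correlation measures capture.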

Applied to Big Data, mutual information is able to reveal patterns in large lists of numbers. For instance, it can be used to analyze patterns in data sets on the numerous bacterial species that help us digest food. "This particular tool is perfect for finding patterns in studies of the human microbiome, among many other things", Justin Kinney stated.

Importantly, mutual information provides a way of identifying all types of patterns within the data without reliance upon any prior assumptions. "Our work shows that mutual information very naturally solves this critical problem in statistics", Justin Kinney stated. "This beautiful mathematical concept has the potential to greatly benefit modern data analysis, in biology and many other important fields."
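A minimal sketch can make this concrete. The code below is illustrative only, not the estimator from the PNAS paper: it computes a simple plug-in histogram estimate of mutual information and shows that it detects a purely nonlinear dependence (y = x²) that the Pearson correlation coefficient misses entirely. All function names and the choice of 10 bins are this sketch's own assumptions.

```python
import math
import random

def mutual_information(xs, ys, bins=10):
    """Plug-in estimate of mutual information I(X;Y) in bits,
    computed from a 2-D histogram of the paired samples."""
    def bin_index(v, lo, hi):
        # Map v into one of `bins` equal-width bins on [lo, hi].
        if hi == lo:
            return 0
        i = int((v - lo) / (hi - lo) * bins)
        return min(i, bins - 1)

    n = len(xs)
    xlo, xhi = min(xs), max(xs)
    ylo, yhi = min(ys), max(ys)
    joint = {}               # counts for each (x-bin, y-bin) pair
    px = [0] * bins          # marginal counts for X
    py = [0] * bins          # marginal counts for Y
    for x, y in zip(xs, ys):
        i = bin_index(x, xlo, xhi)
        j = bin_index(y, ylo, yhi)
        joint[(i, j)] = joint.get((i, j), 0) + 1
        px[i] += 1
        py[j] += 1
    mi = 0.0
    for (i, j), c in joint.items():
        p_xy = c / n
        mi += p_xy * math.log2(p_xy / ((px[i] / n) * (py[j] / n)))
    return mi

def pearson(xs, ys):
    """Ordinary Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

random.seed(0)
xs = [random.uniform(-1, 1) for _ in range(5000)]
ys = [x * x for x in xs]  # a purely nonlinear, deterministic dependence

print(abs(pearson(xs, ys)))        # near zero: correlation misses the pattern
print(mutual_information(xs, ys))  # clearly positive: MI detects it
```

Note that the naive plug-in estimate used here is biased upward for small samples; the practical subtleties of estimating mutual information from finite data are part of what the Kinney and Atwal paper addresses.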

The research described was supported by the Simons Center for Quantitative Biology at Cold Spring Harbor Laboratory.

"Equitability, mutual information, and the maximal information coefficient" appeared on-line inPNASon February 17, 2014. The authors are Justin Block Kinney and Gurinder Singh Atwal. The paper can be obtained on-line at http://www.pnas.org/content/early/2014/02/14/1309933111.abstract
Source: Cold Spring Harbor Laboratory
