Today's global enterprises are awash with data, generated by a variety of sources: Internet traffic, sensors, credit card activity, social media, video monitors, and more. Data is arriving at a volume, velocity, and variety far greater than ever before. Utilizing analytics to unlock the value lying within Big Data has far-reaching benefits, from competitive advantage and top-line growth to saving lives. But it also presents new challenges: how to derive value at greater speed, scale, and efficiency.
"SGI has been enabling customers to achieve extraordinary breakthroughs utilizing High Performance Computing, as well as efficiently manage some of the largest data environments in the world, for nearly two decades", stated Jorge Titinger, president and CEO of SGI. "Today's announcement reflects our rich heritage in HPC and high volume storage, and our ability to significantly help enterprises accelerate time to value, achieve petabyte scale, and lower the rising cost of Big Data."
SGI's expertise in building the world's fastest supercomputers is proving invaluable in addressing these challenges. Supercomputing has historically been applied to complex, computationally intensive problems ranging from scientific discovery to physical simulations to government security. Now enterprises are applying High Performance Computing (HPC) to the rapidly emerging field of Big Data analytics and, when relationships within data sets are generally understood, turning to the open-source computing framework Apache Hadoop.
Continuing its market-leading innovation, SGI was at the forefront of Hadoop adoption and now has cluster installations reaching tens of thousands of nodes. Building on this heritage, SGI is announcing a new factory-integrated platform optimized for Hadoop that performs Big Data analytics with faster, deeper insight and reduces time to value from months to days.
"Big Data workloads require large amounts of compute power, usually deployed as large groups of scale-out Linux servers, in order to run analytics software such as Hadoop. However, the capabilities and IT skill-sets required to optimize these Hadoop analytics clusters are not present in many IT organisations", stated Jean S. Bozman, research vice president, IDC Enterprise Platforms group. "By pre-configuring Big Data analytics systems, SGI is addressing this important market need - and it is providing a readily deployed Big Data solution that will speed time-to-results from business analytics."
SGI is also building on its expertise and innovation in HPC petascale environments to address a fundamental and increasingly difficult Big Data challenge: storing data volumes at massive scale, within tight budgets, while meeting demanding user access requirements. SGI ranks among the world's top 10 storage vendors, shipping over 600PB annually. SGI's announcements also encompass innovative solutions that deliver the extreme storage capacity and scale needed for Big Data, and enable enterprises to significantly lower high-volume storage costs.
"Gaining insight from data represents both an enormous opportunity and challenge for enterprises, academia, industry and government", stated Rajeeb Hazra, vice president and general manager of Technical Computing Group at Intel. "SGI's Big Data solutions are an example how the latest technology enables users to both manage their data and quickly scale their applications to achieve insights that ultimately result in a significant competitive advantage."
As data grows in size and complexity, and the quest for greater and more precise business insight expands, leveraging technology developed for extreme environments is a natural step for the enterprise market. SGI continues to make this possible.