"Apache Hadoop has the potential to transform business by allowing enterprises to harness very large amounts of data for competitive advantage", stated Jerry Chen, vice president, Cloud and Application Services, VMware. "It represents one dimension of a sweeping change that is taking place in applications, and enterprises are looking for ways to incorporate these new technologies into their portfolios. VMware is working with the Apache Hadoop community to allow enterprise IT to deploy and manage Hadoop easily in their virtual and Cloud environments."
Apache Hadoop is emerging as the de facto standard for big data processing; however, deployment and operational complexity, the need for dedicated hardware, and concerns about security and service-level assurance prevent many enterprises from leveraging the power of Hadoop. By decoupling Apache Hadoop nodes from the underlying physical infrastructure, VMware can bring the benefits of Cloud infrastructure - rapid deployment, high availability, optimal resource utilization, elasticity, and secure multi-tenancy - to Hadoop.
Available for free download under the Apache 2.0 license, Serengeti is a "one-click" deployment toolkit that allows enterprises to leverage the VMware vSphere platform to deploy a highly available Apache Hadoop cluster in minutes, including common Hadoop components like Apache Pig and Apache Hive. By using Serengeti to run Hadoop on VMware vSphere, enterprises can easily leverage the high availability, fault tolerance, and live migration capabilities of the world's most trusted, widely deployed virtualization platform to improve the availability and manageability of Hadoop clusters.
"Hadoop must become friendly with the technologies and practices of enterprise IT if it is to become a first-class citizen within enterprise IT infrastructure. The resource-intensive nature of large Big Data clusters makes virtualization an important piece that Hadoop must accommodate", stated Tony Baer, Principal Analyst at OVUM. "VMware's involvement with the Apache Hadoop project and its new Serengeti Apache project are critical moves that could provide enterprises the flexibility that they will need when it comes to prototyping and deploying Hadoop."
VMware is working with the leading Apache Hadoop distribution vendors, including Cloudera, Greenplum, Hortonworks, IBM, and MapR, to support a wide range of distributions.
To further simplify and speed enterprise use of Apache Hadoop, VMware is working with the Apache Hadoop community to contribute changes to the Hadoop Distributed File System (HDFS) and Hadoop MapReduce projects to make them "virtualization-aware", so that data and compute jobs can be optimally distributed across a virtual infrastructure. These changes will enable enterprises to run Hadoop clusters that are more elastic, secure, and highly available. The extensions can be found at https://issues.apache.org/jira/browse/HADOOP-8468.
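These contributions build on Hadoop's existing topology awareness, which lets operators describe where each node lives so that block replicas and tasks are placed sensibly. As a rough illustration of the mechanism being extended (the property name below is from the Hadoop 1.x configuration of that era; the script path is hypothetical), a cluster administrator can point Hadoop at a script that maps host names to network locations:

```xml
<!-- core-site.xml: tell Hadoop how to resolve a host to a network location.  -->
<!-- The virtualization-aware extensions proposed in HADOOP-8468 add a        -->
<!-- further layer to this hierarchy so that the scheduler can distinguish    -->
<!-- virtual machines that share the same physical host.                      -->
<property>
  <name>topology.script.file.name</name>
  <!-- Hypothetical path; the script prints a location such as /rack1 -->
  <value>/etc/hadoop/conf/topology.sh</value>
</property>
```

With such a hierarchy in place, HDFS can, for example, avoid placing all replicas of a block on virtual machines that reside on a single physical server.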
VMware has also made updates to Spring for Apache Hadoop, an open source project first launched in February 2012 to make it easy for enterprise developers to build distributed processing solutions with Apache Hadoop. These updates allow Spring developers to easily build enterprise applications that integrate with the HBase database, the Cascading library, and Hadoop security. Spring for Apache Hadoop is free to download and available now under the open source Apache 2.0 license.
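By way of illustration, Spring for Apache Hadoop lets developers declare Hadoop jobs as Spring beans through the project's XML namespace. The sketch below assumes that namespace; the NameNode address, input/output paths, and mapper/reducer class names are invented for the example:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:hdp="http://www.springframework.org/schema/hadoop"
       xsi:schemaLocation="
         http://www.springframework.org/schema/beans
         http://www.springframework.org/schema/beans/spring-beans.xsd
         http://www.springframework.org/schema/hadoop
         http://www.springframework.org/schema/hadoop/spring-hadoop.xsd">

  <!-- Cluster connection settings (example NameNode address) -->
  <hdp:configuration>
    fs.default.name=hdfs://namenode:8020
  </hdp:configuration>

  <!-- A MapReduce job declared as a bean; classes and paths are hypothetical -->
  <hdp:job id="wordCountJob"
           input-path="/data/input" output-path="/data/output"
           mapper="com.example.WordMapper" reducer="com.example.WordReducer"/>

  <!-- Run the job when the Spring application context starts -->
  <hdp:job-runner id="runner" job-ref="wordCountJob" run-at-startup="true"/>
</beans>
```

Declaring jobs this way keeps Hadoop wiring in the same configuration model as the rest of a Spring application, rather than in standalone driver code.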
Together, these projects and contributions are designed to help accelerate Hadoop adoption and enable enterprises to leverage big data analytics applications, such as Cetas, to obtain real-time, intelligent insight into large quantities of data. VMware acquired Cetas in April 2012.