"New workloads are driving the need for modern foundational architectures, and the recently launched Weka AI offers a transformative solution framework for Accelerated DataOps," stated Shailesh Manjrekar, head of AI and strategic alliances at Weka. "Our partnership with Valohai and our integration with its Deep Learning Pipeline Management tools expand Weka AI's capabilities to offer Explainable AI (XAI). This is a critical factor for use cases with a social impact, including autonomous driving, health care, and genomics."
Integration with solutions from technology alliance partners such as Valohai enhances the power of Weka AI. Underpinned by the Weka File System (WekaFS), Weka AI now provides a production-ready solution in which the entire AI data pipeline, from data ingestion through batch feature extraction, training, and hyperparameter optimization to inference and versioning, can run on the same storage platform, whether on-premises or on AWS. This is possible because of WekaFS's strong mixed-workload performance and its data management and governance capabilities.
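The single-namespace idea described above can be sketched as a toy pipeline in which every stage reads and writes under one shared storage root. The directory layout, stage functions, and toy "model" below are illustrative stand-ins, not Weka or Valohai APIs:

```python
import json
import os
import tempfile

# Every stage shares one root directory, mimicking a single WekaFS
# namespace serving the whole pipeline. All names here are illustrative.

def ingest(root):
    """Land raw records in the shared namespace."""
    os.makedirs(os.path.join(root, "raw"), exist_ok=True)
    with open(os.path.join(root, "raw", "batch.json"), "w") as f:
        json.dump([0.1, 0.4, 0.35, 0.8], f)

def extract_features(root):
    """Batch feature extraction: read raw, write features, same storage."""
    with open(os.path.join(root, "raw", "batch.json")) as f:
        records = json.load(f)
    os.makedirs(os.path.join(root, "features"), exist_ok=True)
    with open(os.path.join(root, "features", "batch.json"), "w") as f:
        json.dump([[x, x * x] for x in records], f)

def train(root):
    """Toy 'training': the model is just the mean of the first feature."""
    with open(os.path.join(root, "features", "batch.json")) as f:
        feats = json.load(f)
    model = sum(f[0] for f in feats) / len(feats)
    with open(os.path.join(root, "model.json"), "w") as f:
        json.dump({"mean": model}, f)

def infer(root, x):
    """Inference reads the model artifact from the same namespace."""
    with open(os.path.join(root, "model.json")) as f:
        model = json.load(f)
    return x * model["mean"]

def run_pipeline(x):
    with tempfile.TemporaryDirectory() as root:
        ingest(root)
        extract_features(root)
        train(root)
        return infer(root, x)
```

Because every stage targets the same root, no data copies between stage-specific stores are needed; this is the property the article attributes to running the pipeline on one storage platform.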
"Machine Learning (ML) gives businesses a competitive advantage, but while ML is hard, real-world ML is much harder," stated Eero Laaksonen, chief executive officer at Valohai. "A real-world ML system is 95% enabling code, with only 5% actual ML code that creates business value, so the question becomes how to ensure that efforts are focused on that 5%. With Valohai, businesses can focus on data science while it handles everything else; it is as simple as pointing to your code and data and hitting 'run'. The seamless integration between our DLMS solution and Weka's powerful snap-to-object capabilities offers quick, zero-setup infrastructure for DataOps. This helps businesses build models 10x faster, recover 35% of lost cloud costs, and free their DataOps teams with automated ML orchestration, data management, and data mobility. It is a win-win."
Weka AI with Valohai DLMS enables hybrid workflows in which data scientists can use Jupyter Notebook or the Valohai GUI, either on-premises or on AWS, to perform data transformation, model training, hyperparameter optimization, and inference in a Kubernetes-orchestrated environment. Valohai DLMS integrates seamlessly with WekaFS, which leverages i3en Amazon Elastic Compute Cloud (Amazon EC2) instances with Non-Volatile Memory Express (NVMe) flash for the performance tier and extends the file namespace over Amazon Simple Storage Service (Amazon S3) buckets for the capacity tier.
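The two-tier layout described above, a hot NVMe performance tier fronting an S3 capacity tier behind one namespace, can be modeled with a minimal sketch. This is an idea-level illustration only, not how WekaFS actually tiers data:

```python
from collections import OrderedDict

class TieredStore:
    """Toy unified namespace over a small hot tier and a large cold tier.

    `flash` stands in for the NVMe performance tier, `capacity` for the
    S3 capacity tier. Readers see one get/put interface regardless of
    where an object currently lives. Purely illustrative.
    """

    def __init__(self, flash_capacity):
        self.flash_capacity = flash_capacity
        self.flash = OrderedDict()  # hot tier, ordered by recency
        self.capacity = {}          # cold tier

    def put(self, key, value):
        self.flash[key] = value
        self.flash.move_to_end(key)  # mark as most recently used
        while len(self.flash) > self.flash_capacity:
            # Demote the least recently used object to the capacity tier.
            cold_key, cold_val = self.flash.popitem(last=False)
            self.capacity[cold_key] = cold_val

    def get(self, key):
        if key in self.flash:
            self.flash.move_to_end(key)
            return self.flash[key]
        # Promote a cold object back to the flash tier on access.
        value = self.capacity.pop(key)
        self.put(key, value)
        return value
```

The point of the sketch is that callers never address a tier directly; placement is a policy decision hidden behind the namespace, which is what lets the file system extend over S3 without changing the application.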
Valohai DLMS also takes file-namespace snapshots when running experiments and stores them for data versioning. These versions can be rehydrated at any time to reproduce an experiment, providing the required explainability and transparency. Weka AI delivers security and governance through end-to-end encryption of the pipeline and integrates with leading key management solutions such as HashiCorp Vault.
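The snapshot-and-rehydrate workflow described above can be sketched in a few lines: each run records an immutable copy of its inputs under a content-derived version ID, and rehydrating that version reproduces the result. Class and function names are hypothetical; this is not the Valohai or Weka snap-to-object API:

```python
import copy
import hashlib
import json

class SnapshotStore:
    """Toy snapshot store for experiment versioning (illustrative only)."""

    def __init__(self):
        self._snapshots = {}

    def snapshot(self, dataset, params):
        # Derive a stable version ID from the experiment inputs.
        payload = json.dumps({"dataset": dataset, "params": params},
                             sort_keys=True)
        version = hashlib.sha256(payload.encode()).hexdigest()[:12]
        # Store an immutable copy so later mutations cannot leak in.
        self._snapshots[version] = copy.deepcopy(
            {"dataset": dataset, "params": params})
        return version

    def rehydrate(self, version):
        """Return a fresh copy of the inputs exactly as they were run."""
        return copy.deepcopy(self._snapshots[version])

def run_experiment(dataset, params):
    # Toy training step: a scaled sum of the dataset.
    return params["lr"] * sum(dataset)
```

Rehydrating a version and re-running the experiment yields the same result as the original run, which is the reproducibility property that underpins the explainability claim in the article.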
Shailesh Manjrekar added: "Weka AI, when deployed with Valohai on AWS, builds on our success in accelerating genomics, fintech, and AI/ML/DL data pipelines. Customers use Weka on AWS for outstanding performance: we can showcase 100 GB/sec of throughput with 5 million 4KB IOPS on a 16-instance i3en cluster, all with less than 250 microseconds of latency."