
WEKA Achieves NVIDIA DGX BasePOD Certification

With NVIDIA DGX BasePOD Certification Complete, Company Sets Its Sights on NVIDIA DGX SuperPOD Certification

WekaIO (WEKA), the data platform software provider for AI, announced today that it has received certification for an NVIDIA DGX BasePOD™ reference architecture built on NVIDIA DGX H100 systems and the WEKA® Data Platform. This rack-dense architecture delivers massive data storage throughput, starting at 600 GB/s and 22 million IOPS in just eight rack units, to optimize DGX H100 system performance.

The WEKA Data Platform provides the critical data infrastructure foundation required to support next-generation, performance-intensive workloads like generative AI model training and inference at scale. Its advanced, software-based architecture transforms stagnant data storage silos into dynamic data pipelines that more efficiently fuel data-starved GPUs and seamlessly power AI workloads on-premises, in the cloud, at the edge, or in hybrid and multicloud environments.

With its modular design, the DGX BasePOD offers the flexibility needed to scale resources according to evolving computational needs, providing cost efficiency and streamlined management. Integrating NVIDIA H100 Tensor Core GPUs and NVIDIA InfiniBand networking technologies delivers enhanced performance across diverse AI workloads, enabling faster model training and deployment. Designed for agility, efficiency, and performance, the DGX BasePOD is a robust solution for organizations that want to optimize their data infrastructure investments while pushing the boundaries of AI innovation.

The WEKA with NVIDIA DGX BasePOD reference architecture seamlessly and efficiently delivers the performance enterprise customers need to accelerate AI adoption and achieve faster time to insights, discoveries, and outcomes.

Key benefits of the new WEKA with NVIDIA DGX BasePOD reference architecture include:

  • Extreme Performance for the Most Demanding AI Workloads: Delivers 10x the bandwidth and 6x the IOPS of the previous WEKA with NVIDIA DGX BasePOD configuration based on NVIDIA DGX A100 systems.
  • Best-of-Breed Compute: NVIDIA DGX systems feature powerful Intel® Xeon® processors, NVIDIA ConnectX-7 NICs, NVIDIA Quantum-2 InfiniBand switches, and NVIDIA Spectrum Ethernet switches.
  • Optimal Efficiency: The rack-dense configuration delivers the performance required by up to 16 DGX H100 systems in a space- and energy-efficient footprint, and is expected to support larger clusters of 32 or more DGX H100 systems.
  • Excellent Linear Scaling: In NVIDIA’s validation testing across a range of demanding AI/ML workloads, WEKA’s integration with the DGX BasePOD architecture helps organizations start small and then quickly and independently scale compute and storage resources, from a single DGX system to multi-rack configurations, with predictable performance that flexibly meets workload requirements.
  • Turnkey Choice and Flexibility: Enterprise customers can use WEKA Data Platform software with DGX BasePOD, powered by the latest DGX systems, to drive their AI initiatives and gain a time-to-market advantage.

WEKA’s DGX BasePOD certification advances its journey to DGX SuperPOD certification. WEKA was among the first companies to implement, qualify, and use NVIDIA GPUDirect® Storage (GDS), was one of the first DGX BasePOD-certified data stores in 2021, and assisted NVIDIA with scaling its networking architectures to expand its enterprise customer footprint.

“WEKA is proud to have achieved this important milestone with NVIDIA. With our DGX BasePOD certification completed, our DGX SuperPOD certification is now in progress,” said Nilesh Patel, chief product officer at WEKA. “With that will come an exciting new deployment option for WEKA Data Platform customers. Watch this space.”

“Enterprises everywhere are embracing AI to enrich customer experiences and drive better business outcomes,” said Tony Paikeday, senior director of AI systems at NVIDIA. “With the NVIDIA DGX BasePOD certification, WEKA can help enterprises streamline their AI initiatives with optimized, high-performance infrastructure solutions that deliver data-fueled insights sooner.”
