At the Supercomputing Conference (SC25), DDN, a global leader in AI and data-intensive technologies, introduced DDN CORE, a unified data engine designed to support the world’s most demanding AI and HPC workloads. With this major launch, DDN aims to address one of the most pressing limitations in modern AI systems: data bottlenecks.
“The bottleneck in AI isn’t compute anymore; it’s data. DDN CORE gives organizations a single data foundation where HPC and AI operate together at full speed and scale,” said Alex Bouzari, CEO and Co-Founder at DDN.
For decades, DDN has powered some of the fastest AI supercomputers and world-class research institutions. Now, with DDN CORE, the company extends this capability to fuel the emerging AI Factory era. The platform consolidates HPC and AI data flows into one intelligent, high-speed environment that ensures GPUs operate at maximum efficiency.
“It’s how we turn infrastructure cost into intelligence ROI,” Bouzari added.
Sven Oehme, CTO at DDN, reinforced the point: “DDN CORE was engineered to eliminate idle GPUs. By combining our expertise in parallel data systems with new intelligence-driven automation, CORE removes I/O latency, streamlines orchestration, and keeps every GPU working, not waiting.”
The Growing Problem: Powerful Compute, Slow Data
Enterprises and research institutions continue investing more than $180 billion annually in AI infrastructure. Yet many organizations admit their environments remain too fragmented and inefficient. Slow data preparation, inference bottlenecks, and complex pipelines often leave expensive GPUs sitting idle. Meanwhile, global data-center power consumption is projected to double to 1,000 TWh, equivalent to the annual electricity use of the United Kingdom.
DDN CORE directly addresses these challenges. Instead of operating separate HPC and AI systems, organizations can rely on one software-defined engine that moves data as quickly as GPUs can process it, turning every watt of energy into productive work.
A New Data Engine for Every AI Workload
DDN CORE combines DDN’s EXAScaler and Infinia technologies into one scalable data fabric. This architecture supports the full AI lifecycle, including simulation, training, inference, and retrieval-augmented generation (RAG). Rather than functioning as a routine storage upgrade, CORE serves as an intelligent performance engine.
Key performance capabilities include:
- Unified Data Plane: Parallel throughput and consistency across hybrid and sovereign deployments.
- Training Acceleration: Up to 15× faster checkpointing and 4× faster model loading, enabling more than 99% GPU utilization.
- Inference & RAG Optimization: Integrated caching and token reuse achieving 25× faster responses and 60% lower query costs.
- Extreme Power Efficiency: Up to 11× better performance-per-watt with 40% reduced power consumption.
- Autonomous Operations: Self-tuning capabilities through DDN Insight for continuous optimization.
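To see why faster checkpointing translates into higher GPU utilization, consider a simple back-of-envelope model. The numbers below are illustrative assumptions, not DDN measurements: utilization is the fraction of wall-clock time a GPU spends computing rather than stalled on checkpoint I/O.

```python
def gpu_utilization(compute_s: float, stall_s: float) -> float:
    """Fraction of wall-clock time the GPU spends computing
    rather than waiting on checkpoint I/O."""
    return compute_s / (compute_s + stall_s)

# Hypothetical baseline: 3600 s of compute per training interval,
# with 300 s stalled on checkpoint writes (illustrative figures).
baseline = gpu_utilization(3600, 300)           # roughly 92% utilization

# A 15x reduction in checkpoint stall time (300 s -> 20 s) pushes
# utilization above the 99% mark cited above.
accelerated = gpu_utilization(3600, 300 / 15)   # above 99% utilization
```

Under these assumed interval lengths, shrinking the I/O stall is the only change; the compute time is untouched, which is why the gain shows up entirely as utilization.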
Setting the Foundation for the AI Factory
DDN CORE operates as a software-defined intelligence layer that unifies performance, observability, and orchestration across multiple architectures. It runs natively on DDN’s AI400X3 and Infinia platforms and on certified systems from leading partners, including Supermicro and major cloud providers.
At SC25, DDN showcased its next-generation lineup:
- AI400X3 Series: Achieves up to 140 GB/s read speeds and 4 million IOPS in a compact 2U form factor.
- AI2200 (Infinia): Optimized for inference and RAG with double throughput and improved energy efficiency.
- Flexible Deployment: Available on-premises or in the cloud with consistent performance across environments.
Integrated with Top AI Ecosystems
DDN CORE is optimized for the NVIDIA AI Data Platform and validated across NVIDIA architectures such as GB200 NVL72 and BlueField DPUs.
Industry partners highlighted CORE’s impact:
“AI-ready storage is no longer optional; it’s foundational,” said Justin Boitano, Vice President, Enterprise AI Products, NVIDIA.
“By combining the scale of GCP with the performance of DDN CORE, we’re unlocking new levels of throughput,” added Sameet Agarwal, VP Engineering, Google Cloud.
Sachin Menon, VP Cloud Engineering at Oracle, stated, “DDN’s GPU-optimized storage technology gives customers a cloud-native platform purpose-built for AI.”
Proven at Global Scale
DDN continues to demonstrate results across more than a million GPUs worldwide:
- Yotta Shakti Cloud (India): Achieved 99% GPU utilization and 40% lower power usage.
- CINECA & Helmholtz Munich: Delivered 15× faster checkpointing with unified HPC and AI pipelines.
- Guardant Health: Cut data processing time by 70% and compute costs by 40%.
- SK Telecom Petasus Cloud: Enabled real-time inference with DDN Infinia.
With DDN CORE, the company aims to redefine enterprise-scale AI performance and transform how organizations manage data in the era of AI factories.



