Karman Industries has officially introduced its Heat Processing Unit (HPU), a modular 10MW integrated thermal platform designed to address the growing “speed-to-power” challenge confronting AI hyperscalers worldwide. As AI infrastructure scales at an unprecedented pace, hyperscalers increasingly struggle to deploy power and cooling fast enough to meet demand. With this launch, Karman positions its HPU as a transformative solution that consolidates complex heat management systems into compact, high-density modules enabling rapid deployment while completely eliminating water usage.

Moreover, the HPU does more than cool. By optimizing energy consumption and enabling efficient heat reuse, the platform allows operators to convert waste heat into usable power or district heating. As a result, data centers can significantly improve overall efficiency while unlocking new value streams from thermal byproducts.


Alongside the product announcement, Karman revealed it closed a $20 million Series A round in September 2025, led by Riot Ventures. The funding round brings the company’s total capital raised to more than $30 million. Additional investors include Sunflower Capital, Space VC, Wonder Ventures, and former Intel and VMware CEO Pat Gelsinger, underscoring strong confidence in Karman’s long-term vision.

From Cooling Bottleneck to Thermal Advantage

As AI clusters rapidly move toward multi-gigawatt scale using advanced chip architectures from industry leaders such as Nvidia, heat has evolved from a limitation into a strategic asset. Traditionally, hyperscale data centers rely on sprawling mechanical yards filled with hundreds of chillers, dry coolers, and miles of piping. These legacy systems demand extensive land, long construction timelines, and massive energy input simply to remove heat.

In contrast, Karman has spent the past 18 months developing the HPU to fundamentally rethink this approach. Instead of fighting heat, the HPU processes and repurposes it.

“In the race to stand up AI capacity, time is the most expensive variable,” said David Tearse, CEO and Co-Founder of Karman Industries. “We’ve moved beyond the era of legacy chillers to HPUs. By shrinking the footprint of the mechanical yard by 80%, we don’t just save land; we eliminate the ‘snowball effect’ of infrastructure complexity, allowing hyperscalers to move from ‘shovels in the ground’ to ‘chips in the rack’ many months faster while unlocking additional compute.”

One Integrated System, Global Flexibility

Notably, Karman’s HPU architecture adapts seamlessly to gigascale AI factories across diverse climates. In hot environments, HPUs deliver highly efficient cooling with zero water consumption, helping operators reclaim stranded compute capacity. For components that require lower-temperature cooling, such as HBM4 memory, HPUs convert high-temperature waste heat from advanced GPU platforms into cooling below 30°C.

In colder regions, the system goes a step further by transforming waste heat into electricity or high-grade thermal energy suitable for district heating and other applications.


“We applied an aerospace systems-engineering approach to data center thermodynamics,” said CJ Kalra, CTO and Co-Founder. “Our team designed HPUs to process the extreme heat of gigascale racks using a first principles based approach. HPUs enable heat reuse and PUE ratings quickly approaching 1 without water consumption or PFAS chemicals for AI factories. Keeping up with the latest in technology, the HPU leverages 800V DC architecture while borrowing the latest innovations in rocket turbomachinery like metal 3D-printing, and electric vehicles innovations like high-speed motors & Silicon-Carbide power electronics technology.”

Built for the Gigawatt Future

Key advantages of the Karman HPU platform include modular 10MW scalability, a 60% to 80% reduction in mechanical yard footprint, zero water consumption, and optimized thermal efficiency that maximizes AI return on investment.

“Thermal has become one of the most serious constraints facing data center infrastructure,” said Will Coffield, Co-Founder and General Partner at Riot Ventures. “Karman is bringing an elite engineering team to bear on the thermal sector. Their HPU architecture is unleashing this AI infrastructure supercycle from its greatest physical bottleneck.”

Looking ahead, Karman plans to begin initial customer deliveries in Q3 2026 from its Los Angeles-based GigaWerx manufacturing facility. With an initial capacity of 1GW annually and plans to scale to 4GW, the company aims to meet the rapidly accelerating global demand for next-generation AI factories.

