OpenAI partners with Broadcom to build 10-gigawatt AI chip infrastructure, adding to Nvidia and AMD deals

OpenAI’s ambition to control its own compute destiny just kicked into a higher gear. The ChatGPT maker announced Monday that it is officially partnering with Broadcom to build and deploy 10 gigawatts of custom artificial intelligence accelerators, a massive leap in its campaign to secure the infrastructure needed for its next generation of AI models.
Broadcom’s stock jumped 9% on the news, reflecting Wall Street’s growing faith in the chipmaker’s position at the center of the AI hardware race. The companies didn’t disclose financial details, but the scale of the collaboration signals something much larger than a supply deal—it’s a sign that OpenAI is starting to think like a hardware company.
From GPUs to Custom Silicon: OpenAI Taps Broadcom for 10 GW of AI Compute Power
OpenAI and Broadcom have been working quietly together for about 18 months, but this is the first time they’ve gone public about their plans. Starting late next year, the two will begin rolling out racks of OpenAI-designed accelerators networked with Broadcom’s Ethernet stack. These systems combine compute, memory, and networking components customized for OpenAI’s training and inference workloads.
“Partnering with Broadcom is a critical step in building the infrastructure needed to unlock AI’s potential and deliver real benefits for people and businesses,” OpenAI co-founder and CEO Sam Altman said in a statement. “Developing our own accelerators adds to the broader ecosystem of partners all building the capacity required to push the frontier of AI to provide benefits to all humanity.”
By building its own chips, OpenAI aims to lower compute costs and get more value out of its growing data center footprint. Industry estimates put the cost of a 1-gigawatt data center at roughly $50 billion, with about $35 billion going toward chips, based on Nvidia’s current pricing. If OpenAI’s math is right, developing custom silicon could save billions while improving performance.
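For a rough sense of the stakes, here is a back-of-envelope sketch built only on the industry figures cited above. The per-gigawatt cost, the chip share, and the savings rates are illustrative assumptions; neither company has disclosed actual costs or expected savings.

```python
# Back-of-envelope estimate using the industry figures cited above.
# All inputs are rough assumptions for illustration only; neither
# OpenAI nor Broadcom has disclosed costs or projected savings.

COST_PER_GW_TOTAL = 50e9   # ~$50B per 1 GW data center (industry estimate)
CHIP_SPEND_PER_GW = 35e9   # ~$35B of that goes to chips at Nvidia pricing
DEPLOYMENT_GW = 10         # scale of the Broadcom partnership

chip_spend = CHIP_SPEND_PER_GW * DEPLOYMENT_GW  # total chip outlay at GPU pricing

# Hypothetical chip-cost reductions from custom silicon (assumed, not disclosed).
for savings_rate in (0.10, 0.20, 0.30):
    savings = chip_spend * savings_rate
    print(f"{savings_rate:.0%} cheaper chips across {DEPLOYMENT_GW} GW "
          f"saves ~${savings/1e9:,.0f}B of a ~${chip_spend/1e9:,.0f}B chip bill")
```

Even at the low end of those assumed rates, shaving a modest fraction off a roughly $350 billion chip bill is where the "save billions" claim comes from.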
“These things have gotten so complex you need the whole thing,” Altman said in a podcast released alongside the announcement. “We can get huge efficiency gains, and that will lead to much better performance, faster models, cheaper models — all of that.”
Broadcom has quietly become one of the biggest beneficiaries of the AI boom. Its custom chips, known as XPUs, are already being snapped up by tech giants including Google, Meta, and ByteDance. The company’s market cap has surged past $1.5 trillion after more than doubling in 2024, cementing its position as a key supplier for hyperscalers building out AI infrastructure.
OpenAI President Greg Brockman said the company used its own models to accelerate chip design and improve efficiency. “We’ve been able to get massive area reductions,” he explained. “You take components that humans have already optimized and just pour compute into it, and the model comes out with its own optimizations.”
Broadcom CEO Hock Tan called OpenAI “the company building the most advanced frontier models,” adding that compute demand continues to climb with every new generation. “If you do your own chips, you control your destiny,” he said.
Altman described the 10-gigawatt target as just the beginning. “Even though it’s vastly more than the world has today, we expect that very high-quality intelligence delivered very fast and at a very low price—the world will absorb it super fast and just find incredible new things to use it for,” he said.
OpenAI currently operates on just over 2 gigawatts of compute capacity, enough to scale ChatGPT and develop tools like Sora. But demand keeps rising. In the past three weeks alone, the company has announced roughly 33 gigawatts of compute commitments across new partnerships with Nvidia, Oracle, AMD, and now Broadcom.
Altman summed it up plainly: “If we had 30 gigawatts today with today’s quality of models, I think you would still saturate that relatively quickly in terms of what people would do.”
With every new deal, it’s becoming clear that OpenAI’s ambitions aren’t limited to building smarter models: the company is also building the physical backbone for the next era of intelligence.