AI chip startup Cerebras seeks up to $4.8 billion in upsized U.S. IPO
Artificial intelligence infrastructure startup Cerebras Systems is heading into one of the year’s biggest tech IPOs, with fresh momentum from Wall Street and growing demand for AI computing.
The company raised the expected price range for its initial public offering on Monday, now aiming to sell shares at $150 to $160 each, up from the earlier range of $115 to $125. At the high end, Cerebras could raise up to $4.8 billion in the offering and reach a fully diluted valuation of nearly $49 billion.
“Cerebras Systems Inc. increased the size of its initial public offering, now seeking to raise as much as $4.8 billion, as demand for the artificial intelligence chipmaker and data center operator’s shares continues to build,” Bloomberg reported.
The updated filing lands less than a week after CNBC reported that Cerebras was preparing to raise up to $3.5 billion in a Nasdaq listing. Investor appetite appears to have strengthened quickly since then, pushing the company to increase both the price range and the size of the offering ahead of its debut.
Founded in 2015, Cerebras has spent years trying to challenge Nvidia’s dominance in AI computing with a very different approach: building a single massive chip rather than connecting thousands of smaller processors.
Its flagship Wafer-Scale Engine has become one of the more unusual products in the AI hardware race. The latest version, WSE-3, is roughly the size of a dinner plate and contains four trillion transistors packed across nearly an entire silicon wafer. Cerebras says the chip includes nearly 900,000 AI-optimized cores, capable of delivering up to 125 petaflops of performance.
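As a rough sanity check on those headline figures, a back-of-envelope calculation shows the per-core throughput they imply. Both inputs are approximate vendor claims from the article ("nearly 900,000" cores, "up to 125 petaflops"), so the result is an order-of-magnitude sketch, not a measured number:

```python
# Implied per-core throughput from the article's approximate WSE-3 figures.
peak_flops = 125e15   # "up to 125 petaflops" (peak, vendor-quoted)
cores = 900_000       # "nearly 900,000 AI-optimized cores" (approximate)

per_core_gflops = peak_flops / cores / 1e9
print(f"~{per_core_gflops:.0f} GFLOPS per core")  # ~139 GFLOPS per core
```

That works out to roughly 139 gigaflops per core at peak, which illustrates the design trade-off: many relatively small cores on one wafer, rather than a few very large processors linked by external networking.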
The pitch to customers is straightforward. Large AI models often run on giant GPU clusters that require substantial networking and coordination between chips. Cerebras says putting compute, memory, and interconnects on a single chip reduces bottlenecks that slow training and inference in advanced AI systems.
That message has resonated with companies racing to build larger generative AI models.
OpenAI has reportedly committed more than $20 billion to projects involving Cerebras infrastructure. The startup has gained traction in AI coding workloads and high-performance inference systems, areas where speed has become a major selling point as AI companies compete to reduce costs and latency.
The company has gradually shifted away from a pure hardware sales strategy. Instead of mainly selling chips, Cerebras has been building cloud infrastructure powered by its own processors and offering customers direct access to computing capacity. That move puts it into more direct competition with cloud providers.
In March, Amazon Web Services announced a partnership to bring Cerebras chips into its data centers, giving the startup another high-profile validation as cloud providers look for alternatives to Nvidia hardware.
Cerebras has surfaced in another major tech story recently: the courtroom battle between Elon Musk and Sam Altman over OpenAI’s future.
During testimony last week, OpenAI co-founder Greg Brockman said Cerebras represented “the compute we thought we were going to need,” adding that OpenAI at one point discussed a possible merger with the chip startup and that Musk supported the idea.
The company’s financial picture has improved alongside the AI boom. Cerebras reported fourth-quarter revenue of $510 million, up 76% from a year earlier, along with $87.9 million in net income, according to CNBC. Those numbers mark a major shift for a business long viewed as an expensive hardware bet with uncertain margins.
The Nasdaq listing is expected to take place on May 14. If demand holds, Cerebras could become one of the most closely watched AI listings since the generative AI boom reshaped Silicon Valley and sent investors pouring into companies tied to the infrastructure behind models like ChatGPT.

(Image: Cerebras CEO)