Enfabrica, an AI chip startup founded by Google and Broadcom execs, raises $125M with backing from Nvidia
The artificial intelligence (AI) funding boom that started at the beginning of this year following the sudden success of OpenAI’s ChatGPT continues without any signs of slowing down. The latest to ride the funding wave is Enfabrica, a Silicon Valley chip startup founded by executives from Alphabet’s Google and Broadcom.
Today, Enfabrica announced it has raised a $125 million Series B venture funding round led by Atreides Management, with chip leader Nvidia joining as a strategic investor. New investors including IAG Capital Partners, Liberty Global Ventures, Valor Equity Partners, Infinitum Partners, and Alumni Ventures joined the round, alongside earlier investor Sutter Hill Ventures.
In conjunction with the funding, Enfabrica also announced that Atreides Management founder Gavin Baker, a veteran of Fidelity Investments, is joining Enfabrica’s board of directors.
The news comes at a challenging time when many chip companies are struggling to raise capital. Just last week, we wrote about d-Matrix, another AI chip startup, which recently raised $110 million in funding with backing from Microsoft to help power generative AI applications like ChatGPT.
Founded by CEO Rochan Sankar and Shrijeet Mukherjee, Enfabrica is working on networking chips for AI data centers. The startup's network chip acts as a bridge between the components of a data center, connecting them in a new way that targets the data-movement bottlenecks holding back AI workloads.
Enfabrica's chip also establishes a network structure reminiscent of a hub-and-spoke configuration. This design allows Nvidia GPUs, which handle the data-processing work, to pull data from multiple sources at once without running into performance bottlenecks.
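As a purely illustrative sketch (the component names and counts below are hypothetical, not taken from Enfabrica's design), a hub-and-spoke fabric can be pictured as a single switching hub that gives every attached GPU a one-hop path to every memory and network endpoint, rather than wiring endpoints to each other directly:

```python
# Toy model of a hub-and-spoke fabric: every endpoint hangs off one hub,
# so any GPU reaches any data source in a single hop through the hub.
# All names and counts are hypothetical, for illustration only.
fabric_hub = {
    "gpus": [f"gpu{i}" for i in range(8)],
    "memory_pools": ["dram0", "dram1"],
    "network_ports": ["eth0", "eth1", "eth2", "eth3"],
}

def path(src: str, dst: str) -> list[str]:
    """Route between any two endpoints: always one hop through the hub."""
    return [src, "fabric_hub", dst]

print(path("gpu3", "dram1"))  # ['gpu3', 'fabric_hub', 'dram1']
print(path("gpu0", "eth2"))   # ['gpu0', 'fabric_hub', 'eth2']
```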
Enfabrica is part of a larger movement aimed at reimagining data centers to support generative AI technologies such as ChatGPT. At the core of this transformation are chips from Nvidia, the world's leading AI chip maker. But there is a challenge with Nvidia's graphics processing units (GPUs): they can sit idle because the networks connecting them struggle to deliver data quickly enough.
In a statement, Sankar explained that keeping GPUs fed with data in this way results in significantly more efficient utilization: the same amount of computing work can be done with roughly half as many chips, because each chip spends more of its time actually working. In the technology sector, that kind of efficiency is widely seen as an advantage, since busier chips translate into better cost-effectiveness.
“It’s no secret to Nvidia or anybody else out there that in order for AI computing to become truly ubiquitous, the cost curve has to come down,” Sankar said. “The key here is that we enable those GPUs to be better utilized.”
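To make the arithmetic behind that claim concrete, here is a rough back-of-the-envelope sketch; the throughput and utilization figures are hypothetical and not Enfabrica's numbers, but they show how doubling utilization roughly halves the number of GPUs needed for the same workload:

```python
import math

def gpus_needed(total_work_tflop: float, per_gpu_tflops: float, utilization: float) -> int:
    """GPUs required to finish a fixed workload in one second at a given utilization."""
    effective_throughput = per_gpu_tflops * utilization  # useful work each GPU delivers
    return math.ceil(total_work_tflop / effective_throughput)

# Hypothetical figures, for illustration only.
workload = 100_000.0  # total work, in TFLOPs
peak = 1_000.0        # peak per-GPU throughput, in TFLOP/s

print(gpus_needed(workload, peak, utilization=0.4))  # 250 GPUs when idle 60% of the time
print(gpus_needed(workload, peak, utilization=0.8))  # 125 GPUs when kept twice as busy
```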
In recent months, investors have shied away from backing emerging chip startups, largely because of Nvidia's dominant position in the AI chip market, reinforced by the tight integration of its hardware and software offerings.
But that has not stopped investors from pouring billions into the sector. In the second quarter alone, generative AI startups attracted a record $14.1 billion in equity funding across 86 deals.