AI chip startup d-Matrix raises $110M in funding with backing from Microsoft to help power generative AI applications like ChatGPT
d-Matrix, a Silicon Valley-based artificial intelligence chip startup, has raised $110 million in funding with support from notable investors, including tech giant Microsoft Corp. The company develops next-generation chips that power generative AI applications and large language models such as ChatGPT.
The Series B funding round was led by Temasek, a Singapore-based investment firm, and included participation from Playground Global, a venture firm headquartered in Palo Alto, California, as well as Microsoft. d-Matrix had previously raised $44 million but declined to disclose its current valuation.
The funding comes at a challenging time, with many chip companies struggling to raise capital. In recent months, investors have shied away from backing emerging chip startups, largely because of Nvidia’s dominant position in the AI chip market and the tight integration of its hardware and software offerings, Reuters reported, citing multiple sources.
In a statement, CEO Sid Sheth told Reuters: “This is capital that understands what it takes to build a semiconductor business. They’ve done it in the past. This is capital that can stay with us for the long term.” He added that the company’s fundraising process started roughly a year ago.
Founded in 2019 by Sid Sheth and Sudeep Bhoja, d-Matrix develops chips optimized for generative AI applications. The company’s goal is to build and deploy the first AI compute engine inspired by the human brain, serving a broad range of inference workloads across the cloud and infrastructure-edge markets, which together represent a multibillion-dollar industry.
The chips use digital “in-memory compute” technology, which runs AI code more efficiently by performing calculations where the data is stored. That design reduces the energy consumed in moving the large volumes of data needed to generate AI responses, a workload the chips are specifically optimized for.
What sets d-Matrix apart from industry giant Nvidia is its focus on the “inference” side of AI processing rather than on training large AI models.
“We have solved the computer architecture,” Playground partner Sasha Ostojic said. “We have solved the low power requirements and the needs of a data center – (we) built a software stack to deliver the lowest latency in the industry by orders of magnitude.”
Microsoft has committed to evaluating the chip for its own applications when it launches next year, Sheth told Reuters.
Sheth added that d-Matrix expects to generate just under $10 million in revenue this year, primarily from selling chips for evaluation. The company projects annual revenue of $70 million to $75 million within two years, at which point it aims to break even.