Reflection AI in talks to raise $1 billion in funding to take on Meta, DeepSeek in open-source AI race

Reflection AI is barely a year old, but it’s already chasing one of the largest funding rounds in open‑source AI history. The New York startup, co‑founded by former Google DeepMind researchers Misha Laskin and Ioannis Antonoglou, is in talks to raise more than $1 billion to develop large language models that can go toe‑to‑toe with China’s DeepSeek, France’s Mistral AI, and U.S.-based Meta Platforms, according to The Information. One person familiar with the deal told the outlet the company has “raised most of its target.”
Some of that cash will cover the staggering compute costs needed to train new AI models. In internal discussions, Laskin and Antonoglou told employees they believe “there is an opportunity to establish Reflection AI as the preeminent U.S.-based provider of open‑source AI models,” The Information reported. They see momentum from DeepSeek’s rise as proof that open‑source approaches can challenge the dominance of proprietary systems.
The Information reported: “Reflection AI, a one-year-old startup co-founded by former Google DeepMind researchers, is in talks to raise more than $1 billion to fund its efforts to develop open-source large language models to compete with the likes of China’s DeepSeek, France’s Mistral and U.S.-based Meta Platforms, according to two people with direct knowledge of the matter. The company has raised most of its target, according to one of those people.”
The billion‑dollar push comes after Reflection AI’s $130 million raise in March 2025, led by Sequoia Capital and SV Angel, which valued the company at about $500 million. This time, the funds will bankroll the development of an advanced AI agent named Asimov—an ambitious bid to inch closer to superintelligent, autonomous systems. The size of the target signals investor confidence that the young startup can challenge far larger, better-resourced incumbents.
The Open-Source AI Revolution
Open‑source AI has been on a tear this year, driven by breakthroughs like DeepSeek’s R1 model, which shocked the industry in January. Trained for a reported $6 million, R1 matched the performance of proprietary models like GPT‑4 while requiring far less computational muscle than rivals such as Meta’s Llama 3.1. Released under an MIT license, R1 quickly became a benchmark for cost‑efficient, open‑source development, prompting a rethink of how AI is built.
Meta, with its Llama series, has had its own validation. Over 800 million downloads later, its open‑source strategy looks prescient. Chief AI scientist Yann LeCun credited the collaborative model in January, saying DeepSeek’s team “came up with new ideas and built them on top of other people’s work. Because their work is published and open source, everyone can profit from it.” Reflection AI hopes to push that idea further, blending efficiency with performance.
In Europe, Mistral AI has staked out ground in data‑sensitive sectors like banking and defense. Valued at $6 billion after raising over $1 billion, Mistral isn’t competing directly with DeepSeek’s pricing war—but Reflection AI’s focus on superintelligent systems could change that.
Why Open-Source Matters
The open‑source route comes with perks—lower costs, transparency, and speed of iteration—but also risks. Cisco researchers recently exposed vulnerabilities in DeepSeek’s R1 that could be exploited through algorithmic jailbreaking. Any new entrant, especially one targeting enterprise adoption, will have to prove it can close those gaps.
For Reflection AI, much of the billion appears already in hand. The challenge now is delivering models that can match DeepSeek’s efficiency, Meta’s reach, and Mistral’s precision—while holding to the open‑source ethos that’s fueling this new AI race.