EnCharge AI raises $100M to develop energy-efficient AI chips for edge computing
![](https://techstartups.com/wp-content/uploads/2025/02/EnCharge-AI-Founders-960x519.jpg)
AI startup founded by Princeton professor and AI hardware veterans develops analog in-memory computing chips to reduce energy consumption and expand AI beyond cloud data centers.
EnCharge AI has raised $100 million in a Series B funding round led by Tiger Global, with participation from Samsung Electronics’ VC arm, HH-CTBC (a partnership between Foxconn and CTBC Venture Capital), and RTX Ventures. The fresh capital will enable EnCharge AI to bring its first AI accelerator solutions to market, tailored to the specific needs of its partners, the company said in an announcement on Thursday.
Princeton-Backed Founders and Industry Expertise
EnCharge AI was co-founded by Naveen Verma, Kailash Gopalakrishnan, and Echere Iroaga. Verma, who serves as CEO, is a professor of electrical and computer engineering at Princeton University, where the foundational research for EnCharge’s chips was conducted.
Gopalakrishnan, Chief Product Officer, was an IBM Fellow and previously led IBM’s AI hardware and software efforts. Iroaga, Chief Operating Officer, brings over 25 years of semiconductor industry experience, with leadership roles at MACOM and Qualcomm.
The EnCharge AI team includes engineers and executives from NVIDIA, AMD, Waymo, Intel, Meta, SambaNova, and Cerebras, bringing deep expertise in AI semiconductor development.
Solving AI’s Growing Energy Problem
AI workloads, especially generative AI, are pushing energy consumption to unsustainable levels as companies rely on power-hungry data center clusters. EnCharge AI’s noise-resilient analog in-memory compute architecture drastically reduces the power requirements for running both traditional and generative AI inference workloads.
By performing computation directly inside memory with analog circuits, EnCharge’s AI accelerators consume up to 20 times less energy than leading AI chips available today across a wide range of applications. This efficiency makes AI viable beyond the cloud, bringing AI processing directly to edge devices like laptops and smartphones, and even to defense and aerospace applications.
Expanding AI Beyond Data Centers
EnCharge AI’s hardware is paired with a comprehensive software platform designed to maximize efficiency, performance, and accuracy, enabling seamless deployment of AI models onto its chips. The ability to run AI workloads within tight power budgets is especially valuable for industries with strict size, weight, and power constraints, such as defense and aerospace.
“The efficiency breakthrough of EnCharge AI’s analog in-memory architecture can be transformative for defense and aerospace use cases where size, weight, and power constraints limit how AI is deployed today,” said Dan Ateya, President and Managing Director of RTX Ventures. “Continuing our collaboration with EnCharge AI will help enable AI advancements in environments that were previously inaccessible given the limitations of current processor technology.”
Samsung Ventures also highlighted the startup’s extensive research foundation, emphasizing its ability to bring advanced AI beyond the cloud and onto consumer devices.
“EnCharge has achieved something revolutionary while having comprehensively derisked their technology through research at Princeton before the company was even launched,” said a Managing Director at Samsung Ventures. “Building on multiple generations of chips encompassing seven years of peer-reviewed research, Naveen and his team are ready to commercialize a complete hardware and software solution that can bring advanced AI out of the cloud and onto consumer devices.”
As AI adoption continues to surge, EnCharge AI is positioning itself as a key player in energy-efficient AI inference, bridging the gap between high-performance computing and sustainable AI deployment across industries.