BrainChip raises $25M to scale neuromorphic AI chips and on-device GenAI
BrainChip, the Australian chip company behind one of the earliest commercial neuromorphic AI processors, has raised $25 million to advance its technology toward production and real-world deployment. The funding will support continued work on its Akida chips, on-device generative AI models, and a broader lineup of modules for edge systems that require intelligence without cloud reliance.
The funding comes as neuromorphic computing gains fresh attention across defense, industrial systems, and embedded devices. Market research firm Grand View Research estimates the sector could reach $20.27 billion by 2030, growing at nearly a 20 percent annual pace over the remainder of the decade. BrainChip is positioning itself to capture a larger share of that opportunity by keeping AI close to the sensor and minimizing energy use.
At CES, the company plans to show how that strategy is taking shape in its product shipments. BrainChip will demonstrate AKD1500-based modules for industrial PCs and other environments where power budgets are tight and connectivity cannot be assumed. The company is also highlighting always-on AI through Pico evaluations on Akida Cloud, along with cybersecurity use cases built with partner Quantum Ventura. One of the more eye-catching demos centers on running a 1.2-billion-parameter large language model directly on mobile and embedded devices.
“Our capital raise positions BrainChip to further build its lead in edge AI and neuromorphic computing,” said CEO Sean Hehir. “Investor support lets us advance Akida 2 chip development and Akida GenAI model development. We can expand into new commercial opportunities through chip and module products that provide real-time, on-device AI with ultra-low power and no cloud dependency. CES is the ideal stage to showcase our growth trajectory, our consistent groundbreaking innovations and our growing product portfolio.”
A core focus of the current lineup is the AKD1500, a compact chip built for sensors, medical devices, and wearables where cost and battery life shape every design decision. BrainChip says the device supports on-device language models through its TENNs architecture, enabling private, real-time generative AI without sending data off-device. These capabilities sit on top of the Akida 2 platform, which allows local learning and inference within constrained systems.
Partner demonstrations at CES are meant to show how that translates into deployed systems rather than lab experiments. HaiLa Technologies is presenting ultra-low-power Bluetooth and Wi-Fi integration with the AKD1500 for wearable vision tasks. Deep Perception is demonstrating a visual computing pipeline using the AKD1000 on drones and mobile devices. Quantum Ventura is running its Neuro RT cybersecurity model on the Akida Edge AI Box to show how small office networks can detect threats without cloud-based inspection.
BrainChip will host demos and meetings throughout CES from its suite in the Venetian Tower and is inviting attendees to book private sessions or attend its Wednesday mixer.
Founded on the principle that AI should run where data is created, BrainChip builds processors that use event-driven computation inspired by biological neurons. Its Akida processors analyze signals as events arrive rather than processing full data streams frame by frame, enabling systems to respond in real time while staying within tight energy budgets. That approach has attracted interest across industries, including aerospace, robotics, industrial IoT, and consumer electronics.
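To make the distinction concrete, the toy Python sketch below contrasts a frame-by-frame pipeline with an event-driven one that only computes where the input has changed. It is a conceptual illustration only, not BrainChip's Akida hardware or SDK; the helper names, the threshold value, and the stand-in "model" are assumptions chosen for the example.

```python
# Conceptual sketch of event-driven vs. frame-by-frame processing.
# This is NOT BrainChip's implementation or API; helper names, the
# threshold, and the toy "model" are illustrative assumptions only.
import numpy as np

def frame_based(frames, model):
    """Frame-based pipeline: run the model on every frame, changed or not."""
    return [model(f) for f in frames]

def event_driven(frames, model, threshold=0.1):
    """Event-driven pipeline: compute only where consecutive frames differ.

    Only values whose change exceeds the threshold ("events") are passed on,
    so static scenes cost almost nothing -- the intuition behind neuromorphic
    designs that keep energy use roughly proportional to activity.
    """
    outputs, prev = [], np.zeros_like(frames[0])
    for f in frames:
        events = np.where(np.abs(f - prev) > threshold, f, 0.0)  # sparse deltas
        if np.count_nonzero(events):           # skip compute when nothing changed
            outputs.append(model(events))
        prev = f
    return outputs

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # 100 mostly static "frames" with occasional motion
    frames = [np.zeros((8, 8)) for _ in range(100)]
    for i in range(0, 100, 20):
        frames[i] = rng.random((8, 8))
    toy_model = lambda x: float(x.sum())        # stand-in for a real network
    print(len(frame_based(frames, toy_model)))  # 100 inferences
    print(len(event_driven(frames, toy_model))) # far fewer inferences
```

In this sketch the frame-based path runs the model 100 times regardless of content, while the event-driven path fires only when something changes, which is the property that lets event-based silicon stay within tight power envelopes.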
With fresh capital in hand and new demonstrations lined up, BrainChip is betting that demand for on-device intelligence will continue to shift away from centralized AI systems and toward chips that think locally, quietly, and efficiently.

