Sensetics raises $1.75M in funding to digitize touch sensing and bring haptics into the AI era
Touch has long been the missing sense in human-machine interaction. Vision and audio were digitized decades ago, but touch has stayed locked in the physical world. Sensetics wants to change that. The AI startup, with roots at UC Berkeley and Virginia Tech, just raised $1.75 million in pre-seed funding to push touch into the same league as sight and sound: a data stream that can be captured, edited, transmitted, and used to guide intelligent systems with far greater awareness of the physical environment.
The round, raised in the spring and co-led by MetaVC Partners and Fitz Gate Ventures with participation from Blue Sky Capital and AIC Ventures, gives Sensetics the runway to advance a platform that blends programmable fabrics with AI-driven tools for recording and replaying tactile experiences. It’s an attempt to create a new data layer for physical AI, one that can feed robotics, wearables, surgical tools, and remote systems with fingertip-level tactile detail.
Sensetics’ Touch Signature™ fabrics and software record touch with high fidelity, letting users capture the nuances of pressure, texture, and motion the same way cameras capture light or microphones capture sound. The technology imitates the way mechanoreceptors in human fingertips work, producing tactile data streams that can be transmitted in real time from devices such as robotic arms, medical instruments, VR/AR controllers, or sensors embedded in industrial tools. The platform is built for low latency and durability, which matters for environments where touch changes constantly — warehouses, manufacturing lines, clinics, and simulation platforms.
The company sees a market that could exceed $10 billion across sectors that depend on precise force sensing and haptic feedback, with transportation and logistics, advanced manufacturing, medical robotics, and VR/AR training systems standing out as early targets. The pitch is simple: digitize touch, and the physical world becomes far more accessible to machines. Doing so would create an entirely new class of physical data, much as digital audio and video paved the way for streaming platforms, editing tools, and new forms of media.
The founders bring deep academic and entrepreneurial experience. CEO Adam B. Hopkins previously led Uniformity Labs and has a track record in advanced materials and engineering. CTO Rayne Zheng is a professor at UC Berkeley and Director of the Berkeley Sensors & Actuators Center, with research published in journals like Nature and Science on materials, robotics, and AI.
Hopkins believes the appetite for better touch interfaces is rising quickly across healthcare, industrial automation, aerospace, defense, and robotics. “We see a pivotal moment at the intersection of human-machine interaction and touch technology,” he said. “Demand for haptic controllers and high-resolution tactile sensors is surging in healthcare, industrial, aerospace, defense and robotics applications. Our mission is to make touch the next digital sense and to build a data platform for physical AI comparable in scale and importance to computer vision.”
Investors see the same shift. “We’re excited to back Sensetics as it pioneers the next frontier of human-machine interaction,” said Chris Alliegro, Managing Partner at MetaVC Partners. “Digitizing touch is an extraordinarily complex challenge—one that Sensetics is solving through an elegant application of mechanical metamaterials. By unlocking high-fidelity tactile sensing and feedback, Sensetics is laying the foundation for a new generation of intelligent machines capable of perceiving and responding to the physical world with superhuman sensitivity and precision.”
Mark Poag, GP at Fitz Gate Ventures, said the company aligns with their focus on deep tech with clear commercial potential. “Sensetics squarely fits our investment thesis, because it is a deep tech company utilizing proprietary IP to create an entirely new industrial category that targets a massive market opportunity, and it has an outstanding founding team.”
The company’s technology gives human operators — and machines — the ability to feel what remote systems are touching. A surgeon guiding a robotic tool, a technician controlling remote equipment, or a warehouse robot sorting fragile materials could all experience touch with far greater clarity and responsiveness.
Sensetics positions itself as a foundational player in the next phase of human-machine interaction, one where tactile data becomes as universal as audio and video signals. If the company succeeds, touch may soon be fully digital — captured, edited, streamed, and used to guide intelligent systems with a sense that has been missing from computing since the beginning.