Nvidia invests $4B in Lumentum and Coherent to accelerate silicon photonics for AI infrastructure
Nvidia is opening its wallet again, this time to secure a key piece of the AI infrastructure stack that most people rarely see: photonics.
The chip giant said Monday it will invest a combined $4 billion across two U.S. photonics firms, committing $2 billion each to Lumentum and Coherent. The move signals Nvidia’s growing focus on the optical technologies needed to move massive amounts of data inside next-generation AI systems.
The announcement lands just days after Nvidia joined a blockbuster $110 billion financing for OpenAI alongside Amazon and SoftBank, a deal that valued the ChatGPT maker at $730 billion. Taken together, the back-to-back moves show Nvidia pushing aggressively beyond GPUs into the plumbing that keeps large AI deployments running.
Nvidia doubles down on AI infrastructure with $4B photonics investment
Lumentum builds optical and photonic components used in the networks behind AI, cloud computing, and advanced communications systems. Its technology helps data travel faster and more efficiently across the fiber links that connect modern data centers.
Coherent operates in a similar arena, developing photonics systems for high-performance optical networking and data transmission. Photonics, in simple terms, replaces or supplements traditional electrical signaling with light-based data transfer, a shift many in the industry see as necessary as AI workloads continue to surge.
Nvidia CEO Jensen Huang framed the investments as part of a longer-term push into silicon photonics, an area gaining attention across the AI infrastructure ecosystem.
“Together with Lumentum, NVIDIA is advancing the world’s most sophisticated silicon photonics to build the next generation of gigawatt-scale AI factories,” Huang said in a statement.
He added that Nvidia will work with Coherent on developing next-generation silicon photonics for AI infrastructure.
The strategy reflects a growing reality inside hyperscale computing: moving data has become just as critical as processing it. As AI models grow larger and clusters span thousands of GPUs, traditional electrical interconnects are reaching physical and energy limits. Optical links offer a path to move more data with lower power and latency.
For Nvidia, which already dominates the AI accelerator market, the investments look less like a side bet and more like supply chain positioning. Control over optical networking components could become increasingly valuable as cloud providers and enterprises race to build what Huang often calls “AI factories.”
The broader signal is clear. The AI arms race is no longer confined to model training and GPU performance. The companies that control the data pipelines inside future data centers may end up holding just as much leverage.