Technology Stocks : The end of Moore's law - Poet Technologies
From: toccodolce4/16/2025 10:11:57 AM
Latest from Agoracom

5 semiconductor startups seeking to challenge Nvidia’s AI computing dominance Read more at: cxotoday.com

While Nvidia dominates the AI chip landscape, a wave of ambitious startups is working on next-gen processors to challenge its grip—especially in data centers and at the edge. These startups are exploring a wide range of technologies, spanning advanced optical systems and purpose-built edge AI chips, to drive the next era of AI computing.

Celestial AI Based in Santa Clara, Celestial AI is reinventing how data moves inside AI systems with its Photonic Fabric, an optical interconnect that promises to crush bandwidth and latency limits. The company raised $175 million in a hotly pursued Series C round with backing from AMD, Samsung, and Porsche SE. Already seeing interest from major hyperscalers, Celestial AI bolstered its edge by acquiring key IP from Rockley Photonics, making it a major force in silicon photonics for AI infrastructure.

SiMa.ai SiMa.ai, based in San Jose, delivers a software-first machine learning system-on-chip (MLSoC) platform designed specifically for edge AI. Its Modalix chip family supports everything from computer vision to large multimodal models—all with 10x performance-per-watt improvements over existing solutions. With $270 million in total funding and recent partnerships with Lanner, Arrow, Supermicro, and Cvedia, SiMa.ai has positioned itself as the go-to solution for efficient, scalable AI at the edge.

Enfabrica Enfabrica, headquartered in Mountain View, is building ultra-fast connectivity solutions tailored for AI workloads. Its flagship chip, the Accelerated Compute Fabric SuperNIC, supports massive throughput—up to 3.2 TB/s—along with robust Ethernet and PCIe capabilities. With a fresh $115 million Series C fundraise, Enfabrica is gearing up to roll out its game-changing silicon to high-performance GPU clusters in data centers.

Hailo Tel Aviv-based Hailo is building low-power AI chips optimized for GenAI workloads in devices like PCs and smart vehicles. Its latest Hailo-10 chip pushes high-performance AI processing with minimal power draw. With $120 million raised and tie-ups with Raspberry Pi, Adlink, and SolidRun, Hailo is fast becoming a key player in cost-effective AI acceleration at the edge.

Groq Groq, another Mountain View startup, is focused on AI inference at blazing speeds. Its unique language processing unit (LPU) powers both cloud-based and on-premise AI deployments. With $640 million secured in Series D funding and a $2.8 billion valuation, Groq is scaling fast. The company has signed strategic deals with Carahsoft and Aramco to bring its compute power to the public sector and massive data centers across the Middle East.

April 2025 OFC 2025: Photonics’ renaissance moment

Abstract

LightCounting discusses highlights from the event

Last year’s OFC was a show that raised many open questions, with AI being at the forefront. At OFC 2025, AI’s impact was evident across the conference with advances reported in architectures, optical pluggables, and components.

The AI buzz permeated the show, drawing a huge attendance of 16,700, up by a third (34%) on last year. The crowd included a notable presence of financial analysts, investors, and general media. There were 685 exhibitors (up from 630 at OFC 2024) occupying a total of 170,000 net square feet, the largest OFC exhibition since 2003.

Google’s Amin Vahdat gave a keynote that set the scene for Optica’s Executive Forum event and for OFC as a whole. Vahdat noted that AI computational demands have grown 10x each year for the last eight years, a 100 million-fold increase. He asked where the next 1,000x speed-up will come from, given that demand for computing continues to grow. Vahdat described this moment as a renaissance period for new architectures. Networking has become the bottleneck, he said, and the photonics community has a vital role to play, but it requires change. “Continuing to do the last thing incrementally better is not going to suffice,” said Vahdat.

Co-packaged optics (CPO) is a great example of the kind of non-incremental improvement Vahdat indirectly suggested. Are his colleagues at Google listening? Google’s team considered CPO as an option in the past and concluded that it creates more problems than it solves. That conclusion defined Google’s strategy of using optics sparingly, pairing it with optical circuit switches (OCSes) while increasing the bandwidth of pluggables. Adding more wavelength channels, higher lane rates, and more complex modulation formats is the direction defined by Google and widely adopted by the industry. Lower-cost coherent pluggable optics is next on Google’s roadmap, but is that the only way forward?

Broadcom and Nvidia, along with some of their customers and numerous startups, do not think so: if power-efficient, lower-cost, and more reliable optics become available, pluggables will not be the sole way forward.

The consensus at OFC 2025 was that CPO may be the only option for interconnects in scale-up networks. Copper will continue to advance and “we will see 1MW racks in the future, but the optics gives us the freedom” to scale-up over multiple racks, commented Loy Nguyen of Marvell. Leading customers are actively looking at a wide range of emerging optical technologies to make it happen.

Fotini Karinou shared Microsoft’s requirements for scale-up optics, shown in the upper part of the figure below. It allows for a generous 4pJ/bit energy efficiency. Dave Lazovsky of Celestial AI and Dave Welch of Attotude confirmed that this target is within reach of their solutions. Attotude is proposing to use THz waves over “a wire” for scale-up interconnects, but few details were disclosed.

Broadcom shared a more realistic roadmap for CPO, shown in the lower part of the figure, which reaches 5pJ/bit with “Advanced CPO” by 2029. The chart clearly shows current status and projected targets for all other interconnect options, including VCSEL NPO. IBM shut down their VCSEL CPO project last year, but Coherent continues to promote this approach and presented a lot of interesting data points at the event.

[Figure: Microsoft's scale-up optics requirements (upper) and Broadcom's CPO roadmap for interconnect options (lower)]
Lightmatter finally unveiled its technology for scale-up and scale-out AI architectures, while Celestial AI discussed how its technology is expanding xPU memory and addresses scale-up networking challenges. Ayar Labs showed its first co-packaged optics chip using the UCIe die-to-die electrical interface. Meanwhile, Broadcom reported 50,000 hours of CPO operation on 32 systems and the company plans to reach 200,000 hours by the end of 2025.

Pluggable optics continue to advance, with lasers, modulators, and chips supporting 1.6T optics on display and a clear path shown to achieving 3.2-terabit optics. There were new coherent DSP entrants at OFC, reflecting the battle taking shape between long-reach high-speed direct-detect pluggables and ‘coherent-lite’ optical designs. The blurring of these categories is also reflected in companies such as Ciena and Cisco (Acacia) using their high-speed coherent chip know-how to address the data center marketplace. Other areas of note included OCSes and the first multi-core fiber transceiver designs.

Not all the announcements at OFC were AI-related: passive optical networking, space-based optical communications, and hollow-core fiber developments were all reported.

LightCounting acknowledges that its Research Note cannot reflect all the notable announcements and demonstrations at OFC, despite our broad analyst presence this year.

Full text of the research note is available to LightCounting subscribers at: lightcounting.com

stateofthefuture.substack.com

The State of Photonic Computing (2025): All You Need Is Memory

Apr 16, 2025

I’m Lawrence, a pleasure. I invest in people making the world better for my children. pre-seed/seed. deep tech/compute ideally. msg: lawrence@lunar.vc.






Me to my 5- and 4-year-old sons:

Me: “So all of the computers and smartphones you see around you use electricity to work. They have billions of tiny electronic switches called transistors that can be either 'on' or 'off' - which computers understand as 1s and 0s. These 1s and 0s help computers run maps, music, and the Internet. But these electrical switches create heat and use a lot of energy.

I want to give money to inventors who, instead of using electricity, want to use light to control these switches. Light travels at the speed of light. While electrical signals in computers go fast, they face resistance. Light doesn't create as much heat, and light signals can pass through each other without causing problems. If we can create circuits that use light instead of electricity, we could build computers that work faster, use less power, and don't get as hot.”

Them: But why can’t you just turn the light off really quickly and that would be the ones and zeros?

Me: Erm… good question, that’s sort of like how it is, let’s ask my friend Claude…

Them: Could we build a robot and make the eyes power the brain?

Me: Erm… good question, well if you shrunk the solar panels small enough then maybe you could. I mean, I read something about ambient energy harvesting a while ago, maybe with miniaturised MEMS now, let me ask Claude…

Them: Could you put wireless chips on the chips and make it do Internet to?

Me: funnily enough, someone did invent that years ago, it’s called Li-Fi. But I don’t really remember what happened to that or if you could combine.. I tell you what let’s ask Claude

A year ago, home schooling was 100% limited by parental knowledge, but now… well, I’m not so sure.

Mainly to answer my kids’ questions, but also tangentially for work-related reasons, I interviewed some experts in optical computing to gain some “secret knowledge” that your LLM doesn’t have! (Until now! See below.)

This note covers:

  1. TLDR

  2. Thematic Analysis

    1. Theme 1: The Pivot from Computing to Networking

    2. Theme 2: Memory as the Critical Bottleneck

    3. Theme 3: Nonlinear Operations Pose Challenges

    4. Theme 4: Digital vs. Analog Optical Computing

    5. Theme 5: Manufacturability and Integration Challenges

  3. Consensus & Disagreement

  4. Unexpected Findings

  5. Industry Progress

  6. Timeline

  7. Investment Implications

  8. Request for Startups: Optical Memory



1. TLDR

For the Kinder

The Big Picture

  • Light is super fast and doesn't create much heat, which is great!

  • But making computers that use only light is REALLY tricky.

  • Most companies are using light for sending information between computers rather than for the thinking part.

The Big Challenges

  1. Memory Problem: We don't have good ways to store information using just light yet. Scientists are experimenting with things like glass and DVDs to store it, but right now these wear out too quickly, like pencil instead of pen.

  2. Light Doesn't Like to Change Direction: Unlike electricity, light mostly wants to travel in straight lines. To make computers work, we need signals that can change and interact in complex ways.

  3. Making These Tiny Light Parts: Building the tiny light switches is much harder than making electronic ones.

For the Adults

  • Photonics offers fundamental advantages in transmission speed and thermal efficiency compared to electronic computing, but faces significant technical barriers to full implementation.

  • Industry trends show a strategic pivot from optical computing to optical networking and interconnect applications where immediate commercial value can be realized.

  • Opto-electronic hybrid chips represent a crucial transitional architecture on the path to all-optical computing. These hybrid approaches use photonics for data movement and specific operations where light excels (like matrix multiplication for AI), while using electronics for nonlinear functions and memory access, offering a compromise to deliver performance gains while the ecosystem matures.

  • Three technical challenges stand in the way of all-optical computing:

    • Memory Integration: Optical computing lacks workable memory solutions. Current phase-change materials wear out after only 10,000-100,000 write cycles, while electronic memory lasts for quadrillions (10^16+) of cycles. This fundamental memory problem prevents practical all-optical computers from being built, as any useful computing system needs reliable, long-lasting memory to store and retrieve data between operations.

    • Nonlinear Operations: While implementing nonlinear functions in optical systems presents challenges, recent research demonstrates promising paths forward. All-optical neural network training has been achieved using simple nonlinearities like saturable absorption and optical amplifier properties that can be implemented with a variety of materials. For AI inference especially, this is less problematic since matrix multiplication operations dominate the workload. The challenge now lies in scaling these techniques to commercial systems with appropriate energy efficiency and manufacturing yields.

    • Manufacturing Complexity: Photonic integrated circuits face challenges in manufacturability and integration, though these are not insurmountable. Free-space optical approaches benefit from established manufacturing ecosystems that already produce high volumes of optical products like projectors, switches, microscopes, and interferometers. While scaling these approaches to computing densities introduces different alignment considerations, substantial manufacturing experience exists. Meanwhile, TSMC's entry into silicon photonics manufacturing will accelerate ecosystem maturation for waveguide-based approaches.

  • Near-term commercial opportunities lie in data center interconnects (1-3 years), with specialized accelerators in the mid-term (3-5 years), while general-purpose optical computing remains an unknown requiring breakthroughs in optical memory.


2. Thematic Analysis

Theme 1: The Pivot from Computing to Networking

Key Quotes:

"Everyone at some point has said they'll do processing... Lightmatter started like that, Light Intelligence, I think. Everyone knows it’s the big vision."

"I think from what I hear in the telecom industry, surprisingly, and the datacom industry seems negative to all optical computing."

Most pioneering photonics companies have strategically pivoted from computing to networking applications, following a clear market signal. This widespread shift reflects technical barriers and practical market realities. Companies like Lightmatter, Lightelligence, and Celestial AI began with ambitious visions of all-optical computing but systematically redirected toward optical interconnects where they found market pull. This pivot isn't merely opportunistic: it's a recognition that networking applications leverage photonics' inherent strengths in data transmission while sidestepping the most challenging technical barriers in optical computing.

The economics are compelling and straightforward: while all-optical computing remains largely speculative, optical networking already demonstrates measurable performance and efficiency advantages over electronic alternatives. Data center bandwidth demands, driven by massive growth in AI, have created an immediate market need that photonics addresses.

This market-driven approach allows companies to commercialize their core technologies and establish revenue streams while the broader ecosystem matures. Many view networking as a strategic stepping stone—building manufacturing capability, supply chains, and customer relationships that may eventually support a return to more ambitious computing applications once fundamental barriers are overcome.

Theme 2: Memory as the Critical Bottleneck

Key Quotes:

"The big issue is memory. Being able to run large algorithms and get back and forth to memory in a reliable way."

"Volatile memory is not easy... We want to move out of the read-only memory region, and have instead of gigabytes of read-only memory, gigabytes of volatile memory. That is a big challenge."

The absence of practical optical memory represents the single most significant barrier to viable all-optical computing. While electronic memory has benefited from decades of refinement and manufacturing scale, optical memory remains largely theoretical or confined to laboratory demonstrations.

The challenge is fundamental: reliably storing and retrieving information optically requires solutions that combine sufficient density, speed, endurance, and manufacturability. Phase change materials (PCMs)—similar to those used in rewritable DVDs—represent the most promising approach, but current implementations hit a critical endurance wall, typically failing after 10,000 to 100,000 write cycles. This falls dramatically short of the billions of cycles required for practical computing applications.
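To make the endurance gap concrete, here is a back-of-envelope sketch in Python; the 1 MHz per-cell write rate is an assumption chosen for illustration, not a figure from the interviews:

```python
# Back-of-envelope cell lifetime at a given write rate, comparing
# PCM-class endurance (~1e5 cycles) with electronic-memory-class (~1e16).
def lifetime_seconds(endurance_cycles: float, writes_per_second: float) -> float:
    return endurance_cycles / writes_per_second

WRITE_RATE = 1e6  # assumed: 1 MHz of writes hitting a single cell

pcm = lifetime_seconds(1e5, WRITE_RATE)
electronic = lifetime_seconds(1e16, WRITE_RATE)

print(f"PCM cell: {pcm} s")                                 # 0.1 s
print(f"Electronic cell: {electronic / 3.15e7:.0f} years")  # ~317 years
```

Even at modest write rates a PCM cell would wear out in a fraction of a second, which is why read-mostly workloads are the natural near-term fit.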

There are three responses to this limitation:

  1. Architectural workarounds that minimize memory access requirements

  2. Application constraints focusing on read-only applications like small inference workloads where memory writes are infrequent

  3. Hybrid approaches that accept the conversion overhead of interfacing with electronic memory

Digital optical approaches show promise in implementing memory through optical feedback loops, but these face significant scaling challenges of their own. Until robust optical memory solutions emerge with sufficient endurance and density, truly all-optical computing will remain confined to specialized applications where the memory bottleneck can be circumvented through clever system design or application constraints.

Theme 3: Nonlinear Operations Pose Challenges

Key Quotes:

"They couldn't get rid of the main bottleneck, which was the electrons... The trick with nonlinear photonics is if you try to create an analog system, it becomes extremely difficult, because everything has to be so finely tuned."

"This is getting very difficult to scale because people are taking the same approach that ends out being a large matrix of hardware."

"You can do matrix multiplication. But you still have to fetch from memory somewhere. So there's the fetching. And you can't do activation functions."

Implementing nonlinear operations in the optical domain presents challenges, though recent research shows promise. Unlike electronics, where nonlinearity is inherent to semiconductors, light typically follows linear principles in conventional materials. However, researchers have demonstrated various approaches to optical nonlinearity, including using saturable absorption and optical amplifier nonlinearity for all-optical neural network training.

Nonlinear operations are essential for modern computing—particularly neural networks. Without nonlinear activation functions like ReLU or sigmoid, neural networks would collapse into simple linear models with severely limited capabilities. Even basic Boolean logic requires nonlinear responses.
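The collapse into a linear model is easy to demonstrate numerically; a small NumPy sketch (illustrative, not from the interviews):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
W1 = rng.normal(size=(8, 8))
W2 = rng.normal(size=(8, 8))

# Two stacked linear layers are exactly one linear layer with weights W1 @ W2.
assert np.allclose((x @ W1) @ W2, x @ (W1 @ W2))

# A ReLU between them breaks that equivalence: no single matrix reproduces
# the composed map, which is what gives deep networks their expressive power.
relu = lambda z: np.maximum(z, 0)
assert not np.allclose(relu(x @ W1) @ W2, x @ (W1 @ W2))
print("linear layers collapse; nonlinearity prevents it")
```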

The industry has developed several approaches to address this challenge, each with different trade-offs:

  1. Material-based nonlinearities using saturable absorption, optical amplifiers, and other effects that can be implemented across a variety of material systems with demonstrated success in laboratory settings.

  2. Hybrid electro-optical systems that convert to electronics for nonlinear functions, accepting conversion overhead in exchange for system simplicity.

  3. Digital optical approaches that implement nonlinearity through discrete optical components, facing scaling limitations due to component size.

While laboratory demonstrations show the feasibility of all-optical nonlinear operations, scaling these solutions to commercial systems with appropriate energy efficiency, speed, and manufacturing yield remains an active research area. This explains why successful optical computing implementations have focused on operations where photonics naturally excels—like matrix multiplication and Fourier transforms—while often delegating nonlinear operations to electronic components in current commercial approaches.

For AI inference applications specifically, this challenge is less problematic since matrix multiplication dominates the computational workload while activations represent a smaller portion, allowing specialized photonic architectures to deliver significant performance advantages even with hybrid approaches to nonlinearity. Lightmatter is likely SOTA here; per its latest paper, the chip can run ResNet, BERT, SegNet, and Atari reinforcement-learning games like Pac-Man. However, the system struggles with regression tasks requiring high precision (achieving only 27.5% of standard performance), suffers from optical power constraints due to nonlinear absorption in silicon waveguides, and can only accommodate relatively small models within its 268MB memory without partitioning across multiple units.
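A rough FLOP count shows why inference favors this split; the layer width below is hypothetical, chosen only to illustrate the ratio:

```python
# FLOP split for one hypothetical feed-forward block: width d projected up to
# 4*d and back down, with one elementwise activation on the hidden units.
d = 4096
matmul_flops = 2 * (2 * d * (4 * d))  # two projections, 2*m*n FLOPs each per token
act_flops = 4 * d                     # one elementwise op per hidden unit per token

print(matmul_flops // act_flops)  # 16384: matmul outweighs activation ~16,000x
```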

Theme 4: Digital vs. Analog Optical Computing

Key Quotes:

"The trick with nonlinear photonics is if you try to create an analog system, it becomes extremely difficult... That's one of the reasons why we decided to go into the digital domain."

"This is the same reason that D-matrix and Rain started as analog in memory compute and moved to digital. Analog is just really hard for noise, precision, and error correction."

A fundamental philosophical divide separates two distinct approaches to optical computing: analog systems that harness the continuous properties of light versus digital systems that implement discrete binary logic. Analog optical computing offers remarkable theoretical advantages—it can utilize multiple physical dimensions of light simultaneously (amplitude, phase, polarization, wavelength) to perform complex operations like matrix multiplication in a single physical step with minimal energy. However, these systems are inherently susceptible to noise, manufacturing variations, and environmental factors like temperature changes that can significantly degrade computational precision.
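The precision cost of analog noise can be illustrated with a toy NumPy model; the 5% noise level and the effective-number-of-bits rule of thumb are illustrative assumptions, not data from the interviews:

```python
import numpy as np

# Toy model: an analog optical matrix multiply returns the exact product plus
# Gaussian noise (detectors, thermal drift). The 5% noise level is made up.
rng = np.random.default_rng(1)
A = rng.normal(size=(64, 64))
x = rng.normal(size=64)

exact = A @ x
noise = rng.normal(scale=0.05 * exact.std(), size=exact.shape)
noisy = exact + noise

snr_db = 10 * np.log10(np.var(exact) / np.var(noise))
eff_bits = (snr_db - 1.76) / 6.02  # ADC rule of thumb: effective number of bits
print(f"SNR ~{snr_db:.0f} dB, ~{eff_bits:.1f} effective bits")
```

A few percent of analog noise already caps the system at a handful of effective bits, which is why error correction and calibration dominate analog designs.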

Digital optical approaches instead implement binary logic using optical components—trading some of photonics' natural advantages for greater resilience and manufacturability. This approach more closely resembles conventional electronic computing architecture but faces scaling challenges due to the diffraction limit that constrains how small optical components can be made.

This division reflects deeper considerations beyond technical tradeoffs. Analog approaches may offer faster paths to specialized accelerators for specific high-value functions, while digital approaches could eventually enable more general-purpose optical computing. The industry hasn't reached consensus on which path will ultimately prove more viable, with both approaches showing promise for different application domains.

The tension mirrors the early history of electronic computing, which initially used analog approaches before standardizing on digital architectures due to their superior reliability and programmability. Whether optical computing will follow the same evolutionary path remains an open question.

It’s important to map these paradigms to the hardware stack. At the device layer, optical modulators and detectors behave very differently in analog versus digital regimes. Control circuitry, error correction, and software abstraction layers also vary widely depending on which model is chosen. This divergence cascades upward into toolchain design, developer experience, and achievable application types.

Theme 5: Manufacturability and Integration Challenges

Key Quotes:

"The main challenge for any photonic technology is to validate high enough reliability. You can make nice prototypes and proof of concepts, but to create a product, you need to demonstrate that your technology can operate for three to five years or more."

"The biggest challenge is that the design tools are not representative of manufacturing defects... There might be design decisions which seem reasonable in simulation but just cannot work with our fabrication stack."

Beyond fundamental physics challenges, practical considerations in manufacturing, packaging, and system integration create barriers to commercial viability for photonic computing, though these are not without established solutions in certain domains.

Photonic integrated circuits present distinct manufacturing considerations compared to their electronic counterparts. Nanometer-scale variations in waveguide dimensions can alter optical performance, affecting yield and production costs. However, the optical industry has well-established manufacturing approaches in several domains. Free-space optical systems benefit from mature manufacturing ecosystems that already produce high volumes of optical products including projectors, switches, microscopes, and interferometers—demonstrating that optical manufacturing can achieve commercial scale and reliability.

The challenges differ depending on the approach. Integrated photonics requires precise waveguide fabrication, while free-space optics faces alignment and packaging considerations at computing densities. Both approaches require interfaces between optical and electronic domains with precise alignment, stable thermal management, and specialized packaging techniques.

Design tools for photonics are still evolving compared to electronic design automation tools, though significant progress has been made. Test infrastructure is developing rapidly, with companies establishing specialized equipment and procedures for photonic components.

TSMC's entry into silicon photonics manufacturing represents a pivotal development that will drive standardization and economies of scale similar to those that transformed electronic semiconductor manufacturing. This will accelerate the industry's progress toward manufacturability for waveguide-based approaches, while free-space systems can leverage existing optical manufacturing expertise.

These manufacturing and integration considerations have direct implications for which applications become commercially viable first. Applications with higher performance requirements can tolerate higher initial costs and may reach commercial viability earlier, while mass-market applications will benefit from further manufacturing maturation to achieve optimal price points.


3. Consensus & Disagreement

Points of Strong Agreement:

  1. Data center is the primary near-term market driver Most interviewees agreed that AI and data center applications provide the strongest market pull for photonic technologies, with bandwidth limitations and power constraints creating demand for optical solutions.

  2. Interconnects are gaining market traction faster than computing Nearly all participants acknowledged that photonic interconnects and networking are finding commercial applications more rapidly than optical computing, with several major players making significant investments.

  3. Silicon photonics manufacturing is maturing rapidly There was broad consensus that the industry is seeing significant improvement in silicon photonics manufacturing capabilities, with TSMC's entry being particularly significant.

Notable Contradictions:

  1. Viability of all-optical computing While most interviewees expressed skepticism about near-term all-optical computing, some companies remain confident that fundamental barriers can be overcome, particularly through digital optical approaches rather than analog ones.

  2. Target applications and architectures Companies diverge significantly on the best applications for photonic computing/processing, from general-purpose computing to highly specialized accelerators for specific functions.

  3. Memory solutions Different approaches to the memory challenge were advocated, from pure optical memory research to clever architectures that minimize memory access or leverage different memory hierarchies.

  4. Edge vs. Data Center focus While most companies target data center applications, there were significant disagreements about the viability of edge optical computing or processing, with some seeing it as a promising direction and others viewing it as having insufficient market pull.

Explanatory Factors:

The divergence in perspectives appears driven by:

  • Technical background (electronic vs. optical engineering)

  • Business model considerations (need to demonstrate near-term revenue vs. pursuing longer-term breakthroughs)

  • Investment environment (pressure to align with AI narrative)

  • Specific technical approaches (digital vs. analog; specialized vs. general-purpose)

4. Unexpected Findings

  1. Defense/EMI resilience applications. Several interviewees noted that optical computing's immunity to electromagnetic interference makes it particularly valuable for defense applications and environments with high EMI, a use case that isn't frequently discussed in public materials.

  2. Batch processing advantage for larger models. Contrary to the intuition that smaller AI models would be better suited to optical processing (due to memory constraints), one company explained that their approach actually works better with larger models because "compute scales quadratically and the conversion overhead linearly."

  3. Active interposers as computational elements. A vision where optical interposers could eventually become active computational elements rather than just passive interconnects was described, blurring the boundary between networking and computing in an unexpected way.

  4. DVD technology as optical memory precedent. This was news to me and a cool fact. Some pointed to DVD-R/RW technology as evidence that optical memory is viable, noting that the phase-change materials used in rewritable optical discs could be adapted for on-chip optical memory, providing a read-speed advantage over electronic memory. But endurance is the watchword here.
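The quadratic-versus-linear claim in point 2 can be sketched directly; the constants below are invented purely for illustration:

```python
# If optical compute per layer scales ~O(n^2) with model width n while
# electro-optical conversion overhead scales ~O(n), the overhead fraction
# shrinks as models grow, favoring larger models on optical hardware.
def overhead_fraction(n: int, k_compute: float = 1.0, k_convert: float = 50.0) -> float:
    compute = k_compute * n * n   # quadratic in width
    convert = k_convert * n       # linear in width
    return convert / (compute + convert)

for n in (256, 1024, 4096):
    print(n, round(overhead_fraction(n), 3))  # fraction falls as n grows
```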

5. Industry Progress

The industry has been progressing slowly, and it’s reached a point now where you have to look around and say, oh yeah, it’s 100% happening, especially with recent progress on the fab and Nvidia sides.

TSMC’s Silicon Photonics Manufacturing Roadmap

TSMC has expanded its silicon photonics initiatives by introducing the COUPE (Compact Universal Photonic Engine) platform, aiming to deliver 12.8 Tbps on-package interconnects. This development enhances data transfer rates between processors and memory, addressing the growing demands of AI and high-performance computing workloads.

Broadcom’s Hybrid Electro-Photonic Strategy

Broadcom has advanced its Co-Packaged Optics (CPO) technology with the introduction of the Bailly switch, a 51.2 Tbps Ethernet switch that integrates silicon photonics-based optical engines with the Tomahawk 5 switch chip. This design achieves a 70% reduction in power consumption compared to traditional pluggable transceiver solutions, effectively addressing data center challenges by enhancing bandwidth density and reducing energy requirements.
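For a sense of scale, the claimed 70% reduction can be turned into watts; the 15 pJ/bit figure for pluggable transceivers is an assumption for illustration, not a number from Broadcom:

```python
# Hypothetical illustration of a 70% power cut at 51.2 Tbps switching capacity.
rate_bps = 51.2e12
pluggable_pj_per_bit = 15.0                      # assumed, for scale only
cpo_pj_per_bit = pluggable_pj_per_bit * (1 - 0.70)

pluggable_w = rate_bps * pluggable_pj_per_bit * 1e-12
cpo_w = rate_bps * cpo_pj_per_bit * 1e-12
print(f"{pluggable_w:.0f} W -> {cpo_w:.0f} W of optics power per switch")
```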

NVIDIA’s Data Center Optics Initiative

At GTC 2025, NVIDIA unveiled the NVL576 rack, powered by the Vera Rubin Ultra SuperChips, marking a significant advancement in high-density, liquid-cooled AI systems. This system incorporates co-packaged optics to improve power efficiency and bandwidth, supporting the scaling of AI data centers.

Intel’s Integrated Optical I/O Chiplet

Intel has demonstrated the industry’s first fully integrated optical compute interconnect (OCI) chiplet co-packaged with an Intel CPU, running live data. This advancement is expected to transform high-speed data processing by enabling co-packaged optical I/O in emerging AI data center and high-performance computing infrastructure.

These developments underscore the accelerating integration of photonic technologies into mainstream computing, driven by the need for higher bandwidth, lower latency, and improved energy efficiency in data centers.

6. Timeline

Based on the 2024-2025 progress, I’ve accelerated my timeline on photonics networking and opto-electronic computing. All-optical still requires a breakthrough that I can’t put a timeline on.

  • Near-term (1-3 years): Optical networking and interconnects for data centers represent the clearest commercial opportunity. Companies in this space are already generating significant revenue, particularly those enabling AI infrastructure scaling. Broadcom's CPO roadmap confirms this trajectory, targeting 51.2 Tbps switching capacity through hybrid electro-photonic integration. NVIDIA's recent focus on optical technologies for GPU clusters further validates the immediate value proposition of photonic interconnects for AI workloads.

  • Mid-term (3-5 years): Co-packaged optics and chip-to-chip optical communication will gain significant traction as TSMC's silicon photonics manufacturing capabilities mature (expected by 2025-2026). Broadcom's progression toward 102.4 Tbps switching through increasingly integrated photonic approaches exemplifies this timeline. Specialized optical accelerators for specific functions (like FFT acceleration and matrix multiplication for neural networks) will emerge as manufacturing yields improve and integration challenges are addressed.

  • Long-term (7+? years): 7 years is long term now lol. But broader optical computing applications remain dependent on solving fundamental technical challenges, particularly around optical memory. While major players are investing in photonic R&D, their current roadmaps focus primarily on interconnect applications rather than full computing replacement. Intel, Meta, Google, and Microsoft deployments of optical networking infrastructure provide a foundation for eventual computing applications, but their published strategies confirm the industry consensus that networking applications will precede computing ones.
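On why matrix multiplication is the natural first target for optical acceleration: its arithmetic intensity (FLOPs per byte moved) grows with problem size, so the fixed cost of entering and leaving the optical domain gets amortized. A quick sketch (sizes illustrative):

```python
# Arithmetic intensity of an n x n matrix multiply: compute grows as n^3
# while data movement grows as n^2, so intensity grows linearly in n.

def matmul_intensity(n, bytes_per_elem=4):
    """FLOPs per byte moved for C = A @ B with n x n matrices."""
    flops = 2 * n**3                       # n^3 multiply-add pairs
    data = 3 * n * n * bytes_per_elem      # read A and B, write C
    return flops / data

for n in (64, 1024, 16384):
    print(n, round(matmul_intensity(n), 1))
```

High-intensity kernels like this can profitably cross an expensive domain boundary; low-intensity ones (most general-purpose code) cannot, which is the core of the specialized-accelerator thesis.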

7. Investment Implications

  1. Optical memory breakthroughs represent transformative potential: Companies developing next-generation optical memory solutions, particularly advanced phase change materials like Ge₂Sb₂Te₅ (GST), As₂S₃, and Sb₂Se₃, could unlock the broader potential of optical computing. Research from MIT, IBM Zurich, and Oxford shows promising approaches to overcome the endurance limitations that have historically constrained optical memory, with recent demonstrations achieving up to 10⁸ switching cycles, though still below the 10¹⁶+ cycles of electronic memory.

  2. Optical networking represents lower risk: Companies enabling AI data center scale-out through optical networking solutions have clearer paths to market and fewer technical barriers. The market for optical transceivers alone is projected to exceed $12B by 2026, driven largely by AI infrastructure demands, with companies like Celestial AI and Ayar Labs already securing significant commercial partnerships.

  3. Specialized accelerators over general computing: Targeted optical acceleration of specific functions (e.g., FFT, matrix multiplication) is more viable than general-purpose optical computing in the near term. These specialized approaches can deliver 10-100x improvements for specific operations while sidestepping the nonlinearity and memory challenges that plague general-purpose optical computing, creating discrete market opportunities in areas like encryption, scientific computing, and specific AI inference workloads.

  4. Watch manufacturing maturity: TSMC's commitment to silicon photonics will significantly improve the ecosystem, potentially enabling new applications as manufacturing capabilities mature. The economics of photonic integrated circuits follow similar scaling laws to electronic semiconductors, suggesting that once volumes increase and processes standardize, costs could decrease dramatically – creating opportunities for companies positioned to leverage these manufacturing improvements.

  5. European sovereignty: Companies that can position photonic technologies as strategic for technological independence may find additional support; this is especially true now that Trump’s tariffs have blown up the idea of a global supply chain. The EU’s €100+ million investment in photonics through Horizon Europe and national initiatives reflects a strategic commitment to developing sovereign capabilities in this technology, creating funding and partnership opportunities.
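The endurance gap in point 1 translates directly into device lifetime. A quick conversion, where the 1 kHz per-cell write rate is an illustrative assumption of mine:

```python
# Convert write-cycle endurance into a per-cell lifetime at a given write
# rate. The cycle counts come from the text; the write rate is assumed.

SECONDS_PER_YEAR = 365 * 24 * 3600

def lifetime_years(endurance_cycles, writes_per_second):
    """Years until a cell exhausts its write endurance at a steady write rate."""
    return endurance_cycles / writes_per_second / SECONDS_PER_YEAR

# At a modest 1,000 writes/second to the same cell:
print(lifetime_years(1e8, 1e3))    # demonstrated optical PCM: about a day
print(lifetime_years(1e16, 1e3))   # electronic memory: effectively forever
```

Wear-leveling can spread writes across cells, but an eight-order-of-magnitude gap is why endurance is the make-or-break metric for optical memory.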



Request for Startups: Optical Memory

Optical memory represents perhaps the most compelling opportunity in the entire photonics landscape—a genuine "holy grail" that could catalyze widespread adoption of photonic computing. By eliminating the need to constantly convert between optical and electronic domains for data storage, breakthrough optical memory solutions would unlock dramatic improvements in both performance and energy efficiency across the computing stack. Today’s opto-electronic chips all find clever ways to avoid going to memory, or limit themselves to workloads small enough to fit in something like 256MB of on-chip memory.
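That ~256MB constraint translates into model-size limits roughly as follows (a sketch; the budget comes from the text, the precisions are standard sizes):

```python
# How large a model fits entirely on-chip, avoiding the electro-optic
# memory wall? Counts parameters only, ignoring activations and overhead.

BUDGET_BYTES = 256 * 1024**2   # the ~256MB on-chip budget from the text

def max_params(bytes_per_param):
    """Maximum parameter count that fits in the on-chip budget."""
    return BUDGET_BYTES // bytes_per_param

for name, b in [("fp32", 4), ("fp16", 2), ("int8", 1)]:
    print(f"{name}: ~{max_params(b) / 1e6:.0f}M parameters")
```

Even at int8 that caps out well below modern LLM scale, which is why the memory problem, not raw optical compute, gates the broader opportunity.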

The most promising R&D directions span multiple technological approaches. Phase-change materials (PCMs) like chalcogenide compounds offer one path forward, with researchers working to overcome endurance limitations while maintaining rapid switching times. Alternative approaches include engineered photonic crystal cavities that can trap light in stable resonant states, magneto-optical materials that combine magnetic and optical properties for non-volatile storage, and innovative waveguide structures incorporating quantum dots or rare-earth dopants for multi-state storage capabilities.

These diverse approaches share common challenges around material stability, switching energy, and silicon photonics compatibility, but each offers distinct advantages that could prove decisive.

I don’t know which approach offers the optimal trade-offs. But I want to speak to anyone who thinks they have a solution for solving optical memory and making all-optical computing possible for the first time.

Appendix

This analysis combined secondary research with primary interviews with leaders across the photonic computing ecosystem, including but not limited to:
