The Photonic Pivot

Why AI Is Running Out of Electricity

We experience artificial intelligence as something almost immaterial — a chatbot on a screen, a model in the cloud, a stream of answers arriving without friction. Yet behind that illusion lies one of the most energy-intensive infrastructures ever built. Warehouses of processors draw gigawatts of power and rivers of cooling water, fed by supply chains stretching across continents. The intelligence may appear weightless; the machinery that produces it is anything but.

For decades, scaling computing meant adding more transistors and more machines. When demand rose, engineers expanded clusters, increased clock speeds and optimized software. The assumption was simple: computation was the scarce resource. Today, that assumption has inverted. Modern AI systems are no longer constrained primarily by how fast they can calculate, but by how fast they can move data — and how much heat that movement generates.

Cooling systems in Northern Virginia, power grids in Texas and new data-center corridors across the American Midwest are already operating near structural limits. Training a frontier AI model can consume as much electricity as a small town. Even running those models at scale threatens to outpace local energy capacity. The digital economy has collided with thermodynamics.

“We have reached the point where the cost of moving an electron is far greater than the value of the computation it performs. The physics of the electron has become the tax on the progress of human intelligence.”
— Dr. Bill Dally, Chief Scientist and Senior VP of Research, NVIDIA

What the industry calls a compute shortage is, in reality, an infrastructure crisis. The bottleneck has shifted from thinking to transport. Processors idle while waiting for data to arrive. Networks saturate long before arithmetic units do. The limiting factor is no longer the speed of logic, but the friction of electrons moving through metal. Engineers increasingly describe this barrier as the “thermal wall” — a point at which additional performance produces disproportionate heat and energy costs.

If the twentieth century was defined by mastering electrons, the next phase of computing may depend on abandoning them — at least for communication. Light, carried by photons rather than charged particles, offers a radically different physics: near-zero resistance, minimal heat generation and the ability to transmit enormous volumes of data simultaneously. What began as a telecommunications technology is now emerging as a candidate foundation for computation itself.

The Respectful End of the Copper Era

Silicon and copper are not failing technologies. They are victims of their own success. Modern processors contain tens of billions of transistors operating at astonishing speeds, yet their performance increasingly depends on signals traveling across microscopic wires that obey the same physical laws as century-old electrical grids.

Every electron moving through copper encounters resistance. That resistance converts energy into heat — the “heat tax” paid for computation. At small scales the cost is manageable. At the scale of hyperscale AI clusters, it becomes dominant. Billions of operations per second translate into megawatts of thermal output that must be removed continuously to prevent system failure.
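The scale of this heat tax can be sketched with a back-of-envelope calculation. All figures below (picojoules per bit for electrical links, per-GPU traffic, cluster size) are illustrative assumptions rather than measured values for any real system:

```python
# Back-of-envelope estimate of interconnect power in a large GPU cluster.
# All figures are illustrative assumptions, not vendor specifications.

ENERGY_PER_BIT_J = 5e-12            # ~5 pJ/bit, a common ballpark for electrical links
BANDWIDTH_PER_GPU_BPS = 900e9 * 8   # 900 GB/s of chip-to-chip traffic, in bits/s
NUM_GPUS = 100_000                  # size of a hypothetical hyperscale cluster

watts_per_gpu = ENERGY_PER_BIT_J * BANDWIDTH_PER_GPU_BPS
cluster_megawatts = watts_per_gpu * NUM_GPUS / 1e6

print(f"Interconnect power per GPU: {watts_per_gpu:.0f} W")
print(f"Cluster-wide interconnect power: {cluster_megawatts:.1f} MW")
```

Under these assumptions, data movement alone consumes megawatts before a single arithmetic operation is counted, which is why lowering the energy per bit matters as much as raising the operation rate.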

Equally problematic is congestion. In large GPU clusters, chips spend a significant portion of time waiting for data from neighboring processors or memory systems. The queues forming along copper interconnects resemble traffic jams on a highway network designed for a previous era. Adding more processors does not eliminate the bottleneck; it often intensifies it.

This phenomenon, sometimes called the interconnect crisis, marks a shift in computing architecture. For decades, improvements in transistor density compensated for communication inefficiencies. Today, the cost of moving information — across a chip, between chips and across data centers — dominates total energy consumption. The system is limited not by the engine, but by the roads.

Photonics: A Different Physics of Information

Photons behave fundamentally differently from electrons. They carry no electric charge, suffer negligible loss in optical media and generate minimal heat. Most importantly, they can travel long distances without degrading in speed or integrity, and multiple wavelengths can coexist within the same channel — effectively allowing parallel data streams in a single fiber.
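The bandwidth argument can be made concrete with a rough calculation. The channel count and per-wavelength rate below are illustrative assumptions, not figures for any specific product:

```python
# Wavelength-division multiplexing (WDM): independent data streams share one
# fiber by riding on different wavelengths. The figures are illustrative only.

NUM_WAVELENGTHS = 64        # parallel optical carriers in one fiber (assumed)
GBPS_PER_WAVELENGTH = 100   # modulation rate per carrier (assumed)

total_tbps = NUM_WAVELENGTHS * GBPS_PER_WAVELENGTH / 1000
print(f"Aggregate fiber capacity: {total_tbps:.1f} Tb/s")
```

A single strand of glass, in this scenario, carries more traffic than dozens of parallel copper links, with no crosstalk between the wavelength channels.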

This is why global telecommunications migrated to fiber optics decades ago. What is new is the attempt to bring that optical paradigm inside the computer itself — onto circuit boards, into processor packages and ultimately onto the chip.

“The electron is a social particle; it interacts with everything. That’s great for making a switch, but terrible for high-speed communication. Photons, on the other hand, are ‘antisocial’ — they don’t interact with each other, which is exactly why they can carry more data with zero heat.”
— Dr. David A.B. Miller, W.M. Keck Foundation Professor of Electrical Engineering, Stanford University

The emerging vision is not purely optical computing replacing electronics overnight. Instead, it is a hybrid architecture in which electrons handle logic operations while photons handle transport — and, in some cases, specialized forms of computation. This division of labor mirrors biological systems, where electrical signals process information locally while chemical or optical signals coordinate over distance.

Two Pathfinders: Rebuilding the Brain and the Nervous System

Within the United States, two companies have become emblematic of this transition, each attacking a different layer of the problem.

Lightmatter — Reinventing the Processor

Lightmatter is attempting something historically rare: redesigning the computational core itself. Its photonic processors use optical interference to perform matrix multiplications — the mathematical operations at the heart of neural networks — at extraordinary speed and energy efficiency.
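The principle behind interference-based matrix multiplication can be sketched numerically. In the toy model below, a mesh of Mach-Zehnder interferometers acts on optical field amplitudes as a unitary matrix, so light propagating through the mesh performs a matrix-vector product. The angles, input amplitudes and three-mode circuit are arbitrary illustrative choices, and real devices must also contend with loss, noise and calibration:

```python
import numpy as np

# Toy sketch: an interferometer mesh acts on optical field amplitudes as a
# unitary matrix, so propagation through it IS a matrix-vector product.
# This abstracts away real device physics (loss, noise, calibration).

def mach_zehnder(theta, phi):
    """2x2 unitary of one Mach-Zehnder interferometer (beam splitters plus phase shifters)."""
    return np.array([
        [np.exp(1j * phi) * np.cos(theta), -np.sin(theta)],
        [np.exp(1j * phi) * np.sin(theta),  np.cos(theta)],
    ])

# Embed a single MZI acting on channels (0, 1) of a 3-mode circuit.
U = np.eye(3, dtype=complex)
U[0:2, 0:2] = mach_zehnder(theta=0.7, phi=1.2)

x = np.array([1.0, 0.5, -0.3], dtype=complex)   # input field amplitudes
y = U @ x                                        # light propagating through the mesh
intensities = np.abs(y) ** 2                     # what the photodetectors measure

assert np.allclose(U.conj().T @ U, np.eye(3))    # the mesh is lossless (unitary)
print(intensities)
```

Larger meshes of such interferometers can, in principle, be programmed to realize arbitrary unitary matrices, which is the mathematical core that interference-based accelerators exploit.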

“If you look at the way AI is growing, we’re on a collision course with the power grid. Computers are using more and more of the world’s energy, and the reason is the heating of wires. Lightmatter is about using light to do the same things that we’ve been doing with electricity, but much faster and with much less energy.”
— Nicholas Harris, Founder and CEO, Lightmatter

In Altair terms, Lightmatter is trying to rebuild the brain’s core — processing information with a new physical substrate. If successful, such systems could dramatically reduce the energy cost per AI operation and enable models far larger than today’s hardware can sustain.

Ayar Labs — Rebuilding the Nervous System

Ayar Labs addresses a more immediate constraint: communication between chips. Instead of replacing processors, it replaces the copper links connecting them with optical I/O that transmits data as light directly from package to package.

“The bottleneck in AI isn’t how much you can compute, it’s how much data you can move between chips. We’ve reached the limit of what copper can do. Optical I/O is the only way to break the ‘bandwidth wall’ and allow AI clusters to scale into the next decade.”
— Mark Wade, Co-founder and CTO, Ayar Labs

If Lightmatter redesigns the brain, Ayar Labs redesigns the nervous system — allowing existing GPUs to communicate over far greater distances with far smaller latency and energy penalties. This approach may reach mass deployment sooner because it preserves current processor ecosystems while removing the scaling bottleneck.

Communication Over Computation

The historical narrative of computing emphasizes faster processors, but modern AI systems reveal a different reality: most energy is spent moving data rather than transforming it. Memory access, chip-to-chip transfers and network communication dominate power consumption in large clusters.

In practical terms, this means that doubling computational speed yields diminishing returns if communication channels remain unchanged. A processor capable of extraordinary performance is useless if it spends most of its time idle, waiting for information to arrive.
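This diminishing-returns effect follows directly from Amdahl's law. The sketch below assumes, purely for illustration, that 60 percent of a workload's wall-clock time goes to data movement:

```python
# Amdahl-style illustration: if a fixed fraction of wall-clock time is spent
# moving data, speeding up only the arithmetic yields diminishing returns.
# The 60% communication share is an illustrative assumption.

def effective_speedup(comm_fraction, compute_speedup):
    """Overall speedup when only the compute portion gets faster."""
    return 1.0 / (comm_fraction + (1.0 - comm_fraction) / compute_speedup)

COMM_FRACTION = 0.6   # assumed share of time spent on data movement

for s in (2, 4, 8, 1e9):
    print(f"compute {s:>10.0f}x faster -> system "
          f"{effective_speedup(COMM_FRACTION, s):.2f}x faster")
```

Under this assumption, even an infinitely fast processor cannot push the system past roughly 1.67x, because the communication share remains untouched; only faster interconnects move that ceiling.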

This insight reframes the race for AI supremacy. The decisive advantage may belong not to the nation or company with the fastest chips, but to those who can move information most efficiently across vast infrastructures.

Energy, Economics and the Sustainability of Intelligence

The implications extend beyond engineering. Energy availability is becoming a geopolitical variable in AI development. Regions with abundant electricity, favorable climate for cooling and stable grids gain structural advantages. Conversely, uncontrolled growth in data-center demand risks straining public infrastructure and provoking regulatory backlash.

Optical technologies promise substantial reductions in energy per bit transmitted and per operation performed. If realized at scale, they could slow the exponential growth of AI’s power consumption and make advanced systems economically sustainable. Without such improvements, the marginal cost of intelligence may become prohibitive.

The Geopolitics of the Photonic Stack

Control over this emerging infrastructure is unevenly distributed. American firms dominate AI software and system design. Yet the tools required to manufacture advanced chips — extreme ultraviolet lithography, precision materials and photonic integration expertise — depend heavily on European and Asian ecosystems.

“Photonics is to the 21st century what electronics was to the 20th. The Netherlands has a unique advantage; we control the entire value chain, from design to production.”
— Eelko Brinkhoff, CEO, PhotonDelta

“The integration of optics and electronics on a single chip is one of the most significant engineering challenges of our time.”
— Christophe Fouquet, CEO, ASML

This creates a new form of interdependence. The United States may lead in defining the demand for photonic computing, while Europe supplies critical machinery and Asia provides manufacturing scale. Strategic autonomy in the AI era will therefore depend not on any single nation’s capabilities but on the stability of this transnational supply chain.

A Gradual Transition, Not a Sudden Revolution

Despite the transformative rhetoric, photonic computing will not replace electronics overnight. The likely trajectory is incremental:

  1. Optical interconnects within data centers
  2. Hybrid electronic-photonic modules
  3. Specialized photonic accelerators
  4. Potentially broader optical computing architectures

Each stage reduces energy consumption and increases bandwidth while maintaining compatibility with existing software and hardware ecosystems.

The Age of Light Enters the Machine Room

Artificial intelligence is often framed as a competition of algorithms or data sets. Increasingly, it is a competition of physical infrastructures — power grids, fabrication plants, cooling systems and communication networks. The decisive breakthroughs may come not from new mathematical insights but from new ways of moving energy and information through matter.

If electrons defined the digital age, photons may define the age of artificial intelligence. The shift will be largely invisible to end users, yet it could determine which nations and companies can continue scaling intelligence without overwhelming their energy systems.

The future of AI may not hinge on teaching machines to think better, but on enabling them to communicate at the speed of light.

Illustration: Altair Media / AI-generated (pencil drawing)
Conceptual depiction of electronic and photonic chips connected by optical data links, symbolizing the shift from electrons to photons in AI infrastructure.


Further reading
The Age of Light — Meaning, Machines and the Physics of Intelligence explores how photonics, energy systems and physical infrastructure are reshaping artificial intelligence and global power in the 21st century.

Available worldwide on Amazon (Kindle Edition):
https://www.amazon.com/dp/B0GMXLX56T

