From Tom's Hardware, April 16:
How two companies are using novel approaches to photonic qubits.
This article is part of a series documenting quantum computing technologies and their ecosystem – the differing approaches, the key players behind them, and the technologies driving us toward a quantum future. Part one looked at superconducting qubits (embodied by industry giants such as IBM and Google) and trapped-ion qubits (through IonQ and Quantinuum).
In this second part, we’ll be looking at quantum photonics – a light-based technique of defining the quantum unit of computation, the qubit. We’ll take a brief look at the what and the why of quantum photonics, and then make it concrete by focusing on two particular companies, their roadmaps, and their technologies: Toronto-based Xanadu Quantum Technologies (which is making a play for a public Nasdaq listing in the first quarter of 2026 at an estimated $3.6 billion enterprise valuation through a SPAC deal); and the Palo Alto, California-headquartered PsiQuantum (PSIQ.PVT, with an estimated $7 billion valuation buoyed by a $1 billion Series E funding round in late 2025).
Like our previous roadmap analysis, this won’t be a deeply technical article; it’s a technology and roadmap analysis that offers digestible bites on the underlying technologies, their roadmap evolution, current state, and expected next steps. For a better understanding of what quantum computing is all about, Tom’s Hardware has a more explanatory quantum computing article you can familiarize yourself with first.
What is Quantum Photonics?
To answer what quantum photonics actually is, we have to start with the basics: photonics is the use of light to transmit encoded information. The most widespread application of photonics that’s already part of our infrastructure today is the fiber optic cable: within it, light travels at a substantial fraction of its vacuum speed (which matters for latency) and, crucially, without energy losses to electrical resistance. Because light can contain multiple wavelengths (think colors, ranging through the visible spectrum and beyond), information in fiber optic cables can be encoded on multiple wavelength channels within the same fiber (a technique known as multiplexing) for increased bandwidth.
This classical approach to photonics uses billions of photons (the elementary unit of light) in coherent beams, recruiting other properties such as phase and polarization as data carriers. Classical photonics is already a well-known quantity, with applications spanning intercontinental information transit, data center interconnects, and, more locally, inter-chip communication.
The transition towards the quantum realm occurs when you stop looking at light as a beam and focus on the singular elements that compose it: photons. Quantum photonics, then, makes use of single-photon sources and single-photon detectors to encode and decode information through the specific strengths of quantum properties: entanglement (where two entangled photons behave as one coherent system) and superposition (where the universe of possible information values can be contained in a single qubit until it is measured).
This brings us to the great differentiator in current quantum photonics: the way operations are run on individual photons, and how information is encoded within them. PsiQuantum uses what’s known as a dual-rail encoding approach: informational states are derived from looking at a photon’s “choice” between path A (0) and path B (1) (these paths being known as waveguides). Xanadu approaches it through the lens of continuous-variable encoding: instead of looking at the photon itself, it looks at the photon’s light field and how it’s distributed (across properties like amplitude and phase), ‘squeezing’ them (reducing uncertainty in the amplitude variable at the cost of increased uncertainty in phase) to encode data.
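To make the contrast concrete, here’s a toy numeric sketch of both encodings. Everything here is illustrative – the beamsplitter matrix and the squeezing parameter `r` are textbook stand-ins, not either company’s actual hardware parameters:

```python
import numpy as np

# Dual-rail encoding (PsiQuantum-style): one photon, two waveguides.
# |0> = photon in path A, |1> = photon in path B.
ket0 = np.array([1.0, 0.0])  # photon in waveguide A
# A 50/50 beamsplitter puts the photon into a superposition of both paths.
beamsplitter = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
superposed = beamsplitter @ ket0
print(np.abs(superposed) ** 2)  # equal probability in each path

# Continuous-variable encoding (Xanadu-style): squeeze the light field.
# Squeezing by r shrinks amplitude-quadrature noise by e^(-2r) at the
# cost of inflating phase-quadrature noise by e^(+2r).
r = 1.0                                 # illustrative squeezing strength
var_amplitude = 0.5 * np.exp(-2 * r)    # reduced uncertainty
var_phase = 0.5 * np.exp(+2 * r)        # increased uncertainty
print(var_amplitude * var_phase)        # uncertainty product is preserved
```

The point of the sketch: dual-rail information lives in *which path* a single photon takes, while continuous-variable information lives in *how noise is distributed* across a field’s amplitude and phase.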
These are two fundamentally different ways of obtaining the result of a photonics-based, large-scale, error-corrected quantum computer, each with its own set of engineering problems. The end-goal, however, is the same: when you can generate, manipulate, and measure individual photons, light stops being a mere transmission medium, and individual particles become the computational substrate itself.
Advantages, challenges, and the mechanics of photonic qubits
Quantum photonics is claimed to have some operational advantages over other approaches: unlike superconducting qubits, photons can be operated on at room temperature, theoretically reducing installation, running, and maintenance costs alike. The physical nature of photons also means that photonic qubits are less susceptible to environmental interference, such as electromagnetic noise and thermal fluctuations. Scaling-wise, photonics-based chips can leverage existing semiconductor manufacturing infrastructure, and the natural speed of light means that gate operations (a gate being the basic operation applied to one or more qubits on the way to a useful result) can in principle run faster than in other approaches, such as trapped ions.
There’s always an opportunity cost in each quantum approach, however. In PsiQuantum’s dual-rail approach, identical photons that can be reliably entangled are very hard to generate: minute differences in wavelength, polarization, and spatial mode destroy the indistinguishability that reliable entanglement depends on. Photon generation (usually accomplished by shining a laser through a crystal) is a probabilistic operation: sometimes no photon is generated; sometimes one is; and sometimes more than one.
All of this leads us to the harsh truth that in quantum photonics – particularly in its dual-rail design – it’s easy to lose the vast majority (often 99% or more) of generated photonic qubits, at generation or collection, before they ever get a chance to perform a useful computation. At that rate, fielding a 100-qubit photonic system means generating upwards of 10,000 photons. Everything else is lost.
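That arithmetic can be sanity-checked with a short Monte Carlo sketch. The 1% end-to-end survival rate is an illustrative assumption for the sake of the example, not a measured figure from either company:

```python
import random

random.seed(42)

# Illustrative assumption: 1% of photons survive generation, collection,
# and detection end-to-end (i.e. 99% photon loss).
SURVIVAL_PROB = 0.01
TARGET_QUBITS = 100

attempts = 0
survivors = 0
while survivors < TARGET_QUBITS:
    attempts += 1
    if random.random() < SURVIVAL_PROB:  # did this photon make it through?
        survivors += 1

print(f"{attempts} photons generated to field {survivors} usable qubits")
# On average, attempts ~= TARGET_QUBITS / SURVIVAL_PROB = 10,000
```

Run it and the attempt count lands in the ballpark of ten thousand, matching the article’s figure for a 100-qubit system under heavy loss.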
PsiQuantum’s way of operating on individual photons means there’s no informational backup of the sort you’d get when operating on classical light beams: when the photon is lost, everything is. You can amplify a beam of billions of photons, but you can’t do the same for a single one (a consequence of quantum mechanics known as the no-cloning theorem). And because photons are vanishingly small particles, a minute error in a photon’s directionality means the emitted particle can easily fail to be detected at the other end (think of how a small angular deviation at a bullet’s exit compounds into missing the bullseye).
Xanadu’s approach, on the other hand, sidesteps the requirement for photonic “perfection” at generation and is more tolerant to photon loss (the light fields don’t completely vanish on individual photon loss). But it does introduce different error correction challenges – errors are continuous (noise is present in amplitude and phase measurements), while PsiQuantum’s issues are discrete (photon present vs photon absent, resulting in discrete bit flips in calculations).
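The discrete-versus-continuous distinction can be sketched in a few lines. Both error rates here (5% photon loss, Gaussian noise of width 0.1) are made-up illustrative numbers, chosen only to show the shape of each failure mode:

```python
import random

random.seed(7)
TRIALS = 10_000

# Discrete errors (dual-rail): each shot either succeeds cleanly or the
# photon vanishes outright. Loss probability is an illustrative assumption.
LOSS_PROB = 0.05
discrete_failures = sum(random.random() < LOSS_PROB for _ in range(TRIALS))

# Continuous errors (CV): every shot survives, but each quadrature
# measurement is smeared by Gaussian noise of an assumed width.
SIGMA = 0.1
cv_errors = [random.gauss(0.0, SIGMA) for _ in range(TRIALS)]
worst = max(abs(e) for e in cv_errors)

print(f"dual-rail: {discrete_failures} of {TRIALS} shots lost entirely")
print(f"CV: no shots lost, but measurements smeared by up to {worst:.2f}")
```

Dual-rail failures are all-or-nothing events you must detect and route around; continuous-variable failures are small but present in every single measurement, which is why the two camps need different error-correction machinery.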
Clearly, the base technology of photonics can serve very different approaches. PsiQuantum bets that silicon photonics manufacturing can overcome the drawbacks of its dual-rail approach through scale and engineering precision, reducing errors and improving photon measurement reliability, while Xanadu hopes its intrinsically higher tolerance to process imperfections enables a faster timeline to quantum advantage. Or so they hope....
....MUCH MORE, they go deep.
Possibly also of interest, at Barron's: