QuiX Quantum Demonstrates Substantial Error Reduction in Photonic Quantum Computing for the First Time

Insider Brief

  • QuiX Quantum has demonstrated “below-threshold” error reduction in a photonic quantum computer for the first time, suppressing physical errors to the level required for scalable, fault-tolerant applications.
  • Using a 20-mode photonic processor, the team implemented a photon distillation gate that reduced photon distinguishability errors by a factor of 2.2, delivering a 1.2× net error reduction even after accounting for the noise added by the gate itself.
  • The results suggest that combining photon distillation with quantum error correction could cut hardware requirements—potentially reducing the number of photon sources needed per logical qubit by a factor of four—while supporting scalable photonic quantum computing.

PRESS RELEASE – QuiX Quantum, a leading provider of photonic quantum computing hardware, today announced the first demonstration of “below-threshold” error reduction in a photonic quantum computer, suppressing physical errors to a level consistent with a scalable, fault-tolerant computer.

The success marks the first time a European company has demonstrated a production-ready approach to error reduction, a milestone for the photonics-based QuiX quantum computing platform. The work was carried out on the QuiX Bia™ Cloud Quantum Computing Service in collaboration with NASA’s Quantum Artificial Intelligence Laboratory, the University of Twente, and the Freie Universität Berlin.

Quantum information is fragile, and without error correction, errors quickly accumulate and corrupt the result of a computation. For this reason, the ability to control errors at the physical level is seen as an essential step for any competitive quantum computing platform, and experts increasingly consider the ability to handle such errors a key differentiator between technologies.

In order for such a protocol to be meaningful, it must meet two conditions: it must remove more errors than it introduces, and it must do so without disrupting the operation of the rest of the computer. QuiX is the first party in photonics to demonstrate a protocol that meets both requirements simultaneously. The findings are described in a paper, available at https://arxiv.org/abs/2601.05947, which is currently under peer review.

“Below-threshold reduction of physical errors has never before been demonstrated in a photonic quantum computer. This achievement marks an important step and puts QuiX Quantum at the forefront of the development of fault tolerance in photonic quantum computing,” said Stefan Hengesbach, CEO of QuiX Quantum. “We believe that the most effective strategy is to remove errors early rather than correct them later at great cost – and by demonstrating net error reduction on real devices, we have taken a key step in asserting European leadership in scaling quantum technologies to robust, large-scale applications.”

“This paper represents a significant leap forward in photonic quantum computing,” said David DiVincenzo, director of the Center for Theoretical Nanoelectronics at the Peter Grünberg Institute at Forschungszentrum Jülich. “Using multimode optical Fourier transformations, the authors have experimentally demonstrated an efficient photon distillation scheme that will significantly reduce the overhead required for a future photonic quantum processor.”

Photonic quantum computers use photons – particles of light – as carriers of information. On the optical chip, photons interfere with one another as a consequence of their quantum particle statistics. However, the sources that generate these photons are imperfect, and any which-path information the photons carry degrades this interference, resulting in distinguishability errors.
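As a concrete illustration (not taken from the release): in a Hong–Ou–Mandel experiment, two perfectly indistinguishable photons entering a balanced beam splitter never exit in different output ports, while partial distinguishability restores such coincidence events. A minimal sketch of this textbook relation, with the overlap value in the example chosen purely for illustration:

```python
# Toy model of photon distinguishability errors (illustrative only).
# In Hong-Ou-Mandel interference on a 50:50 beam splitter, two photons
# with mode overlap I produce a coincidence with probability (1 - I) / 2:
# fully indistinguishable photons (I = 1) never yield coincidences.

def hom_coincidence_probability(indistinguishability: float) -> float:
    """Coincidence probability at a balanced beam splitter."""
    if not 0.0 <= indistinguishability <= 1.0:
        raise ValueError("indistinguishability must be in [0, 1]")
    return (1.0 - indistinguishability) / 2.0

# Perfect photons interfere completely; imperfect ones leak errors.
print(hom_coincidence_probability(1.0))  # 0.0
print(hom_coincidence_probability(0.9))  # 0.05, a 5% error signal
```

The residual coincidence rate is exactly the kind of distinguishability error that photon distillation, described below, is designed to suppress.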

Photon distillation is a hardware-level error reduction method that improves the quality of single photons before computation begins. By exploiting quantum interference among several imperfect photons, it produces a single higher-purity photon, without the heavy qubit redundancy or classical post-processing that other approaches require.

Using a 20-mode photonic processor, the team demonstrated a photon distillation gate that makes photons more uniform, reducing photon distinguishability errors by a factor of 2.2. Despite the additional noise introduced by the gate itself, the device still delivers a 1.2× reduction in absolute error, demonstrating a net gain.
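The net-gain bookkeeping can be sketched as follows. The input error rate and gate-noise figure below are illustrative assumptions chosen only to show how a 2.2× distillation factor can coexist with a smaller ~1.2× net reduction; they are not measured values from the paper:

```python
# Illustrative arithmetic behind a net error reduction
# (all numeric inputs below are assumptions, not data from the paper).

DISTILLATION_FACTOR = 2.2  # quoted reduction in distinguishability error


def net_reduction(input_error: float, gate_noise: float) -> float:
    """Ratio of input error to total output error after distillation."""
    output_error = input_error / DISTILLATION_FACTOR + gate_noise
    return input_error / output_error


# Example: a 5% input distinguishability error plus ~1.9% added gate
# noise gives roughly a 1.2x net reduction, despite the 2.2x gate.
ratio = net_reduction(input_error=0.05, gate_noise=0.019)
print(f"net reduction: {ratio:.2f}x")  # net reduction: 1.20x
```

The point of the sketch is that the gate's own noise subtracts from the raw distillation factor, which is why the below-threshold condition is about the net figure, not the raw one.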

The research also shows that combining photon distillation with quantum error correction can significantly reduce a system’s hardware requirements. In models of state-of-the-art photon sources and photonic architectures, the method can cut the number of photon sources needed per logical qubit by a factor of four, reducing system complexity and cost.

“For any error reduction scheme to be worthwhile, it has to remove more errors than it adds while the computer is running, and that’s what we’ve shown here,” said Jelmar Renema, Chief Scientist at QuiX. “Our photon distillation gate is characterized with real experimental numbers and provides a net reduction in errors once all gate noise is included. That’s why this is a major breakthrough for photonics and for quantum computing in general.”

This project is partially funded by the Netherlands Ministry of Defense’s Purple NECtar Quantum Challenges project.

