AI Decoder Can Cut Quantum Computing Errors by Up To 17 Times

A neural network-based decoder created by Harvard University researchers could drastically shorten the timeline for practical quantum computing. Using artificial intelligence, the team discovered a “waterfall” effect that sharply reduces error rates, raising the possibility that the enormous qubit counts previously believed necessary for quantum “supremacy” have been overstated.

Quantum computers are built on qubits, which are extremely powerful but infamously fragile. Qubits are highly susceptible to environmental noise and interference, which causes computation errors. To cope, quantum systems employ “error correction” to identify and fix errors in real time.
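The idea behind error correction can be sketched with the simplest classical analogue: a three-bit repetition code decoded by majority vote. This is an illustrative toy, not the scheme used in the paper, and the error rate chosen here is an assumption for the demo.

```python
import random

def encode(bit):
    # Encode one logical bit as three physical copies (repetition code).
    return [bit, bit, bit]

def apply_noise(bits, p):
    # Flip each physical bit independently with probability p.
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    # Majority vote: the decoder infers the most likely original bit.
    return int(sum(bits) >= 2)

random.seed(0)
p = 0.05          # physical error rate (illustrative assumption)
trials = 100_000
failures = sum(decode(apply_noise(encode(0), p)) != 0 for _ in range(trials))
# The logical error rate (~3p^2) lands well below the physical rate p.
print(f"logical error rate ≈ {failures / trials:.4f}")
```

Real quantum codes must correct errors without directly measuring the data qubits, but the decoder’s job is the same: infer the most likely error from noisy evidence.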

Cascade, the latest AI system, addresses this problem directly. A convolutional neural network, Cascade lowered error rates by several thousand times in benchmark testing and processed data up to 100,000 times faster than conventional methods, according to a report published on the preprint server arXiv.
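The article does not detail Cascade’s architecture, but the basic ingredient of any convolutional decoder is a filter scanned across a 2D “syndrome” map, the grid of parity-check outcomes a quantum code produces each round. The hand-picked kernel and toy grid below are assumptions for illustration; a real network learns many such filters.

```python
import numpy as np

def conv2d(grid, kernel):
    # Valid-mode 2D cross-correlation: the core operation of a CNN layer.
    kh, kw = kernel.shape
    gh, gw = grid.shape
    out = np.zeros((gh - kh + 1, gw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(grid[i:i + kh, j:j + kw] * kernel)
    return out

syndrome = np.zeros((6, 6))
syndrome[2, 2] = syndrome[2, 3] = 1.0   # a pair of triggered parity checks

kernel = np.ones((3, 3)) / 9.0          # averaging filter highlights clusters
feature_map = conv2d(syndrome, kernel)
peak = np.unravel_index(np.argmax(feature_map), feature_map.shape)
print("strongest response at window:", peak)
```

Stacking many learned filters lets a network map raw syndrome patterns to likely physical errors far faster than search-based decoders.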

Possibly the most striking finding is what the scientists call the “waterfall effect.” Conventional models predicted that error rates would improve gradually as systems scaled. Instead, the Harvard researchers found that once error rates fall below a specific threshold, they begin to drop far more sharply than anticipated.
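A sharp drop below a threshold is consistent with the standard sub-threshold scaling law from quantum error-correction theory, p_L ≈ A · (p / p_th)^((d + 1) / 2), where p is the physical error rate, p_th the threshold, and d the code distance. The constants below are illustrative assumptions, not figures from the paper:

```python
def logical_error_rate(p, p_th=0.01, d=11, A=0.1):
    # Textbook sub-threshold scaling: below p_th, the logical error rate
    # is suppressed exponentially in the code distance d.
    return A * (p / p_th) ** ((d + 1) / 2)

# Halving p below threshold cuts p_L by orders of magnitude, not by half.
for p in [0.02, 0.01, 0.005, 0.002]:
    print(f"p = {p:.3f} -> p_L ~ {logical_error_rate(p):.2e}")
```

This steepness is why crossing the threshold matters so much: small hardware improvements below it buy enormous reductions in logical errors.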

Cascade’s single-shot latency, the time it takes to complete one round of correction, is measured in millionths of a second, according to the researchers. That speed is already compatible with several cutting-edge quantum platforms, including trapped-ion and neutral-atom systems. Despite the excitement, the team identified several trade-offs.

AI-based decoders depend heavily on the quality of their training data and do not yet carry the same theoretical guarantees as traditional algorithms. In addition, smaller AI models did not perform well enough, highlighting that substantial processing power is essential for high-performance decoding.

Nonetheless, the discovery implies that quantum computers might not require as many qubits as previously believed to achieve practical performance.
