
IBM unveils the new 133-qubit Heron quantum processor: why it’s important and what it’s for

by admin

At the Quantum Summit in New York, IBM unveiled the new Heron quantum processor and the System Two quantum computer, the successor to 2019's System One.

The processor succeeds the Eagle model, moving from 127 to 133 qubits, and represents the first practical commercial application of IBM's latest advances in error and noise mitigation for quantum computing.

The announcement is particularly important because, according to IBM, Heron represents “a fundamental step forward” towards what the American company calls “the era of quantum utility”. To date, quantum computers are still at the promise stage: we know that, in theory, they could solve in a few seconds problems that would take a classical supercomputer years of computation. Concrete applications, however, are still in short supply, and practical problems such as errors, scaling up and the mitigation of so-called quantum noise, which degrades the quality of the results, remain the main obstacles to overcome.

Nonetheless, IBM believes that already today, with solutions like Heron and the System Two modular quantum supercomputer, these systems can offer an effective tool for research, particularly in the fields of energy physics and materials chemistry. By extension, this also means direct commercial application, not only in academic but also in industrial settings.


Reduced noise and fewer errors

In June the American company published a paper in Nature demonstrating that quantum computers can “produce precise results at a scale of 100+ qubits, outperforming the best classical approaches,” based on experiments conducted on the 127-qubit Eagle processor. The approach proposed and implemented by IBM researchers and their scientific partners involves learning and then reducing errors and noise, the term for everything that interferes with the qubits’ ability to maintain their quantum state, including stray atoms that disturb them, vibrations and temperature fluctuations. Such interference causes the qubits to lose coherence, that is, to lose the quantum properties on which their computing power rests.
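To give a concrete sense of what error mitigation means, here is a minimal, purely illustrative sketch of zero-noise extrapolation, one of the ideas used in work of this kind: the same computation is run while the noise is deliberately amplified, and the results are then extrapolated back to an estimate of the zero-noise value. The numbers below are invented, and the snippet is our own sketch, not taken from IBM's paper.

```python
# Illustrative sketch of zero-noise extrapolation (hypothetical data).
import numpy as np

# Expectation values measured while deliberately amplifying the noise by 1x, 2x, 3x.
noise_factors = np.array([1.0, 2.0, 3.0])
measured = np.array([0.81, 0.66, 0.54])   # made-up noisy results

# Fit a simple linear model and extrapolate back to the zero-noise limit (factor = 0).
coeffs = np.polyfit(noise_factors, measured, deg=1)
zero_noise_estimate = np.polyval(coeffs, 0.0)
print(f"Mitigated estimate: {zero_noise_estimate:.3f}")
```

The key point is that noise is not eliminated in hardware: it is characterized and then compensated for statistically, which is what distinguishes mitigation from full error correction.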


Qubits, to give a quick explanation, are the quantum equivalent of the bits of classical computing: unlike bits, which can only take the value 0 or 1, qubits can exist in a superposition of states, that is, they can simultaneously represent a combination of both values. This allows quantum computers to process an enormous amount of information in parallel, potentially increasing their speed and computing capacity for certain classes of problems that classical computing could not solve in a human timescale. Qubits are also characterized by a phenomenon called entanglement, which allows correlations between them that are impossible for the bits of classical computing.
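As an illustration, the short sketch below uses Qiskit, IBM's open-source quantum SDK (the snippet is our own example and is not part of the announcement), to build a two-qubit circuit in which a Hadamard gate creates a superposition and a CNOT gate entangles the two qubits into a Bell state.

```python
# Minimal sketch: superposition and entanglement with Qiskit (illustrative only).
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)        # Hadamard: qubit 0 is now in a superposition of 0 and 1
qc.cx(0, 1)    # CNOT: qubit 1 becomes entangled with qubit 0 (Bell state)

# Inspect the resulting state: amplitudes only on |00> and |11>,
# i.e. the two qubits are perfectly correlated.
state = Statevector.from_instruction(qc)
print(state)
```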

The IBM publication in Nature triggered a series of experiments by major scientific institutions such as Argonne National Laboratory and the universities of Tokyo, Washington and Cologne, among many others. Tests have also already been successfully conducted on the new Heron processor, which IBM has already made available to research departments via the cloud. Three of these processors have been integrated into the new modular quantum supercomputer, System Two, which is physically located at IBM's research center in Yorktown Heights, New York. More processors will be added over the next year.

The near future

With Quantum System Two, IBM is also moving closer to its goal of marketing quantum computing solutions on a larger scale: building on this year's innovations and discoveries and on the presentation of the new modular supercomputer, the company has revised and updated its roadmap for the development of increasingly powerful quantum processors and systems.

The plan is to create modular systems of increasing complexity, equipped with an ever-growing number of qubits and based on error mitigation, through 2027, with the development of the next-generation Flamingo processor. From 2029 (again according to the forecasts), it should be possible to develop even more powerful and precise systems thanks to 200-qubit processors based on error correction rather than mitigation.
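The difference between mitigation and correction can be sketched with a classical analogy: error correction encodes information redundantly and repairs faults on the fly, rather than statistically compensating for them after the fact. The toy example below uses a 3-bit repetition code with majority voting; real quantum error-correcting codes are far more complex, and the snippet is only an illustration.

```python
# Classical analogy for error correction: 3-bit repetition code with majority vote.
import random

def encode(bit):
    return [bit, bit, bit]          # encode one logical bit into three physical copies

def apply_noise(bits, p=0.1):
    return [b ^ (random.random() < p) for b in bits]   # each copy flips with probability p

def decode(bits):
    return int(sum(bits) >= 2)      # majority vote recovers the logical bit

logical = 1
noisy = apply_noise(encode(logical))
print(decode(noisy))                # correct unless two or more copies flipped
```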


The state of the art

IBM is among the most active companies in both investment and research in the field of quantum computing. Among its main competitors is Google, which a few months ago published a study similar to IBM's, demonstrating a comparable result. Using Google's quantum supercomputer (based on second-generation Sycamore processors) together with noise- and error-reduction techniques, the researchers showed the near-instantaneous solution of a problem that a classical supercomputer would (theoretically) take decades to solve. The underlying issue, however, is that the problem, beyond demonstrating the system's capability, had no practical application. The IBM study published in Nature suffered from a similar problem, and some researchers have even challenged the quantum advantage it claimed with new studies (here and here) in which the same problems are solved faster with traditional supercomputers.

The IBM and Google studies are an interesting litmus test of the state of the art of quantum computing: the promise is there, and so is the potential. But at present it is still difficult to demonstrate that quantum computing can offer an advantage over classical supercomputers in the short term. From a theoretical point of view, no one doubts the magnificent and progressive future of quantum computing, but the field is still divided between optimists and pessimists, as often happens with early technologies. On one side are those who say the theory will never see practical application; on the other, those (such as Google and IBM) who are trying to make the promise of quantum utility come true. The next ten years will be a decisive period for understanding who is right.
