How is a quantum computer built?


What does it take to build a quantum computer? A variety of ingredients, all very delicate to handle, explains Aymeric Delteil, a researcher in experimental quantum optics, in The Conversation.

The 2022 Nobel Prize in Physics has just been awarded to three pioneers in quantum information, including a Frenchman, Alain Aspect. Their work laid the foundations of the “second quantum revolution”, which makes it possible to dream of the realization of a quantum computer.

The quantum computer is in fact making ever more frequent appearances in the general press, and many readers might conclude from this that humanity already possesses ultra-powerful machines capable of outperforming our good old computers.

Indeed, building on the fundamental principles of quantum physics, researchers and industry are combining their efforts to create the ultimate, so-called “universal” quantum computer. And while it is true that recent achievements in the field are as impressive as they are promising, with the largest quantum computer announced to date containing 127 “quantum bits”, it must be understood that the latter was designed not to perform useful calculations but as a “proof of concept”.

Thus, at present, a functioning universal quantum computer remains an inaccessible Holy Grail, the success of which no one can yet predict with certainty.

On the other hand, we are already on the verge of having smaller quantum machines, called “quantum simulators”, which will be useful for solving specific problems in physics, engineering, chemistry or even pharmaceuticals.

What would a universal quantum computer allow?

A “universal” quantum computer could in theory perform all the calculations that a classical computer can, but more efficiently. In reality, this will only be possible for certain calculations, which will implement algorithms specifically designed for quantum computing. The quantum computer would then make possible calculations that are currently infeasible because they would take too long.

For example, it would take thousands of billions of years on a state-of-the-art supercomputer to factor numbers of a few hundred digits, such as those used to secure our communications, whereas a modestly sized quantum computer would need only a few hours for the same task.
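To give a feel for the gulf involved, here is a minimal back-of-envelope sketch in Python. The scaling laws are the textbook ones, but the counts ignore constant factors, memory costs and hardware speeds, so the absolute numbers are only indicative: the best known classical factoring algorithm, the general number field sieve, has a cost that grows sub-exponentially with the size of the number, whereas Shor's quantum algorithm needs a number of quantum gates that grows only polynomially.

```python
import math

# Back-of-envelope operation counts for factoring a 2048-bit number
# (roughly 617 decimal digits). Purely indicative: constant factors,
# memory and hardware speeds are ignored.
bits = 2048
ln_n = bits * math.log(2)          # natural log of the number to factor

# General number field sieve: cost ~ exp(c * (ln N)^(1/3) * (ln ln N)^(2/3))
c = (64 / 9) ** (1 / 3)
classical_ops = math.exp(c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

# Shor's algorithm: roughly (log2 N)^3 quantum gates
quantum_gates = bits ** 3

print(f"classical operations (GNFS): ~{classical_ops:.1e}")
print(f"quantum gates (Shor):        ~{quantum_gates:.1e}")
print(f"ratio:                       ~{classical_ops / quantum_gates:.1e}")
```

Even on a supercomputer performing a billion billion operations per second, a count of around 10^35 classical operations translates into astronomical run times, while some 10^10 quantum gates at realistic gate rates correspond to hours.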

A state-of-the-art supercomputer would take far too long to factor numbers of a few hundred digits. // Source: Canva

Building a quantum computer, the big challenge

A quantum computer capable of running the quantum algorithms designed expressly for it still has to be built, however.

A quantum computer is a machine that incorporates “qubits” – the quantum equivalent of the “bits” that are the units of classical computation – and that allows them to be manipulated in order to carry out the operations required by the algorithm. These qubits must therefore obey the laws of quantum physics.
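As a minimal illustration of what makes a qubit different from a bit, the short NumPy sketch below (independent of any particular hardware) represents the state of one qubit as a pair of complex amplitudes and applies a Hadamard gate, which puts it into a superposition of 0 and 1 – something a classical bit cannot do.

```python
import numpy as np

# One qubit is described by two complex amplitudes (a, b) with |a|^2 + |b|^2 = 1:
# |psi> = a|0> + b|1>
ket0 = np.array([1, 0], dtype=complex)        # the qubit initialized to 0

# The Hadamard gate creates an equal superposition of 0 and 1.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0

# Measuring the qubit gives 0 or 1 with probabilities |a|^2 and |b|^2.
print(np.abs(psi) ** 2)   # [0.5 0.5] -> equal chance of reading 0 or 1
```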

The first candidates for the role of qubit are therefore individual quantum particles. Indeed, we now know how to control individual atoms with lasers, ions with electromagnetic fields, and electrons with nanometric electrical circuits.

We can also use particles of light, because we now know how to emit photons one by one – so-called “single” photons. For this, we use “semiconductor quantum dots”, controlled defects in crystals such as diamond, or so-called “nonlinear” crystals.

A final option is to use superconducting circuits as qubits: these are millimetre-sized electronic circuits, much larger than the quantum-particle-based realizations discussed above. They offer the advantage of being able to be integrated on a chip using techniques similar to those used to manufacture conventional computer chips. This is the path chosen by the computing giants Google and IBM.

What is the “second quantum revolution”?

The multitude of physical systems that could embody the quantum computer has led to abundant research in all these fields, which are progressing in parallel. This control of individual particles constitutes what is now called the “second quantum revolution” – the first being that of technologies based on ensembles of quantum particles: lasers, transistors and superconductors, which permanently transformed our world during the second half of the 20th century.

Despite all this progress, controlling these qubits remains much more difficult than controlling classical bits, because of the great fragility of quantum effects. Indeed, the environment of these constituents (heat, light, electric and magnetic fields, for example) always ends up disturbing their states of superposition and entanglement. This phenomenon, called “decoherence”, limits in practice the manipulation and storage of the information contained in the qubits.

Worse still, the more qubits the computer has, the faster this disturbance occurs. For these reasons, current prototypes are operated in extreme environments: very low temperature (a fraction of a degree above absolute zero), ultra-high vacuum, total darkness. These very restrictive conditions, achievable only in the laboratory, make it possible to preserve the quantum effects for a certain time, but not long enough to be truly operational.
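A deliberately crude toy model gives the flavour of this scaling: if each qubit independently has some probability of surviving a computation step without decohering, the probability that all of them survive falls exponentially with the number of qubits. The coherence time and step duration below are illustrative assumptions, not figures for any specific platform.

```python
import math

T2 = 100e-6   # assumed single-qubit coherence time: 100 microseconds
t = 10e-6     # assumed duration of one computation step: 10 microseconds

# If one qubit stays coherent for a time t with probability exp(-t/T2),
# n independent qubits all stay coherent with probability exp(-n*t/T2).
for n in (1, 10, 100, 1000):
    p_all = math.exp(-n * t / T2)
    print(f"{n:5d} qubits: P(all still coherent) ≈ {p_all:.1e}")
```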

A theoretical solution exists: to protect quantum information, each “logical” qubit must be distributed over a very large number of “physical” qubits in order to be able to perform error correction. In this approach, millions of physical qubits would be needed for a reliable universal quantum computer: oversized machines would have to be built to compensate for the imperfections of their components.
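The arithmetic behind these figures is simple, even though the exact overhead depends on the error-correcting code and on the error rate of the physical qubits; the numbers below are order-of-magnitude assumptions of the kind commonly quoted for surface codes, not a prediction.

```python
# Assumed overhead: physical qubits needed to encode one error-protected
# "logical" qubit (order of magnitude often quoted for surface codes).
physical_per_logical = 1_000

# Assumed number of logical qubits for a useful, factoring-scale algorithm.
logical_qubits_needed = 4_000

total_physical = physical_per_logical * logical_qubits_needed
print(f"{total_physical:,} physical qubits")   # 4,000,000 -> "millions of qubits"
```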

However, at present, no one is able to make such a computer, and it is impossible to predict if it will exist one day.

“Quantum simulators”, specialized computers

Yet, in the quest for the universal quantum computer, new ideas have emerged.

It seems very likely that the first quantum computers will be very different from the all-powerful machines imagined at the outset, and will instead be designed to perform one very specific task, for which they will undoubtedly be the most efficient means – if not the only one.

The potential applications are numerous: optimizing distribution networks, understanding the mechanisms of photosynthesis, designing catalysts for fertilizers and drugs, or improving batteries and solar cells, for example.

To illustrate this principle with a concrete example, an Australian team this year produced a small processor based on around ten silicon qubits, made by trapping electrons between nanometre-scale electrodes. This quantum circuit was designed to simulate polyacetylene, a molecule of great interest for fundamental physics because it embodies a complex theoretical physics problem, and which was the subject of the Nobel Prize in Chemistry in 2000.

In this virtuoso achievement, the size of the simulated molecule (10 atoms) is at the limit of what can reasonably be calculated with a conventional computer – which made it possible to check the quantum processor's predictions.

A quantum processor twice as big, that is to say with 20 qubits, would already far surpass what our classical computing capacities can simulate for this type of remarkable molecule.

Quantum processor // Source: SQC

Increasing the number of qubits

To achieve such a “quantum advantage”, the number of qubits in the computers must be increased.

The last ten to twenty years have seen the realization of many prototypes of quantum protocols – often spectacular – using two or a handful of qubits. The most advanced systems can currently handle a few dozen qubits.

In addition to the problems of “coherence” already mentioned, the difficulty also lies in the reproducibility of physical systems.

Let us take the example of photons this time. Quantum algorithms require that these light particles all be “indistinguishable”, that is to say that they have the same characteristics, and in particular exactly the same wavelength (the same colour). The best sources for this are “quantum dots”, because the photons emitted consecutively by the best quantum dots are all identical. But the photons emitted by distinct quantum dots are generally very different from one another. To perform a quantum calculation, we are therefore forced to use one and the same quantum dot, and the photons it has emitted one after another. This constraint complicates the architecture of a future computer and limits the total number of qubits that can be used simultaneously.
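Why identical wavelengths matter can be illustrated with the Hong-Ou-Mandel effect, a standard two-photon interference experiment used to benchmark photon sources: when two perfectly indistinguishable photons meet on a 50/50 beam splitter, they always leave through the same output port, so “coincidence” detections on the two outputs vanish; the more the photons differ, the more coincidences reappear. The toy formula below assumes an ideal beam splitter and ideal detectors.

```python
# Toy model of the Hong-Ou-Mandel effect at an ideal 50/50 beam splitter.
# 'overlap' quantifies how indistinguishable the two photons are:
# 1.0 = identical (same wavelength, shape, polarization), 0.0 = fully distinct.
def coincidence_probability(overlap: float) -> float:
    return 0.5 * (1.0 - overlap)

for overlap in (1.0, 0.9, 0.5, 0.0):
    print(f"indistinguishability {overlap:.1f}: "
          f"P(coincidence) = {coincidence_probability(overlap):.2f}")
```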

Many laboratories are currently working to obtain controlled photon sources that all emit at the same wavelength, which would make it possible to produce large numbers of identical photons simultaneously and to maintain a high clock frequency.

Whatever the physical platform chosen to create qubits, it will be necessary to manipulate and measure them in large numbers, yet independently of one another. Interfacing these quantum systems with classical control electronics also requires miniaturized, often nanoscale, systems.

Thus, scaling up quantum computing promises to keep engineers in electronics, optics and computer science busy for some time… Even if it is impossible to predict whether we will one day have a true “universal” quantum computer, quantum computing will undoubtedly be part of our future and will shape the sciences over the long term.


Aymeric Delteil, CNRS Researcher, Condensed Matter Study Group, University of Versailles Saint-Quentin-en-Yvelines (UVSQ) – Paris-Saclay University

This article is republished from The Conversation under a Creative Commons license. Read the original article.


