How AI and quantum computing could shake up our world


Camille Coirault

January 7, 2024 at 9:48 a.m.



The alliance between AI and quantum computing: towards an unprecedented advance in data processing? © Gorodenkoff / Shutterstock

Imagining the merger of AI-driven machine learning and quantum computing means picturing two of the most advanced scientific fields joining forces: a promise to solve once-intractable problems, but also to transform our understanding of technology.

Although quantum computing has made promising progress in recent years, we have not truly entered the quantum era. Admittedly, OVH already has its own quantum computer, and Google is hard at work on its own machines. We are even aware of the cybersecurity risks that these machines, endowed with phenomenal computing power, could pose. Still, there is a (very) long way to go before quantum computing becomes part of our daily lives.

Since the emergence of AI systems, a new avenue of research has opened up: the exploration of quantum machine learning. Private players like IBM and Google are already fully invested in it, as are start-ups like IonQ and Rigetti. Nor is the academic community lagging behind: CERN (the European Organization for Nuclear Research) is carrying out its own research in the field. So what are the ambitions of this technological convergence?

Quantum machine learning: a game changer?

Why consider introducing quantum computing in this specific area? Sofia Vallecorsa, a researcher at CERN, explains the thinking: “Our idea is to use quantum computers to accelerate or improve classical machine learning models.” However, nothing is settled yet. Researchers have yet to establish fundamental proof that quantum machine learning holds an advantage over its classical counterpart.

We already know that in certain specialized tasks, quantum computers are extremely efficient compared to classical computers. Here is a small, non-exhaustive list:

  • Search in large data sets: Grover’s algorithm (explained in the Microsoft source at the bottom of the article) can, for example, retrieve information very quickly from gigantic unstructured databases.
  • Quantum cryptography: using quantum principles to guarantee the security of communications.
  • Factoring large numbers: a quantum computer running Shor’s algorithm can factor much faster than classical methods allow.
  • Simulation of quantum systems: similarly, a quantum computer can simulate quantum interactions far more efficiently than a classical one.

In theory, therefore, quantum computers have immense potential to accelerate the process of machine learning. But this has not yet been clearly and convincingly demonstrated.
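To make the search speedup concrete, here is a minimal sketch of Grover’s algorithm as a plain NumPy statevector simulation. This is a toy illustration (an 8-item database with one marked entry, all names chosen for this example); real workloads would run on quantum hardware or a quantum SDK, not a classical simulation.

```python
import numpy as np

# Toy Grover search over N = 8 items (3 qubits), with the "answer"
# hidden at index 5. A classical search needs O(N) lookups on average;
# Grover's algorithm needs only O(sqrt(N)) iterations.
N = 8
marked = 5

# Start in the uniform superposition over all N basis states.
state = np.full(N, 1 / np.sqrt(N))

# One Grover iteration = oracle (flip the sign of the marked item)
# followed by inversion about the mean (the "diffusion" step).
iterations = int(round(np.pi / 4 * np.sqrt(N)))  # ~O(sqrt(N)) steps
for _ in range(iterations):
    state[marked] *= -1                  # oracle
    state = 2 * state.mean() - state     # diffusion

# The marked item now dominates the probability distribution.
probs = state ** 2
print(probs.argmax(), round(probs[marked], 3))  # index 5, prob ~0.945
```

After just two iterations the marked entry is found with roughly 94% probability, versus an average of four classical lookups; the gap widens quadratically as the database grows.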


Photo of one of Google’s quantum computers © Google Blog

Current realities and limits

The road to practical application is still long and strewn with obstacles. Take the example of Ewin Tang’s algorithm. As a student, Tang demonstrated that a quantum-inspired classical algorithm could compete with a quantum machine learning algorithm in terms of speed. It is a perfect illustration of how difficult it is to achieve practical, concrete, and consistent advances in the quantum domain. To learn more about this algorithm, the link to The Research in the sources covers the subject in more depth.

Scott Aaronson, a quantum computing researcher at the University of Texas who supervised the young Tang, comments on this specific point: “[Tang’s] findings have made the goal of exponential quantum speedup for practical machine learning problems even more difficult to achieve than before.” While the operations performed by a quantum computer are certainly much faster, the initialization and readout steps in quantum applications are often quite slow.

Ultimately, the time saved in computation would not be significant enough to tip the balance. Added to this is the probabilistic nature of quantum physics (results are probabilities, not absolute certainties), which generally requires each operation to be repeated many times. This is not an aspect to be overlooked, and it could contribute to the overall inefficiency of quantum computation applied to machine learning.
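This repetition overhead is easy to picture with a classical simulation. In the hypothetical example below, a single-qubit measurement yields outcome 0 with probability 0.8 (a value chosen for illustration): each run returns only one random outcome, so the underlying probability must be estimated from many repeated “shots.”

```python
import numpy as np

# Each quantum measurement collapses to a single random outcome, so
# probabilities are estimated by repeating the computation many times.
# Hypothetical state with P(outcome 0) = 0.8, simulated classically.
rng = np.random.default_rng(0)
p0 = 0.8

for shots in (10, 100, 10_000):
    outcomes = rng.random(shots) < p0   # one random result per run
    estimate = outcomes.mean()          # estimated P(0) from the shots
    print(shots, round(float(estimate), 3))

# The estimate only converges to 0.8 as the shot count grows, and this
# repetition cost must be weighed against any quantum speedup.
```

With 10 shots the estimate can be far off; only thousands of repetitions pin the probability down, which is part of why raw gate speed alone does not guarantee an end-to-end advantage.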

Towards new horizons

Complex does not mean impossible. At Google, Hsin-Yuan Huang’s experiment with the Sycamore computer remains a very concrete example of this potential. Huang and his team used the machine to simulate the behavior of complex materials and then analyzed the resulting data using quantum machine learning. The attempt was apparently a success: they concluded that the method worked, and that the information was processed much more quickly.

It opens a real Pandora’s box for certain fields, such as astronomy and particle physics. Analyses carried out through the lens of quantum physics could open our eyes to completely unknown aspects of our universe.

Even more intriguing: this technology could allow scientists to determine certain material properties or states that traditional methods can only suggest indirectly. For example, it could precisely identify whether a material is superconducting, that is, able to conduct electricity without resistance. In classical laboratories, this property is only deduced indirectly. Particular properties like these could be probed more directly through quantum machine learning.

We have arrived at a rather exciting stage of quantum research. As Aram Harrow of MIT puts it: “Whether or not quantum computers offer advantages for machine learning will be decided by experimentation rather than mathematical proofs of their superiority.” Maria Schuld, a physicist working in South Africa for the quantum computing company Xanadu, argues that research should continue without focusing exclusively on raw computational speed. Other benefits could emerge as our knowledge of machine learning grows: innovative methodologies or novel applications, for example.

Sources: Nature, Microsoft, The Research, Data Analytics Post


