Machine Learning – The technology behind ChatGPT can do more than language


In 2017, researchers at Google developed the Transformer, an architecture that later helped large language models such as ChatGPT achieve their breakthrough. It soon became clear that Transformers can do more than just chat and translate.

The chemist and computer scientist Alain Vaucher from the IBM Lab in Rüschlikon draws the formula of a new chemical compound on a laptop. At the push of a button, one AI calculates the necessary ingredients, and another AI-based program works out the recipe.

Finally, a robot executes the AI-generated instructions in a lab and produces the new chemical compound. RoboRXN is the name of this system, which is being developed at the IBM Lab.

Transformers determine the origin of gems

Sapphires are among the most valuable substances found in nature. Several million dollars are regularly paid at auction for special specimens – per gram.

The condition and origin of a stone play a central role in its valuation, explains Daniel Nyfeler, Managing Director of the Gübelin Gem Lab: “A great sapphire from Kashmir costs maybe ten million dollars. A stone of the same quality from Madagascar fetches maybe a million.”

Gübelin specialists analyze gemstones in laboratories in Lucerne, Hong Kong and New York and then produce a report for the seller. It is crucial that all three laboratories arrive at the same result in their analyses.

To ensure that an analysis always produces the same result, regardless of the expert or laboratory, Gübelin commissioned the Centre suisse d’électronique et de microtechnique (CSEM) in Alpnach to develop an AI solution. The software makes reliable statements about the origin and condition of a gemstone.

In both projects, a Transformer works in the background, that is, AI software originally developed for language processing. Instead of stubbornly working through words one after another and only considering the previous and the next word, a Transformer can place each term in a larger context.

This broader attention span helped language models achieve their breakthrough. However, it soon became apparent that the trick succeeds not only in language applications but also in completely different areas, such as chemistry.
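
Very roughly, that attention step can be sketched in a few lines of code. The snippet below is a minimal, illustrative NumPy version of scaled dot-product attention; it is not taken from either project, and the toy sequence length, vector size and random data are assumptions made purely for demonstration.

```python
# Minimal sketch (illustrative only) of the attention step that lets a Transformer
# relate every token in a sequence to every other token, instead of only looking
# at its immediate neighbours.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q, K, V: arrays of shape (sequence_length, dimension)."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                     # similarity of every token with every other token
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax: attention weights per token
    return weights @ V                                # each output is a context-weighted mix of all tokens

# Toy example: a "sequence" of 4 tokens, each represented by an 8-dimensional vector.
rng = np.random.default_rng(0)
tokens = rng.normal(size=(4, 8))
contextualised = scaled_dot_product_attention(tokens, tokens, tokens)
print(contextualised.shape)  # (4, 8): every token now carries information about the whole sequence
```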

The task that RoboRXN solves is comparable to a translation from one language into another, says Alain Vaucher. Instead of translating English into German, the software converts the character string describing a chemical compound into a different sequence.
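
To make the translation analogy concrete, the sketch below shows how a reaction could be written as a source and a target character sequence, the way a sentence pair is prepared for machine translation. The SMILES strings (ethanol plus acetic acid giving ethyl acetate) and the single-character tokenizer are illustrative assumptions, not RoboRXN's actual pipeline.

```python
# Illustrative sketch only: predicting a reaction product framed as "translating"
# one character sequence into another.

def tokenize(smiles: str) -> list[str]:
    """Split a SMILES string into single-character tokens (a real system would use
    a chemistry-aware tokenizer that keeps multi-character symbols together)."""
    return list(smiles)

# "Source sentence": the reactants, here ethanol and acetic acid.
source = tokenize("CCO.CC(=O)O")
# "Target sentence": the product a trained model should generate, here ethyl acetate.
target = tokenize("CC(=O)OCC")

print(source)  # ['C', 'C', 'O', '.', 'C', 'C', '(', '=', 'O', ')', 'O']
print(target)  # ['C', 'C', '(', '=', 'O', ')', 'O', 'C', 'C']
# A Transformer trained on many such pairs learns the mapping from source to target,
# just as an English-to-German model learns from sentence pairs.
```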

Providing insight into thinking

In the Gem Lab, too, Transformers play an important role in evaluating the trace elements found in a gemstone. This is possible because Transformers are so flexible and can handle a wide variety of data, not just language, says Tommaso Bendinelli from CSEM. Transformers are also used in the analysis of image content, for example in the interpretation of X-ray images.
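
As a purely hypothetical illustration of that flexibility, the sketch below turns a set of trace-element measurements into a short sequence of vectors that a Transformer encoder could attend over. The elements, concentrations and encoding are invented for demonstration and do not describe how the Gübelin Gem Lab or CSEM actually represent their data.

```python
# Hypothetical example: non-language data rewritten as a sequence of "tokens".
import numpy as np

measurements = {"Fe": 850.0, "Ti": 120.0, "Ga": 15.0, "Mg": 40.0}  # ppm, made-up values
elements = sorted(measurements)  # ['Fe', 'Ga', 'Mg', 'Ti']

# One token per element: a one-hot code for the element identity plus its concentration.
tokens = np.zeros((len(elements), len(elements) + 1))
for i, el in enumerate(elements):
    tokens[i, i] = 1.0               # which element this token represents
    tokens[i, -1] = measurements[el]  # the measured concentration

print(tokens.shape)  # (4, 5): a short "sentence" of 4 tokens an encoder could process
```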

If a mechanism is as suitable for language processing as it is for the analysis of chemical compounds, then interesting questions arise: Is human language more than just a means of communication? Does language play a bigger role in our thinking than previously assumed?

In the search for answers, cognitive research is closely following developments in the field of artificial intelligence in the hope that the machines created by humans will in turn provide insights into the thinking of their creators.
