ChatGPT: Google's DeepMind counterattacks with its AI Sparrow


Google is about to enter the chatbot game. DeepMind, a subsidiary of Alphabet (Google's parent company), will soon launch its own chatbot, its CEO Demis Hassabis announced in an interview with the American magazine Time.

A “private beta” in 2023

Like the harnessing of electricity in the past, the arrival of artificial intelligence will transform our daily lives, according to Demis Hassabis. To carry out this revolution responsibly, the head of DeepMind intends to take his time, so as to avoid releasing an AI that could be dangerous for humanity.

Driven by competitive pressure, and in particular by the ChatGPT phenomenon, the Google subsidiary could be forced to accelerate its schedule. DeepMind now plans to release a first "private beta" version of its in-house chatbot, named Sparrow, sometime in 2023. The company's engineers are reportedly still working to perfect the chatbot through reinforcement learning.

A chatbot focused on ethics?

Unlike ChatGPT, which can generate false information without citing its sources, Sparrow is designed to provide sourced answers. DeepMind's AI "reduces the risk of dangerous and inappropriate responses," the company said in a press release in September 2022. The chatbot would be able to converse naturally with a user, answer their questions, and even search the web using Google.

More than a classic conversational agent, Sparrow was reportedly developed with the explicit aim of generating constructive and factual answers. To ensure that it produces no threatening, hateful, or insulting statements, DeepMind's engineers are said to have drawn up a set of strict rules that must be respected when generating a response. "These rules are the result of studying existing work on language harms and consulting experts," the company explains.

Last September, the first results were encouraging from an ethical standpoint, but there was still room for improvement. To develop more precise and comprehensive rules, "input from experts on many topics (including policy makers, social scientists and ethicists)" is necessary, the developers noted. This phase would then be followed by "the participation of a wide range of users and affected groups." That last step could thus coincide with the launch of the "private beta" this year, mentioned by Demis Hassabis.
