Gaming News: Artificial intelligence in video games has just taken a leap forward with this video at CES 2024! What does the future hold for gamers?





Nvidia’s artificial intelligence project, named ACE, was shown in more detail through a technical demonstration at CES 2024.

CES 2024 is an opportunity for everyone to come and show off their technological prowess, and artificial intelligence is no exception. During the show, the manufacturer Nvidia ran a technical demo of ACE (Avatar Cloud Engine), its AI designed to make NPCs in video games more realistic. With this tool, players can chat more or less naturally, via a microphone, with characters that are not controlled by other players. It is hardly a surprise, given how popular artificial intelligence has become with the general public.

The Nvidia tool allows developers to create more realistic NPCs. During the demonstration, the audience was able to speak with Nova and a ramen restaurant chef. After a question is asked, it takes a few seconds to receive an answer. Each character can be configured to speak several languages, such as English or Spanish, and it is the developers who supply the data, generally quite up to date, that feeds into the dialogue. NPCs can also express different moods, which come through in their facial expressions and the way they speak. Some slight bugs do occasionally appear. It is a breakthrough that heralds a major addition to video games in the future.

A worrying development?

During this JV Fast, the editorial team wondered how this technology would be used. Nvidia has already announced partnerships with several major studios, including Ubisoft, miHoYo (the studio behind Genshin Impact and Honkai: Star Rail), NetEase Games, Inworld and the Chinese giant Tencent. ACE seems full of promise, but for now it looks difficult for it to replace the expertise of real professionals.

Even though studios have had access to it for a few months now, getting used to this new technology seems to be a real headache. The goal is to combine human expertise with AI and to train employees so that they can use it more easily and naturally. Perhaps it is also up to programmers to set limits on their own technologies, knowing that Nvidia’s artificial intelligence can itself generate animation and lip synchronization based on the sentence spoken. If you are wondering what limits should be imposed on AI so that it does not take precedence over human expertise, Panthaa and Anagund offer the beginning of an answer in this JV Fast.


