NVIDIA details its AI-based ACE technology for next-generation NPCs

Artificial intelligence is now everywhere, for better and for worse. NVIDIA has obviously not missed out on AI, combining its advances with the company's own technologies: notably the Avatar Cloud Engine (ACE), designed to make NPCs more realistic than ever and already used internally by Ubisoft for its NEO NPCs.

Today, NVIDIA is taking advantage of the lull after the Game Developers Conference and its GTC 2024 to take stock of its use of artificial intelligence. A complicated subject, but a fascinating and essential one.

Digital characters level up

Non-playable characters often play a crucial role in the narrative of video games, but because they are usually designed for a specific purpose, they can become repetitive and boring, especially in large worlds where there are thousands of them.

Thanks in part to incredible advances in visual computing, like ray tracing and DLSS, video games are more immersive and realistic than ever, which makes scripted encounters with NPCs stand out as particularly dry and jarring by contrast.

At the start of the year, NVIDIA released production microservices for the Avatar Cloud Engine, giving game developers and digital creators an ace up their sleeve when it comes to creating realistic NPCs. ACE microservices let developers integrate cutting-edge generative AI models into the digital avatars of games and applications, so NPCs can dynamically interact and converse with players in-game and in real time.

Leading game developers, studios, and startups are already integrating ACE into their titles, bringing new levels of personality and engagement to NPCs and digital humans.

Bring your avatars to life with NVIDIA ACE

The process of creating NPCs begins by providing them with a story and purpose, which helps guide the narrative and ensure relevant contextual dialogue. Then, the ACE subcomponents work together to develop the avatar’s interactivity and improve its responsiveness.

NPCs use up to four AI models to hear, process, generate dialogue, and respond.

The player’s voice is first sent to NVIDIA Riva, a technology that builds fully customizable real-time conversational AI pipelines and transforms chatbots into expressive assistants through GPU-accelerated multilingual speech and translation microservices.

With ACE, Riva’s Automatic Speech Recognition (ASR) feature processes what was said and uses AI to provide a highly accurate transcription in real time. A speech-to-text demonstration in a dozen languages, produced with Riva, is available.

The transcription is then fed into an LLM – such as Google’s Gemma, Meta’s Llama 2 or Mistral – which, with Riva’s neural machine translation, generates a natural language text response. Then, Riva’s text-to-speech feature turns that response into audio.

Finally, NVIDIA Audio2Face (A2F) generates facial expressions that can be synchronized with dialogue in many languages. Thanks to this microservice, digital avatars can display dynamic and realistic emotions, broadcast live or integrated during post-processing.

The AI network automatically animates facial, eye, mouth, tongue and head movements based on the selected emotion range and intensity level. And A2F can automatically infer emotion directly from an audio clip.

Each step takes place in real time to ensure the fluidity of the dialogue between the player and the character. The tools are customizable, allowing developers to create the types of characters they need for immersive story or world-building.
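The four-stage loop described above – hear, process, generate dialogue, respond – can be sketched as a simple pipeline. Everything below is an illustrative placeholder: the function names, the canned reply and the returned data are assumptions for the sketch, not NVIDIA's actual Riva, LLM or Audio2Face APIs, which each run as separate networked microservices.

```python
# Hypothetical sketch of the four-stage NPC dialogue loop described above.
# All names and outputs are placeholders, not NVIDIA's real service APIs.

def transcribe(audio: bytes) -> str:
    """Stage 1 - ASR (Riva's role in ACE): turn player speech into text."""
    # Placeholder: a real pipeline would stream audio to a speech model.
    return "Where can I find the blacksmith?"

def generate_reply(transcript: str, persona: str) -> str:
    """Stage 2 - LLM: produce a reply guided by the NPC's backstory."""
    # Placeholder: a real pipeline would prompt an LLM such as Llama 2.
    return f"({persona}) The blacksmith is by the east gate, traveler."

def synthesize_speech(text: str) -> bytes:
    """Stage 3 - TTS (Riva's role in ACE): turn the reply into audio."""
    return text.encode("utf-8")  # stand-in for generated waveform data

def animate_face(audio: bytes, emotion: str = "neutral") -> dict:
    """Stage 4 - facial animation (Audio2Face's role): lip sync + expression."""
    return {"emotion": emotion, "frames": len(audio)}  # stand-in blendshapes

def npc_turn(player_audio: bytes, persona: str) -> dict:
    """One full real-time turn: hear -> think -> speak -> animate."""
    transcript = transcribe(player_audio)
    reply = generate_reply(transcript, persona)
    audio = synthesize_speech(reply)
    animation = animate_face(audio)
    return {"reply": reply, "audio": audio, "animation": animation}

result = npc_turn(b"...player speech...", persona="gruff innkeeper")
print(result["reply"])
```

In a shipping game each stage would be a streaming call to its own service, with the stages overlapped to keep latency low enough for real-time conversation.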

Born to Roll

At GDC and GTC, developers and platform partners presented demos leveraging NVIDIA ACE microservices, from interactive in-game NPCs to powerful digital nurses.

Ubisoft explores new types of interactive games with dynamic NPCs. NEO NPCs, from its latest research and development project, are designed to interact in real time with players, their environment and other characters, opening up new possibilities for dynamic and emergent storytelling.

The abilities of these NEO NPCs were showcased through demonstrations, each focused on different aspects of NPC behavior, including awareness of environment and context, real-time reactions and animations, conversation memory, collaboration and strategic decision-making. These demonstrations highlighted the technology’s potential to push the boundaries of game design and immersion.

Using Inworld’s AI technology, the Ubisoft narrative team created two NEO NPCs, Bloom and Iron, each with their own story, knowledge base, and unique conversation style. Inworld’s technology also provided NEO NPCs with intrinsic knowledge of their environment, as well as interactive responses powered by Inworld’s LLM. NVIDIA A2F provided facial animations and lip sync for both NPCs in real time.

Inworld and NVIDIA got GDC buzzing with a new tech demo called Covert Protocol, which showcased NVIDIA ACE technologies and the Inworld Engine. In this demo, players controlled a private detective who completed objectives based on the results of conversations with NPCs on site. Covert Protocol unlocks social simulation game mechanics with AI-enabled digital characters that convey crucial information, present challenges, and catalyze key narrative developments. This increased level of AI-driven interactivity and player action should open up new possibilities for emergent, player-specific gameplay.

Built on Unreal Engine 5, Covert Protocol uses Inworld Engine and NVIDIA ACE, including NVIDIA Riva ASR and A2F, to augment Inworld’s speech and animation pipelines.

In the latest version of the NVIDIA Kairos tech demo, made in collaboration with Convai and presented at CES, Riva ASR and A2F were used to significantly improve NPC interactivity. Convai’s new framework allowed NPCs to converse with each other and gave them item awareness, allowing them to pick up and deliver items to desired locations. Additionally, NPCs gained the ability to guide players to objectives and traverse worlds.

Digital characters in the real world

The technology used to create NPCs is also used to animate avatars and digital humans. Beyond gaming, task-specific generative AI extends to healthcare, customer service, and many other areas.

NVIDIA collaborated with Hippocratic AI at GTC to expand its healthcare worker solution, showing the potential of a healthcare worker avatar with generative AI. Further work is underway to develop an ultra-low latency inference platform to power real-time use cases.

“Our digital assistants provide useful, timely and accurate information to patients around the world,” said Munjal Shah, co-founder and CEO of Hippocratic AI. “NVIDIA ACE technologies bring them to life with cutting-edge visuals and realistic animations that help better connect with patients.”

Internal testing of Hippocratic’s first AI health workers focuses on chronic care management, wellness coaching, health risk assessment, social determinants of health surveys, pre-operative awareness and follow-up after hospital discharge.

UneeQ is an autonomous digital human platform focused on AI-powered avatars for customer service and interactive applications. UneeQ integrated the NVIDIA A2F microservice into its platform and combined it with its Synanim ML synthetic animation technology to create highly realistic avatars to improve customer experience and engagement.

“UneeQ combines NVIDIA animation AI with our own Synanim ML synthetic animation technology to deliver real-time digital human interactions that respond to emotions and deliver dynamic experiences powered by conversational AI,” said Danny Tomsett, founder and CEO of UneeQ.

AI in games

ACE is one of several AI technologies from NVIDIA that take gaming to the next level.

  • NVIDIA DLSS is a revolutionary graphics technology that uses AI to increase frames per second and improve image quality on GeForce RTX GPUs.
  • NVIDIA RTX Remix allows modders to easily capture game assets, automatically enhance materials with generative AI tools, and quickly create stunning RTX remasters with full ray tracing and DLSS.
  • NVIDIA Freestyle, accessible via the new NVIDIA app beta, allows users to customize the visual aesthetic of over 1,200 games through real-time post-processing filters, with features like RTX HDR, RTX Dynamic Vibrance and more.
  • The NVIDIA Broadcast app turns any room into a home studio, equipping livestreams with AI-enhanced voice and video tools, including noise and echo cancellation, virtual background and AI green screen, auto-frame, video noise removal and eye contact.

Artificial intelligence continues to evolve; its use in video games is already very real, and it will only grow in the years to come. NVIDIA GeForce RTX graphics cards are available from Amazon, Cdiscount and Fnac.
