Artificial intelligence is the latest craze in the tech industry, and it’s in full swing everywhere: solutions that talk to us, generate images and much more. NVIDIA has just presented Avatar Cloud Engine (ACE), a solution that brings artificial intelligence to non-player characters (NPCs) in video games.
One of the big proponents of this emerging technology is NVIDIA. For all the talk about artificial intelligence, it has really only been with us for a short time. Its rise has been explosive, and developers everywhere are rushing to integrate it and build new features on top of it.
It seems this technology will soon reach video games, powering those characters you meet in every game and who, so far, contribute little or nothing.
“Smart” NPCs thanks to NVIDIA
Until now, the characters you can’t control deliver a canned message and move on. No matter how many times you talk to them, they repeat the same bland, flat line and go on their way. That could change soon and open up a whole new world of possibilities.
Enter NVIDIA ACE. This tool, designed for software and game developers, lets them create and integrate AI-powered voice models. It also means we will be able to hold real conversations with characters who, until now, have had little to offer.
This solution was made possible through a partnership with Convai, a company working on conversational AI for online games.
This model is based on three elements; a conceptual sketch of how they fit together follows the list:
- NVIDIA NeMo: lets you create, customize, and deploy language models using proprietary data. They can be tailored with a character’s backstory and guarded against negative or dangerous conversations using NeMo Guardrails.
- NVIDIA Riva: automatic speech recognition and text-to-speech, used to hold live voice conversations.
- NVIDIA Omniverse Audio2Face: automatically creates expressive facial animation for a character to match any voice track. Audio2Face connects to Unreal Engine 5 through Omniverse, letting developers add facial animation to MetaHuman characters.
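
To make the pipeline clearer, here is a minimal, purely illustrative Python sketch of how a single dialogue turn could be wired together. The function names (transcribe_speech, generate_reply, synthesize_voice, animate_face) are placeholders of our own, standing in for Riva ASR, a backstory-conditioned NeMo language model with Guardrails, Riva TTS and Audio2Face; they are not real NVIDIA API calls.

```python
from dataclasses import dataclass

@dataclass
class NPC:
    name: str
    backstory: str  # used to condition the language model (NeMo)

def transcribe_speech(audio: bytes) -> str:
    """Placeholder for automatic speech recognition (Riva ASR)."""
    raise NotImplementedError

def generate_reply(npc: NPC, player_text: str) -> str:
    """Placeholder for a backstory-conditioned language model (NeMo),
    with unsafe topics filtered out (NeMo Guardrails)."""
    raise NotImplementedError

def synthesize_voice(text: str) -> bytes:
    """Placeholder for text-to-speech (Riva TTS)."""
    raise NotImplementedError

def animate_face(voice_audio: bytes) -> None:
    """Placeholder for driving facial animation from audio (Audio2Face)."""
    raise NotImplementedError

def npc_dialogue_turn(npc: NPC, player_audio: bytes) -> bytes:
    """One conversational turn: player speech in, NPC speech (plus animation) out."""
    player_text = transcribe_speech(player_audio)   # 1. hear the player
    reply_text = generate_reply(npc, player_text)   # 2. generate an in-character answer
    reply_audio = synthesize_voice(reply_text)      # 3. give the answer a voice
    animate_face(reply_audio)                       # 4. move the character's face to match
    return reply_audio
```

The point of the sketch is simply the order of the stages: speech recognition feeds a language model shaped by the character’s backstory, whose answer is then voiced and lip-synced.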
The company presented a demonstration of an AI-generated conversation with an NPC, in which the character answers story-related questions using generative AI.
This tool is another NVIDIA innovation for the video game industry. Now all that remains is for developers to start integrating it into their games.
Which games will integrate it?
It seems the first game to officially adopt it will be S.T.A.L.K.E.R. 2: Heart of Chornobyl, which would be the first major title to use Audio2Face. In addition, Fort Solis, a game from indie studio Fallen Leaf, will also use this technology.
It’s not entirely clear whether this technology runs entirely in the cloud or locally. Keep in mind that NVIDIA graphics cards include Tensor Cores, cores dedicated to AI workloads. We imagine that, if it runs locally, the performance hit won’t be as heavy as that of ray tracing.
We’ll see whether AMD comes up with an alternative to this technology. Bear in mind that its graphics cards do not (at least for now) include dedicated cores for artificial intelligence.