Highlights

  • Nvidia's generative AI creates dynamic NPCs for games, adapting them based on player actions for unique experiences.
  • Real-world uses of Nvidia's technology include digital healthcare agents and customer service avatars.
  • Ubisoft and other game developers are leveraging Nvidia's generative AI technologies for immersive gameplay.

Recently, Nvidia showcased its generative AI-based digital human technologies that can create dynamic, lifelike characters for use in games and other applications. With help from participating studios, Nvidia showed demos that leveraged these AI technologies to create immersive interactions with computer-generated avatars.

"Digital human technologies" is an umbrella term covering Nvidia's Avatar Cloud Engine (ACE), NeMo, and RTX tech. Among them, ACE facilitates speech recognition and animation tasks, NeMo handles natural language processing, and RTX technology enables devs to render high-fidelity models. Nvidia has continued working on these technologies over the past few years and has occasionally shown them in action at events.


At GDC 2024, Covert Protocol, an Unreal Engine 5-based demo from Inworld AI and Nvidia, showcased modular NPCs that adapt to the player's actions, ensuring each player gets a unique playthrough. The demo uses Nvidia's microservices to detect player voice input and animate NPC dialogue. This dynamic NPC behavior is shaped by a character's personality, emotional state, and the context of the interaction with the player.
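The loop described above, where an NPC's reply depends on its personality, emotional state, and the history of the conversation, can be sketched in plain Python. This is purely illustrative: the class, field names, and keyword-matching logic below are invented for the example and do not reflect Nvidia's actual ACE or NeMo APIs, which run as cloud microservices backed by generative models.

```python
from dataclasses import dataclass, field

@dataclass
class AdaptiveNPC:
    """Toy model of a dynamic NPC: personality is fixed,
    mood and memory evolve with the conversation."""
    name: str
    personality: str
    mood: str = "neutral"
    memory: list = field(default_factory=list)

    def respond(self, player_line: str) -> str:
        # Remember the exchange so later replies can draw on context
        self.memory.append(player_line)
        # Stand-in for the generative model: the "prompt" combines
        # personality, current mood, and conversation history
        context = f"[{self.personality}/{self.mood}, turn {len(self.memory)}]"
        # Emotional state shifts in response to the interaction
        if "threat" in player_line.lower():
            self.mood = "hostile"
        return f"{self.name} {context} reacts to: '{player_line}'"

guard = AdaptiveNPC("Bloom", personality="cautious")
print(guard.respond("Where is the vault?"))
print(guard.respond("Tell me, or consider it a threat."))
print(guard.mood)  # the NPC now remembers and has turned hostile
```

In a real pipeline, the keyword check would be replaced by a language model conditioned on the same three inputs, and the returned text would drive speech synthesis and facial animation rather than a print statement.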

Besides game demos, other real-world uses of Nvidia's digital human technologies were shown at another recent event, GTC 2024. First off, Hippocratic AI revealed a digital healthcare agent designed to handle specific tasks such as care coordination and post-discharge management, built on Nvidia's ACE, Audio2Face, Omniverse Streamer Client, and other technologies. On top of that, UneeQ took the stage with a demo of its collaborative work on a customer service avatar that uses Nvidia's tech and Synanim ML.

Games Will Soon Use Nvidia’s Generative AI Technologies

Among the big names, Ubisoft games may feature generative AI-fueled NPCs in the future. The studio shared three tech demos of NEO NPCs, its own take on dynamic characters built using Inworld AI and Nvidia technologies. In these demos, the NEO NPCs Bloom and Iron demonstrated conversation memory, contextual awareness, and methodical decision-making, among other capabilities.

Apart from that, several announced games will leverage these generative AI technologies to engage players. World of Jade Dynasty from Perfect World Games is set to use Nvidia's Audio2Face technology to produce accurate lip-syncing for new languages. Adding to the list, the same technology will create accurate facial animations for Unawake by RealityArts Studio and Toplitz Productions. As Nvidia's generative AI technologies pick up pace, more titles could adopt them. It will be interesting to see to what degree these changes optimize the development pipeline.