Video games: AI automates dialogues and animations

AI in video game design

• Some video game designers have been using artificial intelligence, especially to generate text and interactions for non-player characters (NPCs).
• These tools automatically adapt the audio and written content of NPCs based on the players’ level and interactions, saving time for the designer.
• It is also possible to automate the animations of these NPCs and animate their facial expressions according to their dialogue and emotions.

The video game industry is increasingly ambitious in its demand for games with more environments, more non-player characters (NPCs) and ever more gameplay mechanics. There are now artificial intelligence tools that help development teams meet these challenges by automatically generating text, voices and even animations for NPCs. But how do they work in practice? Oriane Piedevache–Opsomer, co-founder of X&Immersion, explains.

Text automation

Let’s take the example of dialogue text generators. These tools can provide varied content and a more personalized game experience. For example, you arrive in a village and enter a tavern. If you are level one and have no money, the innkeeper will tell you to leave, but if you are level 20 and a recognized archmage, he will be delighted to welcome you and may even offer you a drink. How does it work? Dialogue and descriptions of the fictional world and its characters, along with general information about the game, are used to train an AI model. Using this model, we can generate dialogue for characters that is coherent with their world, their personalities and their emotions. The result is varied content that enables players to influence NPCs’ behaviour.
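To make this concrete, here is a minimal Python sketch of how such a generator might be conditioned: the NPC’s world lore and personality plus the player’s current state are assembled into a prompt and handed to whatever text-generation backend the studio uses. The names (`NpcProfile`, `build_dialogue_prompt`, `npc_reply`) are illustrative assumptions for this example, not X&Immersion’s actual API.

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class PlayerState:
    level: int
    gold: int
    titles: list[str]


@dataclass
class NpcProfile:
    name: str
    role: str
    personality: str
    world_lore: str


def build_dialogue_prompt(npc: NpcProfile, player: PlayerState, player_line: str) -> str:
    """Assemble the conditioning text: world lore, character sheet and player state."""
    titles = ", ".join(player.titles) or "none"
    return (
        f"World: {npc.world_lore}\n"
        f"Character: {npc.name}, a {npc.role}. Personality: {npc.personality}\n"
        f"Player: level {player.level}, {player.gold} gold, titles: {titles}\n"
        f'Player says: "{player_line}"\n'
        f"{npc.name} replies:"
    )


def npc_reply(npc: NpcProfile, player: PlayerState, player_line: str,
              generate: Callable[[str], str]) -> str:
    """`generate` is any text-generation backend (a model trained on the game's lore)."""
    return generate(build_dialogue_prompt(npc, player, player_line))


# The same innkeeper reacts differently to a level-one pauper and a level-20 archmage,
# because the prompt carries the player's state alongside the character sheet.
innkeeper = NpcProfile(
    name="Borin", role="innkeeper", personality="gruff, but respectful of rank",
    world_lore="The tavern sits in a frontier village plagued by bandits.",
)
broke_novice = PlayerState(level=1, gold=0, titles=[])
rich_archmage = PlayerState(level=20, gold=500, titles=["Archmage of the Circle"])
```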

Voice Synthesis

Voice synthesis technology makes it possible to give the innkeeper a voice and exploit the full range of AI-generated dialogue. It enables us to generate artificial voices for NPCs. How does it work? An actor performs the dialogue as if they were on stage in a theatre, and we then use the recording to teach the model the intonation and other features of the character’s voice.
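As an illustration, the sketch below shows the two sides of such a pipeline under assumed conventions: pairing the actor’s studio recordings with their transcripts to build training data, and synthesizing new lines through whatever voice model is trained on them. The helper names and the one-.wav-per-line layout are assumptions for the example, not a description of X&Immersion’s tooling.

```python
from pathlib import Path
from typing import Callable


def collect_voice_dataset(recordings_dir: Path) -> list[tuple[Path, str]]:
    """Pair each studio take of the voice actor with its transcript.

    Assumes one .wav file per recorded line and a .txt transcript with the same stem,
    e.g. innkeeper_012.wav / innkeeper_012.txt.
    """
    pairs: list[tuple[Path, str]] = []
    for wav in sorted(recordings_dir.glob("*.wav")):
        transcript = wav.with_suffix(".txt")
        if transcript.exists():
            pairs.append((wav, transcript.read_text(encoding="utf-8").strip()))
    return pairs


def synthesize_line(text: str, tts: Callable[[str], bytes]) -> bytes:
    """`tts` stands in for the voice model trained on the actor's recordings;
    it returns raw audio for a newly generated dialogue line."""
    return tts(text)
```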

Automated Animations

Now, let’s talk about animation. It is vital to the credibility of dialogue, because it shows how the exchange affects NPCs. Depending on what you say to him, the innkeeper’s expression can go from very happy to furious. Traditionally, all of this work has to be done by animators, who repeat the same tasks for every NPC, which makes the process slow and tedious. Fortunately, we now have tools to automate some of this work. We can take audio or text and use it to automatically generate mouth movements and facial expressions, enabling animators to work much faster when producing a game.
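A deliberately simplified Python sketch of that idea follows: timed phonemes extracted from the audio track are mapped to visemes (mouth shapes), and an emotion label is mapped to facial blend-shape weights that an animator would otherwise key by hand. The mapping tables and function names are illustrative assumptions; production pipelines use far richer phoneme sets and facial rigs.

```python
# Hypothetical, reduced mappings used only to illustrate the principle.
PHONEME_TO_VISEME = {
    "AA": "open", "IY": "wide", "UW": "round", "M": "closed", "F": "teeth_on_lip",
}

EMOTION_TO_BLENDSHAPES = {
    "happy": {"mouth_smile": 0.8, "brow_raise": 0.3},
    "furious": {"brow_furrow": 0.9, "jaw_clench": 0.6},
}


def lipsync_track(timed_phonemes: list[tuple[float, str]]) -> list[tuple[float, str]]:
    """Turn (timestamp, phoneme) pairs from the audio into (timestamp, viseme) keyframes."""
    return [(t, PHONEME_TO_VISEME.get(p, "neutral")) for t, p in timed_phonemes]


def expression_keys(emotion: str) -> dict[str, float]:
    """Blend-shape weights for a facial expression matching the line's emotion."""
    return EMOTION_TO_BLENDSHAPES.get(emotion, {})


# Example: the innkeeper's line is lip-synced, and his mood flips to furious.
print(lipsync_track([(0.00, "AA"), (0.12, "M"), (0.25, "IY")]))
print(expression_keys("furious"))
```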
