Ubisoft’s Neo NPC Prototype Is a Glimpse at the Future of How We’ll Interact With Games - IGN
Today Assassin's Creed maker Ubisoft unveiled what it’s calling 'Neo NPCs', a prototype for player-facing generative AI that lets you talk – literally speak with your voice – to NPCs in a video game.
At the Game Developers Conference in San Francisco, I got to play through a short proof-of-concept demo putting this tech into action – and it feels like a glimpse at the next great frontier of innovation in video game development.
In the demo, I stepped into the shoes of a new recruit to a resistance group that was planning an operation to “take down the mega corps.” I started out chatting with a character named Bloom, learning about the group’s activities and picking up personal details about a few of its members. Fairly standard video game fare, to be honest – but where this type of conversation would typically be driven by a branching dialogue tree, here it took place as a verbal conversation between the NPC and myself. Speaking into a microphone, I asked Bloom things like how long he’d been in the resistance, whether he was worried about people getting hurt, or whether he had a crush on any other members. Bloom would answer each question in turn, driving the conversation forward while also filling a little relationship meter that would unlock more personal answers once we reached level two.
Later in the demo, Bloom and I watched a video feed of a drone surveilling guards, showing off the tech’s ability to react to things happening in-game. In this case, I could ask Bloom how many guards had been spotted so far, or what was happening when the drone started to drift out of signal range. Finally, while planning a mission, I chatted with a different NPC who was set on a certain plan but changed her mind after I presented an alternate course of action. We discussed various options for entering a second-story balcony, the best way to eliminate the guards, and how to deal with the security cameras. It felt like being dropped into the middle of a heist movie as the crew discussed options and weighed which path to take before setting their plans in motion.
Neo NPCs are built using tech from Nvidia and Inworld AI, allowing Ubisoft’s developers to craft a narrative experience with AI running under the hood. The tech itself is essentially a stack of several different AI implementations: a speech-to-text analyzer (turning your spoken words into a text input); Inworld AI’s Character Engine, a large language model that determines how the NPC will act and respond; a text-to-speech program that takes that response and turns it into spoken audio; and finally Nvidia’s Audio2Face technology, a real-time facial animation system that makes the NPC’s lips and mannerisms match the words being spoken aloud.
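To make the flow of that stack a little more concrete, here is a minimal, hypothetical sketch of a single conversational turn in Python. None of these function names correspond to the actual Nvidia or Inworld APIs; every stage is a stub standing in for the real component described above, just to show how the four pieces chain together.

```python
# Hypothetical sketch of one Neo NPC conversational turn.
# All functions below are stubs, not the real Nvidia/Inworld APIs.

from dataclasses import dataclass


@dataclass
class NPCResponse:
    text: str          # what the NPC says
    audio: bytes       # synthesized speech
    facial_anim: dict  # animation data for lips and expressions


def speech_to_text(mic_audio: bytes) -> str:
    """Stage 1: transcribe the player's spoken words (stubbed)."""
    return "How long have you been in the resistance?"


def character_engine(player_text: str, npc_profile: dict, history: list) -> str:
    """Stage 2: a large language model decides how the NPC responds (stubbed)."""
    history.append(player_text)
    return f"{npc_profile['name']}: Long enough to know the risks."


def text_to_speech(npc_text: str) -> bytes:
    """Stage 3: synthesize the NPC's spoken reply (stubbed)."""
    return npc_text.encode("utf-8")


def audio_to_face(npc_audio: bytes) -> dict:
    """Stage 4: derive real-time facial animation from the audio (stubbed)."""
    return {"viseme_frames": len(npc_audio), "blink_rate": 0.2}


def neo_npc_turn(mic_audio: bytes, npc_profile: dict, history: list) -> NPCResponse:
    """One turn: microphone input in, animated spoken NPC reply out."""
    player_text = speech_to_text(mic_audio)
    npc_text = character_engine(player_text, npc_profile, history)
    npc_audio = text_to_speech(npc_text)
    anim = audio_to_face(npc_audio)
    return NPCResponse(text=npc_text, audio=npc_audio, facial_anim=anim)


if __name__ == "__main__":
    history: list = []
    reply = neo_npc_turn(b"raw-mic-bytes", {"name": "Bloom"}, history)
    print(reply.text)
```

The point of the sketch is simply that each stage feeds the next in real time, so latency anywhere in the chain is felt directly in the conversation.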
The result is something that feels almost like playing a tabletop role-playing game, having a discussion with your DM to drive the conversation and story forward together.
Nvidia showed off a similar, albeit less ambitious, demo at CES earlier this year where you chat with an NPC in a cyberpunky ramen shop, and another at GTC yesterday where you play as a detective trying to gain information about a biotech CEO from his colleague in a hotel lobby. Of the three demos I’ve experienced, Neo NPCs feel the most fully formed. This makes sense – neither Nvidia nor Inworld AI claims to be a game developer, so their demos feel significantly more like proofs of concept, whereas Ubisoft’s version leverages the company’s narrative and game design prowess.
Of course, Neo NPCs are still just a prototype tech demo, so full game experiences designed around this kind of generative AI are still a long way off. Still, seeing the tech in action has me excited for the future of gaming in a way I can’t say I’ve felt in quite a while. Done right, this is the kind of tech that could lead to entirely new kinds of games and experiences, much as the jump from 2D to 3D, or from closed to open game worlds, did.