
Nvidia's AI NPCs offer life


Making immersive worlds that feel lived in is one of the big challenges for many modern game developers.


Games like 'Cyberpunk 2077' and 'Red Dead Redemption 2' advertise themselves as featuring massive, immersive open worlds that let players step into another existence and experience the hustle and bustle of towns and cities full of lifelike non-player characters (NPCs).

Talk to NPCs in many of those games, though, and the immersion quickly breaks: prompt them for conversation and they cycle through the same two or three dialogue lines on repeat.

Nvidia announced a potential solution to this problem at this year's CES with Nvidia ACE (Avatar Cloud Engine). According to the company, ACE will allow in-game NPCs to hold endless, never-repeating conversations thanks to AI-generated dialogue.


Paving the way for incredibly immersive game worlds

The new development from Nvidia will continue bringing video games eerily close to reality – to the point where you can see how some of the world's top thinkers believe we might be living in a simulation.

Nvidia demoed ACE at CES 2024 with an in-game scene set in a cyberpunk ramen bar featuring two NPCs – a chef called Jin and a customer called Nova. During the demo, the two hold an AI-generated conversation, meaning it isn't a scripted exchange triggered as the player approaches, as is typically the case in modern video games.

Not only that: while running the demo, Seth Schneider, senior product manager for ACE, conversed with the NPCs, and they replied in real time.

For the demo, Nvidia collaborated with Convai, a generative-AI NPC-creation platform. Convai lets developers write a backstory and personality traits for a character and choose its voice, so the NPC fits a cohesive character type within the game, as sketched below.
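Conceptually, that character setup amounts to a structured profile the dialogue model can draw on. The Python sketch below is a hypothetical illustration of such a profile; the field names, class, and example values are assumptions made for this article, not Convai's actual API.

```python
# Hypothetical character profile of the kind such a platform might accept.
# Structure and field names are illustrative assumptions, not Convai's API.
from dataclasses import dataclass, field


@dataclass
class NPCProfile:
    name: str
    backstory: str                                   # free-text history the dialogue model draws on
    traits: list[str] = field(default_factory=list)  # personality characteristics
    voice_id: str = "default"                        # preselected voice for speech synthesis


# Example character along the lines of the demo's ramen chef.
jin = NPCProfile(
    name="Jin",
    backstory="Owner of a small ramen shop in a crowded cyberpunk city.",
    traits=["warm", "talkative", "protective of regular customers"],
    voice_id="male_middle_aged_calm",
)
```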

The ACE platform, meanwhile, lets players speak with NPCs: it captures the player's speech and converts it into text for a large language model (LLM) to process, the same LLM generates the NPC's response, and an animation model produces lip movements while the NPC is talking.
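Put together, the flow described here is roughly speech in, text through an LLM, and voiced, animated speech out. The Python sketch below strings those stages together with stand-in stub functions; every name is a placeholder for illustration and none corresponds to Nvidia ACE's real interfaces.

```python
# Minimal sketch of the speech-to-NPC pipeline described above, with stubs
# standing in for speech recognition, the LLM, text-to-speech, and lip sync.

def speech_to_text(audio: bytes) -> str:
    """Stub: a real system would run automatic speech recognition here."""
    return "What's good on the menu tonight?"


def llm_reply(backstory: str, player_text: str) -> str:
    """Stub: a real system would prompt an LLM with the character's
    backstory plus the transcribed player speech."""
    return f"(in-character reply conditioned on {backstory!r} and {player_text!r})"


def synthesize_voice(text: str, voice_id: str) -> bytes:
    """Stub: a real system would run text-to-speech in the chosen voice."""
    return text.encode()


def animate_lips(audio: bytes) -> list[str]:
    """Stub: a real system would derive lip/facial animation from the audio."""
    return ["mouth_frame"] * (len(audio) // 8 + 1)


def npc_turn(backstory: str, voice_id: str, player_audio: bytes):
    player_text = speech_to_text(player_audio)            # 1. capture and transcribe speech
    reply_text = llm_reply(backstory, player_text)        # 2. LLM generates the response
    reply_audio = synthesize_voice(reply_text, voice_id)  # 3. voice the reply
    frames = animate_lips(reply_audio)                    # 4. lip movements while talking
    return reply_text, reply_audio, frames


if __name__ == "__main__":
    print(npc_turn("Jin runs a ramen shop in a neon-lit city.", "calm_male", b"..."))
```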

What is the future of NPC interactions in video games?

Of course, it's always best to take what we see in a tech demo with a grain of salt. Many triple-A games, including 'Cyberpunk 2077', showed off impressive interactions during early demos only to be marred by technical problems at release.

We know little about the technical requirements of a hypothetical game with this type of AI-generated dialogue. It might be some time before the average gaming PC or console is powerful enough to meet those requirements, meaning most game developers will likely stick with the standard model of NPCs with limited dialogue options for now.

Still, the ACE demo provides a glimpse of what NPC interactions might look like in video games of the future. You might soon be able to walk around a futuristic sci-fi world and strike up conversations with its many inhabitants.
