Now You Can Create Lifelike Avatars with AI Animation and Speech in NVIDIA ACE


NVIDIA has updated the NVIDIA Avatar Cloud Engine (ACE) with new animation and speech capabilities for AI-powered avatars and digital humans. These enhancements focus on natural conversations and emotional expressions. 

Developers now have access to cloud APIs for automatic speech recognition (ASR), text-to-speech (TTS), neural machine translation (NMT), and Audio2Face (A2F). These tools, available through the early access program, enable creators to build advanced avatar experiences using popular rendering tools like Unreal Engine 5.
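To illustrate what a cloud speech pipeline of this shape might look like, the sketch below assembles requests for hypothetical ASR, NMT, and TTS endpoints. The base URL, field names, and `build_request` helper are placeholders for illustration only, not the actual ACE API.

```python
import json

# Hypothetical base URL and payload shapes -- placeholders, not the real ACE API.
ACE_BASE = "https://api.example.com/ace/v1"

def build_request(service: str, payload: dict) -> dict:
    """Assemble a JSON-serialisable request envelope for one microservice."""
    return {"url": f"{ACE_BASE}/{service}", "body": payload}

# ASR: audio in, transcript out; NMT: translate; TTS: synthesise speech.
asr = build_request("asr", {"encoding": "LINEAR_PCM", "sample_rate_hz": 16000})
nmt = build_request("nmt", {"source_lang": "en-US", "target_lang": "de-DE"})
tts = build_request("tts", {"voice": "female-1", "sample_rate_hz": 44100})

for req in (asr, nmt, tts):
    print(json.dumps(req))
```

In a real integration, each envelope would be sent over gRPC or HTTPS and the output of one stage (the ASR transcript, the translated text) would feed the next.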

The ACE AI animation features now include emotion support in A2F and an Animation Graph microservice for body, head, and eye movements. These additions aim to create more expressive digital humans. A new microservice handles rendering and real-time inference, and A2F quality improvements sharpen lip sync for more realistic digital humans.

The ACE suite now supports additional languages, including Italian, European Spanish, German, and Mandarin, with improved ASR accuracy. The cloud APIs simplify access to Speech AI features, and the new Voice Font microservice allows customisation of TTS outputs, enabling unique voices for different applications.

ACE Agent, a new dialog management and system integration tool, orchestrates connections between microservices for a seamless experience. Developers can now integrate NVIDIA NeMo Guardrails, NVIDIA SteerLM, and LangChain for more controlled and accurate responses.
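The orchestration role described above can be sketched as a single conversational turn flowing through stand-in microservices. Every function below is a hypothetical placeholder, not the real ACE Agent, Riva, or NeMo Guardrails API; the sketch only shows the control flow a dialog orchestrator of this kind would manage.

```python
# Minimal orchestration sketch: ASR -> guardrail check -> dialog -> TTS.
# All functions are stand-ins for the microservices the article describes.

def asr(audio: bytes) -> str:
    """Stand-in for the speech-recognition microservice."""
    return audio.decode("utf-8")  # pretend the audio is already a transcript

def guardrail(text: str) -> bool:
    """Stand-in for a NeMo Guardrails-style safety check."""
    return "forbidden" not in text.lower()

def dialog(text: str) -> str:
    """Stand-in for the LLM-backed dialog manager."""
    return f"You said: {text}"

def tts(text: str) -> bytes:
    """Stand-in for the text-to-speech microservice."""
    return text.encode("utf-8")

def run_turn(audio: bytes) -> bytes:
    """Orchestrate one conversational turn across the microservices."""
    transcript = asr(audio)
    if not guardrail(transcript):
        return tts("I can't help with that.")
    return tts(dialog(transcript))

print(run_turn(b"hello avatar"))  # b'You said: hello avatar'
```

The value of an orchestrator in this position is that each stage can be swapped independently, e.g. replacing the dialog stand-in with a LangChain chain, without touching the rest of the pipeline.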

The updates make it easier to use these tools in various rendering and coding environments. New features include support for blendshapes within the Avatar configurator for integration with renderers like Unreal Engine, a new A2F application for Python users, and a reference application for developing virtual assistants in customer service.
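Blendshape-based facial animation, which the Avatar configurator now supports for renderers like Unreal Engine, is in general a weighted sum of per-target vertex offsets over a neutral base mesh. The toy example below shows that standard formula; it is generic blendshape math, not ACE-specific code, and the target names are made up.

```python
# Standard blendshape evaluation: result = base + sum_i w_i * (target_i - base).
# Vertices are flat lists of floats here; a real pipeline would use mesh arrays.

def apply_blendshapes(base, targets, weights):
    """Blend per-target offsets into the base mesh by their weights."""
    result = list(base)
    for target, w in zip(targets, weights):
        for i, (t, b) in enumerate(zip(target, base)):
            result[i] += w * (t - b)
    return result

base = [0.0, 0.0, 0.0]
smile = [1.0, 0.0, 0.0]     # hypothetical "smile" target
jaw_open = [0.0, 2.0, 0.0]  # hypothetical "jaw open" target

print(apply_blendshapes(base, [smile, jaw_open], [0.5, 0.25]))
# → [0.5, 0.5, 0.0]
```

A speech-driven system such as A2F would supply the per-frame weights, and the renderer would evaluate this sum on the full face mesh each frame.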

The post Now You Can Create Lifelike Avatars with AI Animation and Speech in NVIDIA ACE appeared first on Analytics India Magazine.

Disclaimer

We strive to uphold the highest ethical standards in all of our reporting and coverage. We at StartupNews.fyi want to be transparent with our readers about any potential conflicts of interest that may arise in our work. It’s possible that some of the investors we feature may have connections to other businesses, including competitors or companies we write about. However, we want to assure our readers that this will not have any impact on the integrity or impartiality of our reporting. We are committed to delivering accurate, unbiased news and information to our audience, and we will continue to uphold our ethics and principles in all of our work. Thank you for your trust and support.


