Audio2Face standalone application deployment

Dear all,

Greetings! I am building a ChatGPT application in Python and I am interested in exploring Audio2Face (A2F) from Omniverse. I am completely new to this. I would like to create a talking chatbot driven by my streaming .wav file. However, I cannot find any documentation or tutorial on deploying A2F in a standalone application. I have heard about ACE, but I would appreciate it if anyone could give me guidance on exporting A2F to an application. Does this mean we need to export the files to an engine like Unreal or Unity? Thanks for your time.

Hi there,

Can you elaborate on this a little? Is this for a phone app, a desktop app, or a web app, for example?

You should be able to use the REST API and the WebRTC Kit extension to stream video to any web browser.
WebRTC Browser Client — Omniverse Extensions documentation (nvidia.com)
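As a starting point, here is a minimal sketch of driving A2F from an external Python script over its headless REST interface. It assumes the default headless server port 8011; the endpoint paths and request-body fields shown are assumptions based on typical A2F builds, so verify them against the Swagger page your A2F instance serves (usually at /docs).

```python
import json
import urllib.request

A2F_URL = "http://localhost:8011"  # default A2F headless port (assumption); see /docs

def a2f_payload(file_name: str, player: str = "/World/audio2face/Player") -> dict:
    """Body for a player SetTrack request. Field names are assumptions --
    verify them against the Swagger page of your A2F build."""
    return {"a2f_player": player, "file_name": file_name, "time_range": [0, -1]}

def a2f_post(endpoint: str, body: dict) -> bytes:
    """POST a JSON body to the A2F headless REST server and return the raw reply."""
    req = urllib.request.Request(
        A2F_URL + endpoint,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```

With a running headless A2F instance you would then call something like `a2f_post("/A2F/Player/SetTrack", a2f_payload("response.wav"))` followed by `a2f_post("/A2F/Player/Play", {"a2f_player": "/World/audio2face/Player"})`, while the WebRTC client handles the video side.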

Creating a custom app can be achieved using ACE: Omniverse Avatar Cloud Engine (ACE) | NVIDIA Developer.

While it is not yet available to the public, you may register for early access here: NVIDIA Omniverse ACE Early Access Program | NVIDIA Developer

You will probably need to convert text to audio, which can be done with Riva: Speech AI SDK - Riva | NVIDIA

Sure! I am working on a care robot for the elderly that helps them carry items from point to point. I use ROS and Python code for the robot, together with the ChatGPT API. Therefore, I want a standalone interface to show the face of ChatGPT (I use the Flet library to build a Python UI). Is using the REST API and the WebRTC Kit extension the way for me to stream the A2F interface into my Python UI? If so, I may need to embed a web client in my UI.

To be clear, I have already built a ROS 2 robot that can be controlled by ChatGPT responses. Now I want to give an A2F face to the voice created by GPT's text-to-speech. Thank you so much for reading.

For now, I am generating a .wav file from Python scripts to create a voice for GPT's responses. I then stream the .wav file into A2F using the test_client.py functions, which is still far from running on the robot.
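For anyone following along: the streaming sample pushes the audio buffer to A2F in small fixed-duration pieces rather than all at once. The real test_client.py also handles the gRPC connection and the soundfile decoding; the sketch below shows only the chunking idea, and the 40 ms chunk duration is an assumption, not the sample's exact value.

```python
import numpy as np

def audio_chunks(samples: np.ndarray, sample_rate: int, chunk_seconds: float = 0.04):
    """Split a mono float32 audio buffer into fixed-duration pieces, one per
    streaming message, in the spirit of A2F's test_client.py sample."""
    step = max(1, int(sample_rate * chunk_seconds))
    for start in range(0, len(samples), step):
        yield samples[start:start + step]
```

Each yielded chunk would then be sent as one message on the A2F streaming-audio gRPC call, which keeps latency low enough for a live robot interaction.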

What I want to do is something like Violet.