I want to allow users to talk to a MetaHuman avatar in real time using GPT-4 and Audio2Face for lip sync, and I need to know the optimal way to set this up in an iOS mobile app to avoid latency issues. Is there an example iOS app that already does this that I can reference to see how well it performs? I saw the Hippocratic AI demo, but that appears to be streamed in the web browser.
My app is already built using native iOS, so I can’t use Unity or Unreal Engine. It seems that streaming would be the only option, but I’m wondering if it makes more sense to stream the entire avatar from an NVIDIA server or just the animation data.
Does ACE provide the ability to easily stream a MetaHuman (either the entire avatar or the Audio2Face animation data for blend shapes) in real-time on an iOS mobile app without using Unreal Engine or Unity? If not, is it possible to set up a photorealistic real-time avatar from MetaHumans in an iOS app? I couldn’t find any documentation for this.
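For context, here is roughly what I imagine the client side would look like if only the animation data were streamed (a minimal sketch, assuming a SceneKit-based renderer on the device and a hypothetical WebSocket endpoint that sends blend shape weights as JSON; I don’t know whether ACE actually exposes Audio2Face output this way):

```swift
import Foundation
import SceneKit

// Hypothetical payload: one frame of blend shape weights keyed by morph target name.
struct BlendShapeFrame: Decodable {
    let timestamp: Double
    let weights: [String: Float]   // e.g. "jawOpen": 0.42
}

final class AnimationStreamClient {
    private var task: URLSessionWebSocketTask?
    private let faceNode: SCNNode   // its geometry must have an SCNMorpher with matching targets

    init(faceNode: SCNNode) {
        self.faceNode = faceNode
    }

    // Connect to a (hypothetical) server that streams blend shape weights per frame.
    func connect(to url: URL) {
        task = URLSession.shared.webSocketTask(with: url)
        task?.resume()
        receiveNextFrame()
    }

    private func receiveNextFrame() {
        task?.receive { [weak self] result in
            guard let self = self else { return }
            if case .success(.string(let json)) = result,
               let data = json.data(using: .utf8),
               let frame = try? JSONDecoder().decode(BlendShapeFrame.self, from: data) {
                self.apply(frame)
            }
            self.receiveNextFrame()   // keep listening for subsequent frames
        }
    }

    // Apply the received weights to the SceneKit morpher on the main thread.
    private func apply(_ frame: BlendShapeFrame) {
        DispatchQueue.main.async {
            for (name, weight) in frame.weights {
                self.faceNode.morpher?.setWeight(CGFloat(weight), forTargetNamed: name)
            }
        }
    }
}
```

The catch, of course, is that the visual quality then depends entirely on whatever mesh I can render on the device, which is part of why I’m asking whether ACE supports this kind of setup for MetaHumans.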
I’m also curious why NVIDIA support reps seem selective about which questions they respond to. I noticed that a lot of questions on the ‘digital humans’ forum go unanswered. Is there another way to get support other than these forums?
I suspect you are looking at a cutting-edge application. The people who have gotten latency down probably won’t just share their knowledge; it cost them a lot to develop. I would go into such a project expecting to do a lot of self-research.
Also, I would check the licensing. From what I have heard, MetaHumans are only licensed for use with Unreal Engine, but I am not a lawyer and have not read the licensing agreement. That may explain the lack of documentation and examples.
I suspect either streaming video or streaming animation data should work, but they have different cost structures: streaming animation needs far less data transfer and less GPU power in the cloud (rough numbers below). Lots of questions come up, however: how many concurrent users are you going to have? Are users on Wi-Fi or cellular? And so on.
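As a very rough illustration of the data-transfer difference (my own assumed numbers, not measurements: 52 ARKit-style blend shape coefficients per frame at 4 bytes each and 30 fps, versus a modest 720p H.264 video stream at roughly 2 Mbps):

```swift
// Back-of-envelope bandwidth comparison (assumed numbers, not measurements).
let blendShapeCount = 52          // ARKit-style face coefficients per frame
let bytesPerWeight = 4            // Float32
let framesPerSecond = 30

let animationKbps = Double(blendShapeCount * bytesPerWeight * framesPerSecond) * 8 / 1_000
let videoKbps = 2_000.0           // ~2 Mbps for a modest 720p H.264 stream

print("Animation data: ~\(animationKbps) kbit/s")   // ~50 kbit/s
print("Rendered video: ~\(videoKbps) kbit/s")       // ~2,000 kbit/s, roughly 40x more
```

The animation route also keeps the heavy rendering off the cloud GPU, but it pushes the rendering problem onto the phone.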
Are you a paying NVIDIA ACE customer? (I am not.) They may have direct support channels; I don’t think there is any guaranteed support for non-paying customers. I also suspect the more specific the request, the more likely it is to get a response.
Have you checked out the NVIDIA ACE documentation (NVIDIA ACE — ACE documentation)? The first diagram shows that rendering in the cloud is possible.
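If you go the cloud-rendering route, the client side can be very thin. A minimal sketch, assuming the server exposes the rendered avatar as an HLS stream (I don’t know whether ACE delivers it this way; the URL below is a placeholder):

```swift
import UIKit
import AVFoundation

// Plays a cloud-rendered avatar stream in a native iOS view controller.
final class AvatarStreamViewController: UIViewController {
    private let player = AVPlayer()
    private let playerLayer = AVPlayerLayer()

    override func viewDidLoad() {
        super.viewDidLoad()
        playerLayer.player = player
        playerLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(playerLayer)

        // Placeholder endpoint standing in for wherever the rendered MetaHuman is served from.
        if let url = URL(string: "https://example.com/avatar/stream.m3u8") {
            player.replaceCurrentItem(with: AVPlayerItem(url: url))
            player.play()
        }
    }

    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        playerLayer.frame = view.bounds
    }
}
```

Bear in mind that HLS typically adds several seconds of latency; for a conversational avatar you would probably need WebRTC or another low-latency transport, which is more plumbing on both ends.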
Thanks for sharing your thoughts. I’m not a customer yet.
Yes, I will continue reading through the documentation.
For a company as big as NVIDIA, I would have thought they’d offer prompt support to everyone looking to implement their technology and cloud services, since I imagine that would ultimately translate into paying users.