How to use real-time lip sync in Unreal Engine via Audio2Face

Hello, I’m interested in making a real-time lip sync character in Unreal Engine. I saw Audio2Face live lip sync and I want to use that feature in Unreal Engine. I found some guides, but they only cover sending animation from already recorded data. Does anyone know how to do it, or is it even supported?


This is the $64,000 question, and so far I have seen no answer to it, nor any convincing independent third-party demonstration of this.

See the 8K-member Virtual Beings Facebook group for more on this topic.

Thanks.

Hello @tuna83! I will need to ask the dev team about this, as I did not see a tutorial currently available.

I did find a video covering the Audio2Face Live Mic Mode, as well as one on setting up a streaming audio player, and have included them below!

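For reference, the streaming audio player in Audio2Face is fed over gRPC. Below is a minimal sketch based on the sample client that ships with Audio2Face; the stub module names (audio2face_pb2, audio2face_pb2_grpc), the default endpoint localhost:50051, the prim path /World/audio2face/PlayerStreaming, and the speech.wav input file are assumptions and may differ in your setup.

```python
# Minimal sketch: push audio to the Audio2Face streaming audio player over gRPC.
# Assumes the generated stubs from the Audio2Face sample client (audio2face_pb2,
# audio2face_pb2_grpc) are on the Python path, A2F is listening on localhost:50051,
# and the scene contains a streaming player at /World/audio2face/PlayerStreaming.
import time

import grpc
import numpy as np
import soundfile as sf

import audio2face_pb2
import audio2face_pb2_grpc


def push_audio_stream(url, audio_float32, samplerate, instance_name):
    """Stream float32 mono audio to the A2F streaming player in small chunks."""
    chunk_size = samplerate // 10  # roughly 100 ms of audio per chunk

    with grpc.insecure_channel(url) as channel:
        stub = audio2face_pb2_grpc.Audio2FaceStub(channel)

        def request_generator():
            # The first message declares the sample rate and the target player prim.
            start = audio2face_pb2.PushAudioRequestStart(
                samplerate=samplerate,
                instance_name=instance_name,
                block_until_playback_is_finished=False,
            )
            yield audio2face_pb2.PushAudioStreamRequest(start_marker=start)
            # Subsequent messages carry raw float32 PCM bytes.
            for i in range(0, len(audio_float32), chunk_size):
                chunk = audio_float32[i : i + chunk_size]
                yield audio2face_pb2.PushAudioStreamRequest(
                    audio_data=chunk.astype(np.float32).tobytes()
                )
                time.sleep(0.04)  # pace the stream roughly in real time

        response = stub.PushAudioStream(request_generator())
        print("A2F response:", response)


if __name__ == "__main__":
    data, rate = sf.read("speech.wav", dtype="float32")  # hypothetical input file
    if data.ndim > 1:
        data = data[:, 0]  # the streaming player expects mono audio
    push_audio_stream("localhost:50051", data, rate, "/World/audio2face/PlayerStreaming")
```

For live use, the same generator can be fed from a microphone buffer or a TTS stream instead of a file.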

You can also use the Riva Extension to connect to the Riva TTS server.
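If you go the Riva route, the TTS output can also be streamed from Python and converted into the float32 PCM that the streaming player expects. This is only a sketch assuming the nvidia-riva-client package and a Riva server reachable at localhost:50051; the voice name is a placeholder, so query your deployment for the voices it actually serves.

```python
# Sketch: stream Riva TTS audio and convert it for the A2F streaming player.
# Assumes the nvidia-riva-client package and a Riva server at localhost:50051.
import numpy as np
import riva.client

auth = riva.client.Auth(uri="localhost:50051", use_ssl=False)
tts = riva.client.SpeechSynthesisService(auth)

sample_rate_hz = 44100
responses = tts.synthesize_online(
    text="Hello, this is a live lip sync test.",
    voice_name="English-US.Female-1",  # placeholder; list the voices on your server
    language_code="en-US",
    sample_rate_hz=sample_rate_hz,
)

for resp in responses:
    # Riva returns 16-bit PCM; the streaming player wants float32 in [-1, 1].
    pcm = np.frombuffer(resp.audio, dtype=np.int16).astype(np.float32) / 32768.0
    # Forward `pcm` to a gRPC push like the one sketched above as each chunk arrives.
```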

Hi @tuna831,
Interesting subject. This feature has surfaced in internal discussions a few times recently.
Right now there’s no option to do that, as some features are missing on the app side.
But we are working on adding those features now, and they are being tested internally, so hopefully this will be possible in the next release.

This is now possible using the Audio2Face UE Livelink plugin, available in 2023.1.1.
See the instructions here: Audio2Face 2023.1.1 (Open Beta) Released - Apps / Audio2Face - NVIDIA Developer Forums


Thank you for your service!
