Omniverse Audio2Face and Unreal Engine 5.2

Hello all,

I hope this message finds you well.

I am working on the MetaHuman LiveLink between Audio2Face and Unreal Engine.

I do not have any prior Python knowledge, just Unreal Engine Blueprints experience.

My question is: How do I know when the model is speaking?

So far I have set up a bool check on the JawOpen blend shape: if the float value changes, the character is talking. The problem is that sometimes the closed-jaw reference value matches JawOpen for a few frames, which breaks my check.
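
For reference, here is a minimal C++ sketch of the kind of check I am imagining instead of just watching for changes: a threshold plus a frame-count debounce, so the speaking flag only flips after JawOpen has stayed above or below the threshold for several consecutive frames and a one- or two-frame glitch is ignored. I am pasting C++ only because I cannot attach Blueprints here; the `SpeakingDetector` name, the 0.05 threshold, and the frame counts are all placeholder assumptions.

```cpp
// Minimal sketch of a debounced "is speaking" check, assuming the JawOpen
// blend-shape value arrives as a float in [0, 1] once per frame.
// Threshold and frame counts are illustrative placeholders.
#include <iostream>
#include <vector>

class SpeakingDetector
{
public:
    SpeakingDetector(float InThreshold = 0.05f, int InFramesToStart = 3, int InFramesToStop = 10)
        : Threshold(InThreshold), FramesToStart(InFramesToStart), FramesToStop(InFramesToStop) {}

    // Feed one JawOpen sample per frame; returns the current speaking state.
    bool Update(float JawOpen)
    {
        if (JawOpen > Threshold)
        {
            ++FramesAbove;
            FramesBelow = 0;
        }
        else
        {
            ++FramesBelow;
            FramesAbove = 0;
        }

        // Only flip the state after the value has stayed on one side of the
        // threshold for several frames, so a brief glitch where the closed-jaw
        // reference matches JawOpen does not toggle the flag.
        if (!bSpeaking && FramesAbove >= FramesToStart)
        {
            bSpeaking = true;
        }
        else if (bSpeaking && FramesBelow >= FramesToStop)
        {
            bSpeaking = false;
        }
        return bSpeaking;
    }

private:
    float Threshold;
    int FramesToStart;
    int FramesToStop;
    int FramesAbove = 0;
    int FramesBelow = 0;
    bool bSpeaking = false;
};

int main()
{
    // Fake JawOpen stream: silence, a one-frame glitch, real speech, silence.
    std::vector<float> Samples = {0.0f, 0.0f, 0.06f, 0.0f, 0.0f,
                                  0.12f, 0.2f, 0.15f, 0.18f, 0.1f,
                                  0.0f, 0.0f, 0.0f};
    SpeakingDetector Detector;
    for (float Sample : Samples)
    {
        std::cout << Sample << " -> " << (Detector.Update(Sample) ? "speaking" : "silent") << "\n";
    }
    return 0;
}
```

In Blueprints this would just be two integer counters, a float comparison against a threshold, and a branch that flips a bool once a counter reaches its limit.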

I am open to any suggestions and ideas :)

Have a great day!

Hi, welcome! Are you following any tutorials for this setup? What workflow/use case do you have?

Hello :) Thank you for the reply.

The only tutorial I have seen is the Audio2Face Unreal Engine LiveLink documentation:
https://docs.omniverse.nvidia.com/audio2face/latest/user-manual/livelink-ue-plugin.html

There are many detailed tutorials on YouTube that you can use along with the documentation. I am sharing one: https://www.youtube.com/watch?v=5ruccgRniWs&ab_channel=SolomonJagwe

Let me know if it helps with your use case.