Audio2Face Unreal MetaHuman: Live Link vs Blueprint setup difference

Hey guys!

We had a project in Unreal 5.2 using MetaHumans and Audio2Face to drive the facial animations through Live Link, but then moved to 5.3 using the blueprint solution (Audio2Face to UE Live Link Plugin, Omniverse Audio2Face documentation), as our end goal is a packaged application.

The quality of the facial animation in 5.3 using the blueprint solution is visibly much worse. I was just wondering if this is a known problem or if I have done something wrong?

Thanks!

Here is a good place to start. There are lots of docs on this, and you can search our YouTube channel.
https://docs.omniverse.nvidia.com/audio2face/latest/user-manual/livelink-ue-plugin.html

@Richard3D Hi Richard! I have them both set up already. What I'm asking is: why do the results from the blueprint setup in 5.3 look so much worse than the Live Link connection in 5.2?

And is there a way I can fix this?

Well, go with the better version. If 5.2 is better, then save that project and bake out the solution. Then you can always upgrade the UE project to 5.3 or 5.4.

I would, but I need to use the 5.3 version as the application needs to be packaged (A2F only works packaged on 5.3), so I'm trying to find out why it looks so much worse, so I can investigate and fix the problem.


I have found that multiplying the animation effect massively can help. In Sequencer the default multiplier is 1; try a much larger number and see if that helps.

@snahbah Is there a way of doing this at runtime? I don't use Sequencer in this application; the Audio2Face animations are being fed through in real time.
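For reference, something like the sketch below is what I had in mind as a runtime equivalent of the Sequencer multiplier. It is only a rough idea, not part of the A2F plugin: it assumes the MetaHuman Face Anim Blueprint can be reparented to a custom UAnimInstance and that a stock Modify Curve node (Apply Mode: Scale) is added after the node that receives the Audio2Face / Live Link curves, with its curve pins bound to the multiplier property. The class name, property name, and default value are placeholders.

```cpp
// FaceCurveBoostAnimInstance.h
// Minimal sketch (not the official Audio2Face plugin API): a custom anim
// instance that exposes a runtime multiplier for the incoming facial curves.
// A Modify Curve node (Apply Mode: Scale) in the Face AnimGraph is expected
// to read this value and scale the curves driven by Audio2Face.

#pragma once

#include "CoreMinimal.h"
#include "Animation/AnimInstance.h"
#include "FaceCurveBoostAnimInstance.generated.h"

UCLASS()
class UFaceCurveBoostAnimInstance : public UAnimInstance
{
    GENERATED_BODY()

public:
    // Runtime equivalent of the Sequencer multiplier: 1.0 leaves the incoming
    // facial curves untouched; larger values exaggerate the animation.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Audio2Face")
    float FacialCurveMultiplier = 1.0f;

    // Optional helper so gameplay code can adjust the strength while the
    // Audio2Face stream is playing.
    UFUNCTION(BlueprintCallable, Category = "Audio2Face")
    void SetFacialCurveMultiplier(float NewMultiplier)
    {
        FacialCurveMultiplier = FMath::Max(0.0f, NewMultiplier);
    }
};
```

Since the Modify Curve node lives in the AnimGraph rather than Sequencer, the multiplier can be changed on the fly from Blueprint or C++ while the animation is streaming in.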

Can you please send me everything you have as a complete package we can analyse? I also want to see a video from you showing the differences you are seeing between 5.2 and 5.3. Then we can try to recreate it.