Audio2Face and Unreal Engine Live Link Connection

I am trying to develop a solution in Unreal Engine that automatically renders lip-sync animation generated by Audio2Face.
Is there a suitable option for this?

I want to use Unreal Engine’s Omniverse Live Link plugin to receive the facial animation and insert it into a sequence for rendering.
Is this possible in the current release version?

Alternatively, I can import the lip-sync cache USD file from A2F into Unreal Engine via the Omniverse animation import option in UE. However, this can only be done manually or in editor utility mode, not at runtime.

I hope I can import or insert the facial animation into a sequence and render it at runtime.

Best regards.
Thanks for your consideration.

I’m not proficient with Unreal Engine, but if your workflow allows using geometry caches, then it’s quite straightforward: just export the geometry cache as USD from Audio2Face and import it directly into Unreal Engine.

But if you have your MetaHuman rig inside Unreal Engine and would like to stream blendShape weight animations into it, then you could use the Audio2Face Live Link Plugin for Unreal Engine. Here are a few videos and docs:
Audio2Face to MetaHuman Blendshape Streaming - Part 1 - YouTube
Audio2Face to MetaHuman Blendshape Streaming - Part 2 (youtube.com)
Audio2Face to UE Live Link Plugin — Omniverse Audio2Face latest documentation (nvidia.com)

But if you have a custom (non-MetaHuman) character in Unreal Engine and would like to apply facial animations exported from Audio2Face to it, it gets tricky. You’d need to remap the exported weights before you can apply them to your custom character. This is something I do not know how to do, but if you’re proficient with UE, you should be able to work it out.
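To illustrate what that remapping step could look like, here is a minimal sketch in Python. It assumes you have one frame of exported weights as a name-to-value dictionary; the mapping table and the custom morph target names (`Jaw_Open`, `Smile_L`, `Smile_R`) are purely hypothetical placeholders, not names Audio2Face or your rig actually uses.

```python
# Hypothetical sketch: translate Audio2Face blendshape names to a custom
# character's morph target names. The name pairs below are illustrative
# examples only; replace them with your rig's actual morph targets.
A2F_TO_CUSTOM = {
    "jawOpen": "Jaw_Open",
    "mouthSmileLeft": "Smile_L",
    "mouthSmileRight": "Smile_R",
}

def remap_weights(a2f_frame):
    """Remap one frame of weights {a2f_name: float} onto the custom
    rig's naming; channels without a mapping entry are dropped."""
    return {
        A2F_TO_CUSTOM[name]: weight
        for name, weight in a2f_frame.items()
        if name in A2F_TO_CUSTOM
    }

frame = {"jawOpen": 0.8, "mouthSmileLeft": 0.2, "browInnerUp": 0.1}
print(remap_weights(frame))  # {'Jaw_Open': 0.8, 'Smile_L': 0.2}
```

In practice you would run a translation like this per frame (either offline on the exported animation data, or in an Anim Blueprint / Control Rig on the UE side) and decide deliberately whether unmapped channels should be dropped, as here, or approximated from a combination of your character's other shapes.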