How to set up the Alien head model for Audio2Face lip syncing in Unreal

Hello, I can’t seem to find any information about this online, but I would like to use the Alien head model that comes with Audio2Face for a project in Unreal Engine. Is this possible? I’d love to set it up using the same process we use for Audio2Face to MetaHuman. Is this something that’s doable?

Thanks!

Yes, very easy. Just install the UE Connector for 5.3 (we don’t have 5.4 yet) and export the A2F Alien, fully rigged up over the UE. It should work straight away. There are other posts on this forum and on Discord describing how to do this.

Hi Richard, thanks for the fast response! I have the UE Connector for 5.3 set up, but I’m a little confused about how to “export the A2F Alien, fully rigged up over the UE”.

I also couldn’t find any threads on the forum related to this (I may just be bad at searching, though).

I took a good look through Discord and the forum here and couldn’t find any instructions on how to do this. If you get a free moment, could you share a link to the instructions you mentioned? Sorry, I’m new to this and it’s a bit confusing.

Here is a good place to start. There are lots of docs on this, and you can search our YouTube channel.
https://docs.omniverse.nvidia.com/audio2face/latest/user-manual/livelink-ue-plugin.html

Thanks Richard, I’ll start investigating!