Integrating Character Creator 4 Characters with Audio2Face for Lip Sync in Unreal Engine 5

Hello everyone,

I’m currently working on a project where I’m using Character Creator 4 (CC4) to design characters, and I’d like to achieve realistic lip-sync animations within Unreal Engine 5. I’ve noticed that many people have successfully integrated Audio2Face with MetaHuman characters to create dynamic facial animations, especially for lip-syncing purposes.

However, my question is whether it’s possible to do something similar with characters created in CC4. Can CC4 characters be integrated with Audio2Face, and then used in Unreal Engine 5 to achieve lip-sync animations? If anyone has experience with this or knows how to set up such a workflow, I’d greatly appreciate your insights and advice.

Thank you in advance for your help; I’m looking forward to any suggestions or guidance you can provide.

Yes, Audio2Face and Reallusion are quite compatible. Please take a look at this beginner tutorial: Getting Started with iClone NVIDIA Audio2Face Plugin | iClone Tutorial (youtube.com)

I’m trying to do the same thing as the OP. The tutorial posted above isn’t really relevant since it only addresses A2F integration with iClone, not Unreal Engine 5.

I have Live Link working between A2F and MetaHuman in UE5. Now I want to replace the MetaHuman with a CC4 character, and I’m not sure what needs to be done.
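
My working assumption is that the main difference from the MetaHuman setup is curve naming: Audio2Face streams a set of blendshape weights over Live Link, while a CC4 export exposes its own morph target names, so the weights presumably need to be remapped somewhere (for example in the character’s Anim Blueprint). Below is a minimal Python sketch of that remapping idea only. The CC4 morph names in it are placeholders I made up for illustration, not names taken from an actual CC4 export, and it doesn’t use any real Unreal or Audio2Face API.

```python
# Illustrative sketch only: remap ARKit-style blendshape weights (the kind of
# curves an Audio2Face blendshape solve can produce) onto CC4-style morph
# target names before driving the character. The CC4 names below are
# assumptions; check the morph target list on your own CC4 export in UE5
# and adjust the mapping accordingly.

from typing import Dict

# Hypothetical mapping: ARKit curve name -> CC4 morph target name (placeholder names)
ARKIT_TO_CC4: Dict[str, str] = {
    "jawOpen": "Jaw_Open",
    "mouthSmileLeft": "Mouth_Smile_L",
    "mouthSmileRight": "Mouth_Smile_R",
    "mouthFunnel": "Mouth_Funnel",
    # ...extend for the remaining curves your character actually supports
}

def remap_weights(arkit_weights: Dict[str, float]) -> Dict[str, float]:
    """Translate one frame of ARKit-style blendshape weights to CC4 morph names."""
    remapped: Dict[str, float] = {}
    for arkit_name, weight in arkit_weights.items():
        cc4_name = ARKIT_TO_CC4.get(arkit_name)
        if cc4_name is not None:
            # Clamp to the 0..1 range morph targets expect
            remapped[cc4_name] = max(0.0, min(1.0, weight))
    return remapped

if __name__ == "__main__":
    frame = {"jawOpen": 0.62, "mouthSmileLeft": 0.15, "mouthSmileRight": 0.14}
    print(remap_weights(frame))
```

In practice I’d expect this mapping to live in an Anim Blueprint or a Live Link remap asset rather than standalone Python, but the idea is the same. Is that roughly the right mental model, or is there more to it?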

Hi Lgraync. Which version of UE5 are you using, and which Facial Profile from within CC4 are you using?