Hello everyone,
I’m working on a project where I’m using Character Creator 4 (CC4) to design characters, and I’d like to achieve realistic lip-sync animations in Unreal Engine 5. I’ve noticed that many people have successfully integrated Audio2Face with MetaHuman characters to create dynamic facial animations, especially for lip-syncing.
My question is whether something similar is possible with characters created in CC4: can CC4 characters be integrated with Audio2Face and then used in Unreal Engine 5 for lip-sync animation? If anyone has experience with this or knows how to set up such a workflow, I’d greatly appreciate your insights and advice.
Thank you in advance for your help — I’m looking forward to any suggestions or guidance you can provide.