Decoupling the eye data in A2F Livelink with Unreal Engine

Hey guys, I was wondering if there is a way to decouple the eyes in A2F so that no eye data is sent over to Unreal Engine to a MetaHuman. That way it would be possible to animate the eyes in real time with a BP, since the eye movement in A2F is very subpar.

Currently I have a BP set up for just the eyes, however only one source works at a time: it either streams data from A2F or from the eyes blueprint, not both at the same time.

Hey dan214, can you share details about your use case, and perhaps a demo/screenshot of what your current eyes BP does and how you have set it up?

Hey dmohsin, sure thing

Here are the screenshots,

I also deleted all references to the eyes made in A2F so that no eye data was sent through.
Please let me know if you need anything else.

The YouTube tutorial I followed, which shows the eye movement and more details, can be found here. I followed it without the other facial animation features like the smile →

With your use case you might want to explore the following techniques to see if they are useful for your workflow:

  1. A Control Rig in the animation blueprint seems to override any facial animation applied before it. You might explore whether it is possible to filter/mask some of the controls in the Control Rig so that animation from a prior source is not affected.
  2. You will have to manipulate the bones directly. In this case you can create a Blend Mask on the skeletal mesh and use Layered Blend per Bone to separate the two animation sources.
  3. Use Pose Assets and a Pose Driver to trigger and drive the eye animation.

Do let us know if you make any headway with these techniques. You might get more help in the Unreal Engine forums for this kind of issue.