Blendshape metahuman wrong representation

Hello,
I’m new to Audio2Face and could use some assistance. I’ve been trying to use the StreamLivelink connection to drive my MetaHuman character from Audio2Face. However, I’ve noticed that the generated emotions don’t appear as expected on my character’s face, so I’m exploring how to improve the results in Audio2Face.
Here’s what I’ve done so far: I performed a character transfer and applied some random skin mesh fitting to create a unique blendshape set, just to experiment. I exported these blendshapes, added the usdSkel to my scene, and then used the A2F Data Conversion tool to create my BlendshapeSolve.
The issue I’m facing is that when I play the audio in Audio2Face, the face mesh looks broken (which is the effect I intended). But when I connect it to my MetaHuman character, the facial movements don’t match what I see in Audio2Face; the mouth movement looks normal instead.
Could someone please guide me on how to get the same facial expressions on my MetaHuman character that I see in Audio2Face?

(Screenshots attached: Capture, Capture2)

I will check with our blendshapes expert and we will get back to you.

Hi @toufic.kashmar ,

Just making sure I understand this correctly. You mentioned that you

applied some random skin mesh fitting to create a unique blendshape,

which produces broken lips and broken-looking blendshapes that you then use for the blendshape solve.
And you are expecting your MH character to show the same broken lips, but it looks “ok” instead. Correct?

When communicating with UE, A2F via LiveLink only streams the ARKit shape coefficients (52 float values), not the vertex deformation.
These values are then mapped to the MH controllers.
So, in your example, if the solve produces jawOpen 0.5 and that looks broken in your solve, it is still a valid 0.5 jawOpen value driving the MH controller.
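To make the distinction concrete, here is a minimal Python sketch of what the stream conceptually carries: named ARKit coefficients, not geometry. The shape names below are real ARKit blendshape names, but the packaging code is purely illustrative and is not the actual LiveLink API.

```python
# Hypothetical sketch: one LiveLink frame is just named ARKit
# coefficients (floats in 0..1), never vertex positions.
# Only 4 of the 52 ARKit shapes are listed here for brevity.
ARKIT_SHAPES = [
    "browInnerUp", "jawOpen", "mouthSmileLeft", "mouthSmileRight",
]

def make_frame(values):
    """Clamp and package one frame of coefficients for streaming."""
    frame = {}
    for name in ARKIT_SHAPES:
        v = float(values.get(name, 0.0))
        frame[name] = min(max(v, 0.0), 1.0)  # valid coefficient range
    return frame

# A broken-looking solve and a clean solve that both output jawOpen=0.5
# stream the *same* number; the target rig (the MetaHuman controls)
# decides what that number looks like on the face.
frame = make_frame({"jawOpen": 0.5})
print(frame["jawOpen"])  # 0.5
```

This is why a mesh that deforms badly inside A2F can still look fine on a MetaHuman: the broken vertex deformation never leaves A2F, only the coefficients do.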

In A2F there are several ways to fine-tune the quality of motion within the scope of those 52 float values. You can adjust the A2F parameters, the blendshape solve parameters, and the Float Array Tuner parameters exposed in A2F to shape the 52 values used to drive your character.
In UE, you can also adjust the blueprint mapping that decides which controllers those 52 shapes map to.
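As a rough illustration of that kind of per-shape tuning, here is a hedged Python sketch of a gain/offset stage applied to the coefficients before they reach the character. The function and parameter names are invented for illustration and are not the actual A2F attribute names.

```python
# Illustrative per-shape tuning stage, conceptually similar to what a
# float-array tuner does: scale and offset each coefficient, then clamp
# it back into the valid 0..1 range.
def tune(coeffs, gains=None, offsets=None):
    """Apply per-shape gain and offset, clamping results to [0, 1]."""
    gains = gains or {}
    offsets = offsets or {}
    out = {}
    for name, v in coeffs.items():
        v = v * gains.get(name, 1.0) + offsets.get(name, 0.0)
        out[name] = min(max(v, 0.0), 1.0)
    return out

raw = {"jawOpen": 0.5, "mouthSmileLeft": 0.2}
# Exaggerate jaw motion, bias the smile slightly open.
tuned = tune(raw, gains={"jawOpen": 1.4}, offsets={"mouthSmileLeft": 0.1})
```

Tuning like this only reshapes the 52 numbers; it cannot add deformations that the target rig's controls don't already express.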

But for your question about adding custom deformation on top of the MH character, I’m not aware of any technique to do that; you may have to reach out to the UE team to see whether they support it.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.