There’s an earlier post, “Blendshapes for facial hair”, for an earlier version of Audio2Face, but the fixes described there are no longer valid for 2022.2.1. I’ve got a character with facial hair that I’ve attached via the ‘PROX UI’ under Character Transfer (what used to be WRAP AI). The sync animation works fine with the skin, and I’m able to get the facial hair to morph with PROX UI, but how can it be exported along with the mesh?
Have you tried exporting cache for the selected objects?
I have not. Can this cache be manipulated in a 3D modeling program (e.g., Blender)? If so, can you explain how to export it from the app and import it into Blender?
No, the geometry cache is not editable in other DCCs.
Here are some good A2F/Blender tutorials:
Audio2Face with Blender | Part 1: Generating Facial Shape Keys - YouTube
Audio2Face with Blender | Part 2: Loading AI-Generated Lip Sync Clips - YouTube
You should be able to create your own mesh with blendShapes, then solve the blendShape weights for them. For this you will need a good understanding of the standard workflow.
I know how to do these things. The end goal, again, is to export the synced/morphed facial hair driven by the ‘PROX UI’ functionality along with the facial animations into Blender. Is this possible?
Just tested this on a custom beard mesh and it worked as expected.
These are the steps I took:
- Create a beard mesh with some blendShape targets on it, e.g., up, down, left, right, front, back (it’s better if your beard shapes match your face blendShape targets, e.g., jaw-open, jaw-left, etc.).
- Export this beard mesh with blendShapes as a skelMesh. For this you will have to add a joint and skin the beard (in Pixar USD, blendShapes are part of a skelMesh). In Maya I had to use the Legacy Connector, export the skel animation, and then delete the timeSamples from the .usda file.
- Drag this beard file into your A2F scene. Create a duplicate of beard mesh using
Toolbox Menu -> Builtin -> Mesh -> Stamp Mesh (Default)
- Drive this duplicate using Prox UI.
- Set up a BlendShape Solve from this duplicate beard (which should now move with the face) to the beard you dragged into the scene earlier. This will generate blendShape weights on the dragged-in beard, which you can export as .json or .usd(a). The JSON file can be imported into other DCCs like Blender or Maya and applied to the blendShape node.
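For reference, here is a minimal sketch of what a beard exported as a skelMesh with blendShapes looks like in .usda. This is a hand-written illustration, not an actual Maya/Legacy Connector export: the prim names (`Beard`, `BeardMesh`, `jawOpen`), the triangle geometry, and the offset values are all made up, and real exports carry more data (skinning weights, normals, etc.). It only shows the structural point from the step above: blendShapes live on the mesh inside a SkelRoot, bound via the SkelBindingAPI.

```usda
#usda 1.0
(
    defaultPrim = "Beard"
)

def SkelRoot "Beard"
{
    def Skeleton "Skel"
    {
        uniform token[] joints = ["root"]
        uniform matrix4d[] bindTransforms = [( (1,0,0,0), (0,1,0,0), (0,0,1,0), (0,0,0,1) )]
        uniform matrix4d[] restTransforms = [( (1,0,0,0), (0,1,0,0), (0,0,1,0), (0,0,0,1) )]
    }

    def Mesh "BeardMesh" (
        prepend apiSchemas = ["SkelBindingAPI"]
    )
    {
        int[] faceVertexCounts = [3]
        int[] faceVertexIndices = [0, 1, 2]
        point3f[] points = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
        rel skel:skeleton = </Beard/Skel>
        uniform token[] skel:blendShapes = ["jawOpen"]
        rel skel:blendShapeTargets = [</Beard/BeardMesh/jawOpen>]

        def BlendShape "jawOpen"
        {
            # One offset per point; the shape pulls the whole triangle down.
            uniform vector3f[] offsets = [(0, -0.2, 0), (0, -0.2, 0), (0, -0.2, 0)]
        }
    }
}
```

Note there are no timeSamples here, matching the step above about deleting them from the exported file: A2F drives the blendShape weights itself, so the file should carry only the rest geometry and targets.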
Thanks for the help.
I have found that you can also export 46 blendShapes for the face and 46 blendShapes for dynamic hair (eyebrows or eyelashes). This was the result I got by importing the JSON file onto the two meshes.
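To bring the exported weights into Blender, a small script can parse the weight JSON and map each frame to per-shape-key values. This is a minimal sketch, assuming the export layout I have seen (`facsNames` naming the poses and `weightMat` holding one row of weights per frame); check the keys in your own export, since the exact schema may differ between A2F versions.

```python
import json

def load_weight_clip(path):
    """Parse an Audio2Face blendShape weight export into per-frame dicts.

    Assumed JSON layout (verify against your own export):
    {
      "numPoses": 46,
      "numFrames": 120,
      "facsNames": ["jawOpen", "jawLeft", ...],  # one name per pose
      "weightMat": [[...], ...]                  # numFrames rows, numPoses weights each
    }
    """
    with open(path) as f:
        clip = json.load(f)
    names = clip["facsNames"]
    # One {shapeKeyName: weight} dict per animation frame.
    return [dict(zip(names, row)) for row in clip["weightMat"]]
```

In Blender you would then iterate over the frames and, for each pose name that matches a shape key on the mesh, set `mesh.data.shape_keys.key_blocks[name].value` and call `keyframe_insert("value", frame=i)`; since the face and hair meshes each carry their own 46 shape keys, you run the same loop once per mesh.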