Hiya, I’m trying to get the Audio2Face animations into a Unity .fbx, ideally. I have tried importing the animation into Blender and then exporting as an FBX (Unity doesn’t seem to pick up any animation data even with “Bake Animation” set to true, and I can’t actually find the animation data in Blender either??), and I’ve also tried the (in preview) USD package to import the USD into Unity directly…
Ideally I would like to import it as an FBX, so is there a way to bake the animation into the FBX so I can use it in Unity?
If I have to use the USD tools in Unity, how do I get the exported animation to work in Unity?
Right now, the simplest way to get started with Audio2Face and Blender/Unity is to use Alembic caches:
export a USD cache from Audio2Face
import it in Blender and export it as an Alembic cache
select the mesh to export or the transform hierarchy
make sure the time range / fps is correct
install the Alembic package (com.unity.formats.alembic)
drag the .abc file into the Assets folder
change the import settings as needed (e.g. scale)
drag the asset into the scene
set up a new Timeline for the GameObject (Window > Sequencing > Timeline)
drag the GameObject onto the timeline and add a clip with an Alembic track
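The Blender part of the steps above can also be scripted and run headless, which is handy if you need to convert many caches. This is just a minimal sketch of that idea: the file paths and fps value are placeholders, and it assumes a recent Blender (3.x) whose bundled Python exposes `bpy.ops.wm.usd_import` and `bpy.ops.wm.alembic_export`:

```python
# Sketch: USD cache -> Alembic cache, meant to be run inside Blender, e.g.
#   blender --background --python usd_to_abc.py
# USD_PATH / ABC_PATH / FPS are hypothetical placeholders; adjust to your export.
try:
    import bpy  # only available inside Blender's bundled Python
except ImportError:
    bpy = None

USD_PATH = "a2f_cache.usd"  # cache exported from Audio2Face (placeholder)
ABC_PATH = "a2f_cache.abc"  # Alembic cache to import into Unity (placeholder)
FPS = 30                    # must match the fps used in Audio2Face

def usd_to_abc(usd_path: str, abc_path: str, fps: int) -> None:
    if bpy is None:
        raise RuntimeError("run this script inside Blender, not plain Python")
    scene = bpy.context.scene
    scene.render.fps = fps                    # keep the time range / fps correct
    bpy.ops.wm.usd_import(filepath=usd_path)  # import the A2F USD cache
    # Export the scene; Blender bakes the animated meshes into the Alembic
    # file over the scene's frame range.
    bpy.ops.wm.alembic_export(
        filepath=abc_path,
        start=scene.frame_start,
        end=scene.frame_end,
    )

if __name__ == "__main__" and bpy is not None:
    usd_to_abc(USD_PATH, ABC_PATH, FPS)
```

From there it's the same Unity side as above: install com.unity.formats.alembic and drop the generated .abc into the Assets folder.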
The USD package in Unity (com.unity.formats.usd v3.0) should be able to load the USD cache on a timeline in a similar way to the Alembic caches, but when trying it on Unity 2021.3 LTS it seemed to work at first, yet there were threading-related errors that made the animation unplayable.
So I have managed to get this into Unity as an FBX with a bit of work, but have noticed that the blendshape conversion doesn’t seem to take anything but the mouth into account. I seem to lose all the eyebrow and blinking features from the original mesh, so those cannot be added to the .fbx asset.
This approach also doesn’t cover how I would get the eyes, teeth, or tongue into the conversion either.
Are these elements not supported yet, or is there something I am missing in the process?
Maybe I’ve misunderstood something. I thought the Unity connector was a plugin that can play facial animations from Audio2Face, but what I read is about connecting to Nucleus servers, and I don’t know much about those. What is this plugin for? Can I play facial animations from Audio2Face in Unity using this plugin, or is it not what I’m looking for? Thanks.
Will there be any video tutorial on how to import eye, lower denture, and tongue animation from A2F into Unity using Blender?
What you wrote is quite hard to follow regarding the eyes and other parts. Also, I’m using JSON blendshape weights to animate the head; can that be combined with the eye animation, which as I understand it is driven by position constraints?