Audio2Face animation into Unity

Hiya, I’m trying to get the Audio2Face animations into Unity, ideally as an .fbx. I have tried importing the animation into Blender and then exporting as an FBX (Unity doesn’t seem to pick up any animation data even with Bake Animation set to true, and I can’t actually find the animation data in Blender either??), and I’ve also tried the (in preview) USD package to import the USD into Unity directly…

Ideally I would like to import it as an FBX, so is there a way to bake the animation into the FBX so I can use it in Unity?

If I have to use the USD tools in Unity, how do I get the exported animation to work?

Any help/leads would be greatly appreciated.

Hello @discmage! You’ll be glad to know that we do have plans to support Unity in the very near future!

For now, let me ask the team what the best workflow would be to import your Audio2Face files into Unity.


I appreciate the reply and looking into this! :)

Hi @discmage

Right now, the simplest way to get started with Audio2Face and Blender/Unity is to use Alembic caches:

  1. Export a USD cache from Audio2Face
  2. In Blender (see the script sketch after this list)
    • Import the cache into Blender and export it as an Alembic cache
      • Select the mesh to export, or the transform hierarchy
      • Make sure the time range / fps is correct
  3. In Unity
    • Install the Alembic package (com.unity.formats.alembic)
    • Drag the .abc file into the Assets folder
      • Change the import settings as needed (i.e. scale)
    • Drag the asset into the scene
    • Set up a new Timeline for the GameObject (Window > Sequencing > Timeline)
      • Drag the GameObject onto the Timeline and add a clip with an Alembic track
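
For step 2, the Blender side can also be scripted. Here is a minimal sketch, assuming Blender 3.x with the USD importer available; the paths, fps, and frame range are placeholders to replace with your own:

```python
# Minimal Blender sketch: convert an Audio2Face USD cache to an Alembic cache.
# Run headless with: blender -b -P usd_to_abc.py
# All paths, the fps, and the frame range are placeholders, not real defaults.
import bpy

# Start from an empty scene so only the cache ends up in the export
bpy.ops.wm.read_factory_settings(use_empty=True)

# Import the USD cache exported from Audio2Face
bpy.ops.wm.usd_import(filepath="/path/to/a2f_cache.usd")

# Make sure the time range / fps matches the Audio2Face export
scene = bpy.context.scene
scene.render.fps = 30        # fps used when exporting from Audio2Face
scene.frame_start = 1
scene.frame_end = 250        # length of the cache in frames

# Export the whole hierarchy as an Alembic cache for Unity
bpy.ops.wm.alembic_export(
    filepath="/path/to/a2f_cache.abc",
    start=scene.frame_start,
    end=scene.frame_end,
    selected=False,          # set True to export only the selected mesh
)
```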

The USD package in Unity (com.unity.formats.usd v3.0) should be able to load the USD cache on a Timeline in a similar way to the Alembic caches, but when trying it on Unity 2021.3 LTS it seemed to work, yet there were threading-related errors that made the animation unplayable.


Also trying to get my custom A2F character into Unity :-D

So I have imported the character into Blender and that seems to work fine, but I can’t seem to export a rigged version of the character as Alembic?

Any thoughts on these two issues would be appreciated!

Hello @discmage,
You are right, Alembic caches are mainly for caching geometry/vertex animation.

The Omniverse Blender team has improved the support for blendshapes/SkelAnimation.
Check the latest Blender version on the Omniverse Launcher: prod-Blender 3.4 alpha USD branch.

With this release, SkelAnimation blendshapes are correctly imported as shape keys in Blender.
From there you can export your character as an .fbx (see the sketch below) and import it into Unity.
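
If you prefer to script that step, the FBX export can look roughly like this (a sketch, not the exact settings from the video; the path and options are assumptions to adjust for your scene):

```python
# Minimal Blender sketch: export the character with its shape-key animation
# baked into an FBX for Unity. The path and options here are assumptions.
import bpy

bpy.ops.export_scene.fbx(
    filepath="/path/to/character.fbx",
    bake_anim=True,                       # bake the shape-key animation
    add_leaf_bones=False,                 # Unity doesn't need Blender's leaf bones
    apply_scale_options='FBX_SCALE_ALL',  # helps avoid a 100x scale mismatch in Unity
)
```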

Here is a quick video of the process:


So I have managed to get this into Unity as an FBX with a bit of work… but I have noticed that the blendshape conversion doesn’t seem to take anything but the mouth into consideration. I seem to lose all the eyebrow and blinking features from the original mesh, so those cannot be added to the .fbx asset.

This approach also doesn’t cover how I would get the eyes, teeth, or tongue across in the conversion either.

Are these elements not supported yet, or is there something I am missing in the process?


I am also now in the process of figuring out how to get the eye, tongue, and lower-denture animations over to Blender → Unity… right now the blendshape process for the face is working.

As for your issues with eyebrows, blinking, etc., are you having the issue where the mouth shapes also look muted (less animated) compared to the original animations?

If so, I believe you have the same issue I did. It’s mostly because of scaling.

It can be solved by adjusting the BlendShape Solve settings:
    • Weight Regularization
    • Temporal Smoothing
    • Weight Sparsity
    • Symmetric Pose

Just adjust the values accordingly (I had to set mine to current value / 100, e.g. 3.0 becomes 0.03).
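
As a trivial sketch of that adjustment (the setting names are from the list above; the starting values are placeholders, not A2F defaults):

```python
# The "divide by 100" adjustment, spelled out. Starting values are placeholders.
current = {
    "Weight Regularization": 3.0,
    "Temporal Smoothing": 1.0,
    "Weight Sparsity": 0.25,
    "Symmetric Pose": 1.0,
}
adjusted = {name: value / 100 for name, value in current.items()}
print(adjusted)  # e.g. "Weight Regularization" -> 0.03
```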


Hi, is there any news about the Unity connector after 9 months? :)

Hi @zanarevshatyan @Shahrizal

There is a Unity connector in the works;
it has been in early access since early January:

Unity Connector EA

Is there a possibility for me to be part of the early testers?

An employee, @cwardlaw, said in the topic below that it is possible for me to test the Unity connector.

I’ve written to him in a DM but didn’t get a reply.

In my opinion I’m not a bad tester, and I hope I can give you good feedback about the connector.

Thank you.

@zanarevshatyan Did you fill in the form at the link I shared in my previous post?
As far as I know you can apply there and it is not restricted.


Yes, I’ve filled in the form and am waiting for their response, thank you.

Update:
I’ve been approved already :D

Maybe I’ve misunderstood something. I thought the Unity connector is a plugin that can play facial animations from Audio2Face, but what I read is about connecting to Nucleus servers, which I don’t know much about. What is this plugin for? Can I play facial animations from Audio2Face in Unity using this plugin, or is it not what I’m searching for? Thanks.

Well, the connectors always start with some basic functionality and get expanded.
I am not sure, but it seems it might not be possible to deal with the facial animation as of now.

One difficulty with Unity is its lack of support for USD; Omniverse is based on USD, which also makes the connectors more complicated to develop.


Will there be any video tutorial on how to import the eye, lower-denture, and tongue animations from A2F into Unity using Blender?

Because what you wrote is quite hard to follow regarding what to do with the eyes and other parts. Also, I’m using the JSON weights to animate the head; can that be combined with the eye animation, which, as I understand it, is animated with a position constraint?
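
For reference, driving Blender shape keys from the exported JSON weights looks roughly like this. It is only a sketch: the facsNames / weightMat field names, the paths, and the object name are assumptions about the weight export, so check your own file:

```python
# Rough sketch: drive Blender shape keys from an Audio2Face JSON weight export.
# Assumption: the JSON contains "facsNames" (pose names) and "weightMat"
# (one row of weights per frame) -- verify against your own export.
import json
import bpy

with open("/path/to/a2f_weights.json") as f:
    data = json.load(f)

names = data["facsNames"]    # blendshape/pose names
weights = data["weightMat"]  # weights[frame][pose]

obj = bpy.data.objects["head"]  # placeholder object name
key_blocks = obj.data.shape_keys.key_blocks

for frame, row in enumerate(weights, start=1):
    for name, value in zip(names, row):
        kb = key_blocks.get(name)
        if kb is None:
            continue  # skip poses with no matching shape key
        kb.value = value
        kb.keyframe_insert("value", frame=frame)
```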
