Animating keyframes on a Skeleton - Within USD Composer?

Hi there,
Can I animate keyframes on joints within the skeleton in USD Composer’s Curve Editor?
I can add a keyframe and it appears in the Curve Editor, but it doesn’t make the joints move; the skeleton simply ignores my keyframes.
Is there any way to animate a skeleton’s joints within Omniverse?

I have also tried this, and currently there are no easy tools for it.
Audio2Gesture is a tool you can use to animate a character’s body in Omniverse.

Other character body animations should be done in another app like Blender, Maya, 3ds Max, etc.

For example, in our current project, the characters’ custom animations come in as skeleton animations from Blender or Maya, and we then combine them with Omniverse’s dynamic AI body animation features, Audio2Face (A2F) and Audio2Gesture (A2Gesture).
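For context, the joint animation exported from those apps ends up stored as time-sampled arrays on a UsdSkel SkelAnimation prim inside the USD file. Here is a minimal sketch of what such a layer looks like, generated with plain Python (no USD libraries required); the joint path, frame numbers, and quaternion values are purely illustrative:

```python
# Sketch: what time-sampled joint rotations look like in a .usda layer.
# The joint path and key values below are illustrative, not from a real rig.

def skel_anim_usda(joint, keys):
    """Build a minimal SkelAnimation prim with quaternion rotation keys.

    `keys` maps a frame number to a (w, x, y, z) quaternion for `joint`.
    """
    samples = "\n".join(
        f"        {frame}: [({w}, {x}, {y}, {z})],"
        for frame, (w, x, y, z) in sorted(keys.items())
    )
    return f"""#usda 1.0
(
    startTimeCode = 0
    endTimeCode = {max(keys)}
)

def SkelAnimation "Anim"
{{
    uniform token[] joints = ["{joint}"]
    quatf[] rotations.timeSamples = {{
{samples}
    }}
}}
"""

# A single joint rotating 90 degrees about Y over 24 frames.
layer = skel_anim_usda("Hips/Spine", {0: (1.0, 0.0, 0.0, 0.0),
                                      24: (0.707, 0.0, 0.707, 0.0)})
print(layer)
```

This is only meant to show the data model: each keyframe becomes one entry in the `timeSamples` dictionary, which is why a DCC app that bakes skeleton keys (Blender, Maya) round-trips cleanly through USD.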


I also did something similar.
By combining the animations from Audio2Gesture (skeleton), Audio2Face (blendshapes), and head rotation (also possible with a face-tracking camera), it is possible to make animations like these:

However, all of these are combined in Blender and then exported into a single .usd file containing the model and the animations. That’s the workflow I use, and it’s awesome :3


Awesome!!! How did you make the eyes look at the camera all the time in the second animation clip?

Thank you ^^!
The movement is achieved by adding a bone to each eye in Blender; we will call those “A BONE” (if your model already has eye bones, you can use those instead).

Then, a pair of bones is added where the object to be looked at will sit; we will call those “B BONE”.

Lastly, a bone constraint called “Damped Track” is added to each “A BONE”, with the corresponding “B BONE” as its target.

The benefit is that the eyes keep looking toward the “B BONE” targets without separating from the face, even as the head turns in any direction (roll, pitch, and heading). Note, however, that the character will always be looking toward the object, so you will have to animate the positions of the two “B BONE” controls if you want that extra movement. Fortunately, Audio2Gesture’s motion is mostly suited to keeping the eyes facing forward without additional animation :)
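For intuition, Damped Track essentially computes the shortest rotation that swings the bone’s tracking axis toward the target each frame. A rough sketch of that aiming math in plain Python (this is not the Blender API; the positions and the default tracking axis are illustrative):

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def damped_track(eye_pos, target_pos, track_axis=(0.0, 1.0, 0.0)):
    """Shortest rotation (axis, angle in radians) that swings `track_axis`
    toward the target -- the idea behind a damped-track constraint."""
    to_target = normalize(tuple(t - e for t, e in zip(target_pos, eye_pos)))
    # Angle between the current tracking axis and the target direction.
    angle = math.acos(max(-1.0, min(1.0, dot(track_axis, to_target))))
    # Rotation axis perpendicular to both directions.
    axis = cross(track_axis, to_target)
    return axis, angle

# Eye at the origin tracking +Y; a target above and ahead needs a 45-degree swing.
axis, angle = damped_track((0, 0, 0), (0, 1, 1))
print(round(math.degrees(angle), 1))  # -> 45.0
```

Because the rotation is recomputed from the live bone and target positions, the eyes stay glued to the face no matter how the head animates, which is exactly the behavior described above.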

Great knowledge! So is this solution real-time in Composer in the end?

They can be seen in real time in the Composer viewport using the RTX Real-Time renderer :)
When I press Play, all the animation is already baked in (blendshapes and USD skel animation). My flow is:

  1. Mix all animations in Blender (Audio2Face, Audio2Gesture and Face Rotation)
  2. Export everything as a single .FBX file.
  3. Convert .FBX to .USD in Composer.
  4. Render in USD Composer with the Movie Renderer to a .mp4 file.

It seems there is another flow in which everything works in real time, where the animations are generated as the audio is fed in via the Audio2Face REST API. My flow is “offline”, even though I can see the results in real time.

@VanillaLake interesting approach! And it looks like the results speak for themselves. Thanks for sharing the process 👍

Yes, thank you! We are studying the AnimGraph example scenes so we can use those methods to animate this in real time in Composer.

But this is very good information!