Questions regarding exporting Unity animation to NVIDIA Omniverse

Hi. I do CG cartoons as a hobbyist in Unity, in HDRP. (Okay, I get more distracted learning all the technology than finishing projects!) I have not used NVIDIA Omniverse yet so newbie questions are coming! I was watching What You Need to Know About the New Unity Connector for Omniverse - YouTube and I had some questions about what I could expect to move across into Creator.

Note: I am guessing that most of these DO NOT come across, so you can consider this a feature request list if the idea is for me to use NVIDIA Omniverse to render the animated cartoons instead of Unity. From the video, my understanding is that the support is more for moving models across, not the rest of the animation infrastructure. So it may be "no" for all of the following.

  • It was mentioned shader graphs do not come across. Custom shaders are a common way foliage (trees, bushes, etc) sway in the wind (using vertex animation in the shader). Is there a solution for that?
  • Can shaders be mapped based on names? E.g. I have shader graphs for skin, for eyes, for hair etc. If they don't come across, can I instead define a mapping rule?
  • Is texture tiling supported? I use tiles for the face texture for blush etc. at the moment. Do I need to redo all of that as well?
  • Would cloth simulation come across? I am using the MagicaCloth 2 extension in Unity.
  • I was trying to understand if hair bones would come across (I am using fairly low quality models).
  • Are Timelines brought across? (Do I have to redo all the animation sequencing again?)
  • Unity has a Sequences package designed to support a hierarchy of Timeline objects for animation sequencing. It adds complexity as it enables/disables objects based on the currently selected sequence. If an object is disabled, will it be brought across? E.g. a sequence may add extra objects to the scene just for that shot (sequence).
  • Are Cinemachine cameras supported? You can define camera settings then blend between them, or have a camera track a target (follow and/or look at).
  • Are animation clip (*.anim) files brought across? Or do they need to go into FBX files?
  • Are camera settings brought across? Or do they all need redoing?
  • Is there camera lens flare support?
  • Is fog brought across?
  • Are HDRP volumetric clouds supported in some way? (HDRI Sky I could use instead.)
  • For things not brought across, is it possible for me to write a script to add these upon import to avoid redoing it all by hand? E.g. if I carefully name assets, can scripts notice "oh, it has 'tree' in the name, so add wind animation support after import"? Hair has standard bone names - can I write a script so whatever Creator needs gets applied on each sync from Unity? Or do I have to reapply these each time a change is made? Or are they merged into the USD file in separate layers etc.?

I suspect I should just stick with Unity and HDRP for rendering - but it gives me a lot of grief. My feeling is it's probably a big job to bring everything across. Thanks for any feedback! I am happy to expand upon any of the above if helpful.

Just to follow up with a bit more feedback based on actual experience using it: I wrote up a blog post with some screenshots etc. Nice overall, but there were a few problems with my default scene (such as no terrain support).


Hi @alan.james.kent ,

Thank you so much for joining the forums and asking comprehensive questions in addition to your blog!

I will do my best to respond to your questions and bring some insights for you and the community.

  • It was mentioned shader graphs do not come across. Custom shaders are a common way foliage (trees, bushes, etc) sway in the wind (using vertex animation in the shader). Is there a solution for that?

There is not a solution for vertex animation currently. This is something we'd like to do, but it is not our current focus. That said, the more feedback and requests we get from our Enterprise customers and the community, the more it will affect our priorities.

  • Can shaders be mapped based on names? E.g. I have shader graphs for skin, for eyes, for hair etc. If they don't come across, can I instead define a mapping rule?

We are developing a material mapping tool that users will be able to configure to handle custom shader graph mappings when converting between Unity and USD. Keep in mind that in some cases data or quality may be lost in the mapping process. Stay tuned for more updates.
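In the meantime, a name-based remap can also be scripted directly against the exported USD with the USD Python API. The sketch below is only an illustration of the idea; the rule table, material paths, and file name are hypothetical stand-ins and would need to match your actual scene.

```python
from pxr import Usd, UsdShade

# Hypothetical rules: substring of the bound material's name -> replacement material path
RULES = {
    "skin": "/World/Looks/SkinMat",
    "eye":  "/World/Looks/EyeMat",
    "hair": "/World/Looks/HairMat",
}

stage = Usd.Stage.Open("character.usd")  # hypothetical exported file
for prim in stage.Traverse():
    # Note: ComputeBoundMaterial also reports inherited bindings, so child prims
    # under a bound mesh will be visited too; good enough for a sketch.
    bound, _ = UsdShade.MaterialBindingAPI(prim).ComputeBoundMaterial()
    if not bound:
        continue
    name = bound.GetPrim().GetName().lower()
    for key, path in RULES.items():
        if key in name:
            replacement = UsdShade.Material.Get(stage, path)
            if replacement:
                # Re-bind this prim to the replacement material
                UsdShade.MaterialBindingAPI.Apply(prim).Bind(replacement)
            break
stage.GetRootLayer().Save()
```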

  • Is texture tiling supported? I use tiles for the face texture for blush etc. at the moment. Do I need to redo all of that as well?

UV texture tiling should work as expected. If you are having a texture tiling issue, please create a separate forum post specific to that topic or submit a bug report with as many details as you can (pictures, video, sample files, etc.) and we'll take a look.
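For reference, in a plain USD preview-shader network, tiling and offset are expressed with a UsdTransform2d node between the UV reader and the texture. Here is a minimal hand-written sketch; the material path, texture path, and scale/offset values are placeholders.

```python
from pxr import Usd, UsdShade, Sdf, Gf

stage = Usd.Stage.CreateNew("tiling_sketch.usda")
mat = UsdShade.Material.Define(stage, "/World/Looks/FaceMat")

# Reads the mesh's "st" UV primvar
reader = UsdShade.Shader.Define(stage, "/World/Looks/FaceMat/stReader")
reader.CreateIdAttr("UsdPrimvarReader_float2")
reader.CreateInput("varname", Sdf.ValueTypeNames.Token).Set("st")
reader_out = reader.CreateOutput("result", Sdf.ValueTypeNames.Float2)

# UsdTransform2d: "scale" plays the role of Unity's material tiling, "translation" the offset
uv = UsdShade.Shader.Define(stage, "/World/Looks/FaceMat/uvTransform")
uv.CreateIdAttr("UsdTransform2d")
uv.CreateInput("in", Sdf.ValueTypeNames.Float2).ConnectToSource(reader_out)
uv.CreateInput("scale", Sdf.ValueTypeNames.Float2).Set(Gf.Vec2f(4.0, 4.0))
uv.CreateInput("translation", Sdf.ValueTypeNames.Float2).Set(Gf.Vec2f(0.25, 0.0))
uv_out = uv.CreateOutput("result", Sdf.ValueTypeNames.Float2)

# Texture sampler reading the transformed UVs (a surface shader would consume its rgb output)
tex = UsdShade.Shader.Define(stage, "/World/Looks/FaceMat/faceTexture")
tex.CreateIdAttr("UsdUVTexture")
tex.CreateInput("file", Sdf.ValueTypeNames.Asset).Set("textures/face_blush.png")
tex.CreateInput("st", Sdf.ValueTypeNames.Float2).ConnectToSource(uv_out)

stage.GetRootLayer().Save()
```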

  • Would cloth simulation come across? I am using the MagicaCloth 2 extension in Unity.

Currently not supported. This is something we'd like to do, but it is not our current focus. That said, the more feedback and requests we get from our Enterprise customers and the community, the more it will affect our priorities.

  • I was trying to understand if hair bones would come across (I am using fairly low quality models).

We are working on adding support for skinned mesh vertex weights and bones; it's possible this will cover your needs. If you have an example file you can share for us to test with your setup, that would be great for validating your request against our development.
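Once skeleton data lands in a USD file, it lives in UsdSkel prims, so a quick script can tell you whether hair joints survived an export. This is a rough check with a hypothetical file path, not part of the connector itself.

```python
from pxr import Usd, UsdSkel

stage = Usd.Stage.Open("character.usd")  # hypothetical exported file
for prim in stage.Traverse():
    if prim.IsA(UsdSkel.Skeleton):
        skel = UsdSkel.Skeleton(prim)
        joints = skel.GetJointsAttr().Get() or []
        print(prim.GetPath(), "joint count:", len(joints))
        # Hair bones usually show up as joint paths containing "hair"
        print("  hair joints:", [j for j in joints if "hair" in j.lower()])
    elif prim.HasAPI(UsdSkel.BindingAPI):
        binding = UsdSkel.BindingAPI(prim)
        indices = binding.GetJointIndicesAttr().Get()
        weights = binding.GetJointWeightsAttr().Get()
        print(prim.GetPath(), "has skin weights:", indices is not None and weights is not None)
```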

  • Are Timelines brought across? (Do I have to redo all the animation sequencing again?)

Animation and timeline sequence data is not currently supported. It is something we are interested in doing, hopefully leveraging the work started in the Unity USD SDK since they have an implementation, but we cannot say whether it meets your needs.

  • Unity has a Sequences package designed to support a hierarchy of Timeline objects for animation sequencing. It adds complexity as it enables/disables objects based on the currently selected sequence. If an object is disabled, will it be brought across? E.g. a sequence may add extra objects to the scene just for that shot (sequence).

Animation and timeline sequence data is not currently supported. Once we begin investigating this feature set and its implementation, we will be in a better position to answer.

  • Are Cinemachine cameras supported? You can define camera settings then blend between them, or have a camera track a target (follow and/or look at).

Cinemachine cameras are not supported at this time, and they are not something we are looking at this year. However, we are open to hearing more from our community and customers about how important such a feature is, and that could influence our development plans.

  • Are animation clip (*.anim) files brought across? Or do they need to go into FBX files?

Animation clip data is not currently supported. However, animation clip and UsdSkel support are something we are interested in and plan to develop later this year, along with blend shape support for facial animation and tools to streamline workflows with Omniverse Audio2Face and remove our dependency on FBX. Currently, FBX is the only animation IO we support between Unity and Omniverse.
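For context, once clip data lives in USD it ends up as time-sampled attributes, e.g. a SkelAnimation prim carrying blend shape weights over time. The sketch below only illustrates that data model; the clip name, shape names, and key values are invented, and this is not the connector's eventual export path.

```python
from pxr import Usd, UsdSkel

stage = Usd.Stage.CreateNew("clip_sketch.usda")
stage.SetTimeCodesPerSecond(30)

anim = UsdSkel.Animation.Define(stage, "/World/Anim/WaveClip")  # hypothetical clip name
anim.CreateBlendShapesAttr(["blink", "smile"])                  # hypothetical blend shape names
weights_attr = anim.CreateBlendShapeWeightsAttr()

# Key the two shape weights at a few frames (time samples replace Unity's .anim curves)
for frame, weights in [(0, [0.0, 0.0]), (15, [1.0, 0.2]), (30, [0.0, 0.0])]:
    weights_attr.Set(weights, Usd.TimeCode(frame))

stage.GetRootLayer().Save()
```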

  • Are camera settings brought across? Or do they all need redoing?

Most camera settings are supported on export to USD. We also plan to add support for importing USD cameras. Our next release, coming out in May, will include some fixes and improvements to camera settings on export. Please test, and if camera settings are not accurate, please file a bug and we'll investigate.
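If you want to spot-check what actually landed in the exported file, the UsdGeom.Camera schema exposes the main settings directly. The file and prim paths below are placeholders.

```python
from pxr import Usd, UsdGeom

stage = Usd.Stage.Open("shot.usd")                      # hypothetical exported file
cam = UsdGeom.Camera.Get(stage, "/World/MainCamera")    # hypothetical camera path

# Print the key camera attributes as authored in USD
print("focalLength:", cam.GetFocalLengthAttr().Get())
print("horizontal/vertical aperture:",
      cam.GetHorizontalApertureAttr().Get(),
      cam.GetVerticalApertureAttr().Get())
print("clippingRange:", cam.GetClippingRangeAttr().Get())
print("fStop:", cam.GetFStopAttr().Get())
```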

  • Is there camera lens flare support?

Camera lens flares and general VFX support are not something we are looking at this year. However, we are open to hearing more from our community and customers about how important such a feature is, and that could influence our development plans.

  • Is fog brought across?

Fog and general VFX support are not something we are looking at this year. However, we are open to hearing more from our community and customers about how important such a feature is, and that could influence our development plans.

  • Are HDRP volumetric clouds supported in some way? (HDRI Sky I could use instead.)

We are looking at support for exporting an HDRI skybox to USD this year, but not volumetric clouds and other VFX.
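An HDRI sky in USD is typically just a dome light with a texture file, so it is also easy to author by hand today. A minimal sketch, with a made-up texture path:

```python
from pxr import Usd, UsdLux

stage = Usd.Stage.CreateNew("sky_sketch.usda")

# Dome light textured with an equirectangular HDRI acts as the sky/environment light
dome = UsdLux.DomeLight.Define(stage, "/World/Environment/Sky")
dome.CreateTextureFileAttr("textures/sky_4k.hdr")  # hypothetical HDRI path
dome.CreateIntensityAttr(1.0)

stage.GetRootLayer().Save()
```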

  • For things not brought across, is it possible for me to write a script to add these upon import to avoid redoing it all by hand? E.g. if I carefully name assets, can scripts notice "oh, it has 'tree' in the name, so add wind animation support after import"? Hair has standard bone names - can I write a script so whatever Creator needs gets applied on each sync from Unity? Or do I have to reapply these each time a change is made? Or are they merged into the USD file in separate layers etc.?

It should be possible to write a script, and leveraging USD layers rather than keeping all the data on a flat stage would be the ideal USD workflow. This is a bit of a complicated question, but in general a developer should be able to script/code anything in Unity or Omniverse to automate workflows and data. :-D
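To make that concrete, the usual pattern is to keep hand-authored tweaks in a separate override layer so they survive each re-export, and to author name-based rules into that layer. The sketch below assumes hypothetical file names and a made-up marker attribute; it is one possible workflow, not something the connector does for you.

```python
from pxr import Usd, Sdf

EXPORTED = "scene_from_unity.usd"       # hypothetical file written by the connector
OVERRIDES = "omniverse_overrides.usda"  # hypothetical layer owned by this script
SHOT = "shot.usda"                      # hypothetical stage actually opened in Omniverse

# Build a small "shot" layer stack: overrides are stronger than the exported scene
overrides = Sdf.Layer.FindOrOpen(OVERRIDES) or Sdf.Layer.CreateNew(OVERRIDES)
shot = Sdf.Layer.FindOrOpen(SHOT) or Sdf.Layer.CreateNew(SHOT)
shot.subLayerPaths = [OVERRIDES, EXPORTED]

stage = Usd.Stage.Open(shot)
stage.SetEditTarget(Usd.EditTarget(overrides))  # author edits only into the overrides layer

for prim in stage.Traverse():
    if "tree" in prim.GetName().lower():
        # Hypothetical marker attribute a later wind-setup script could look for
        prim.CreateAttribute("userProperties:needsWind", Sdf.ValueTypeNames.Bool).Set(True)

stage.Save()
```

Because the overrides live in their own layer, re-exporting from Unity only replaces the exported sublayer and the script's edits stay put.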


Thanks for the replies. It all makes sense, thanks for the updates.

Re texture tiling, I have not noticed a problem so far, but I have not tried the more advanced use cases yet - still learning. Working out how to adjust the offsets is on my long list of todo items!

Not surprised about cloth - was just checking. It feels like ideally I want layers that are applicable to one platform or the other, so I can keep Unity and Omniverse cloth information in the one USD file (but ignored by platforms other than the intended recipient).
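Roughly what I am imagining is USD layer muting: keep the platform-specific cloth setup in separate sublayers of the one stage and mute whichever layer the current application should ignore. A rough sketch, with made-up layer names:

```python
from pxr import Usd, Sdf

stage = Usd.Stage.Open("character.usd")   # hypothetical shared stage
root = stage.GetRootLayer()

# One sublayer per platform's cloth data
for sublayer in ["cloth_unity.usda", "cloth_omniverse.usda"]:
    if sublayer not in root.subLayerPaths:
        root.subLayerPaths.append(sublayer)

# On the Omniverse side, ignore the Unity-specific layer (mirror this in Unity)
unity_layer = Sdf.Layer.FindOrOpen("cloth_unity.usda")
if unity_layer:
    stage.MuteLayer(unity_layer.identifier)

print("muted layers:", stage.GetMutedLayers())
```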

Here is a sample model I am using. It has problems when working with audio2face as the meshes are not grouped as needed, so another job on my list is to try and reorganize the mesh splits. (I tried the supplied tool to split meshes, but it crashed.) This is a native character from VRoid Studio (free software from Pixiv for creating 3D anime style avatars). Sendagaya_Shibu.glb - Google Drive

I am going for free tools in general - I wanted to share projects that say high school students could do, so "free" is the right price point! ;-)

Re animation - okay, thanks. I think that is an area I am going to have to look into more deeply. I use lots of animation clips with blends between different clips. I don't think the Sequencer in Omniverse supports blends between clips, however (another item on my todo list to check off). Unity Sequences would be very different in Omniverse (separate USD files referencing layer stacks etc., one per shot; bi-directional sync would be hard for that, I think). But without blends, that is a big gap for me personally. (I value blends more than syncing with Unity.)

Unity Cinemachine does a fair bit, again more than I personally need - but it is very useful (look at target, track with target, zoom in/out over time, pan). There are some camera movement parts in Omniverse, but it seems patchier. That is one area I am looking into at the moment. You can move a camera, and there are libraries to "look at" targets, so it's all possible - it just needs a bit of wrapping code to make it more friendly.
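The sort of wrapping code I mean is fairly small. Here is a sketch that aims an existing USD camera at a target prim (prim paths and the file name are placeholders, and it assumes a Y-up scene):

```python
from pxr import Usd, UsdGeom, Gf

stage = Usd.Stage.Open("shot.usd")
cam_prim = stage.GetPrimAtPath("/World/MainCamera")           # hypothetical camera
target_prim = stage.GetPrimAtPath("/World/Characters/Shibu")  # hypothetical target

time = Usd.TimeCode.Default()
eye = UsdGeom.Xformable(cam_prim).ComputeLocalToWorldTransform(time).ExtractTranslation()
aim = UsdGeom.Xformable(target_prim).ComputeLocalToWorldTransform(time).ExtractTranslation()

# Gf's SetLookAt builds a view matrix; the camera's own transform is its inverse
view = Gf.Matrix4d().SetLookAt(eye, aim, Gf.Vec3d(0, 1, 0))

cam_xform = UsdGeom.Xformable(cam_prim)
cam_xform.ClearXformOpOrder()                     # replace existing ops with one matrix op
cam_xform.AddTransformOp().Set(view.GetInverse())

stage.GetRootLayer().Save()
```

Doing the same thing per frame (with time samples instead of the default time code) would give a basic "track target" behaviour.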

I will report any camera issues I come across. None yet (but I have cameras in Unity inside the Sequences hierarchy, so they are not brought across cleanly - I would probably have to redo them all).

Like the last point mentions, I can imagine writing my own glue code to bring things across would be a bit tricky. But my alternative is to redo a lot of work. I know which parts I want to bring across, so I can hack it to do just the minimum I need (unlike you, who have to satisfy many users). So an extensible architecture to let me plug things in would be useful. I can mock things up quickly, then wait for you to implement the full solution. E.g. tell it to ignore the Unity Sequences hierarchy for now since it probably won't get it right. But I know that is hard in its own right.

I look forward to seeing your progress!

