Hi. I do CG cartoons as a hobbyist in Unity, using HDRP. (Okay, I get more distracted learning all the technology than finishing projects!) I have not used NVIDIA Omniverse yet, so newbie questions are coming! I was watching "What You Need to Know About the New Unity Connector for Omniverse" on YouTube and had some questions about what I could expect to move across into Creator.
Note: I am guessing that most of these DO NOT come across, so you can consider this a feature-request list if the idea is for me to use NVIDIA Omniverse to render the animated cartoons instead of Unity. From the video, my understanding is that the Connector mainly supports moving models across, not the rest of the animation infrastructure, so the answer may be "no" for all of the following.
- It was mentioned that shader graphs do not come across. Custom shaders are a common way to make foliage (trees, bushes, etc.) sway in the wind, using vertex animation in the shader. Is there a solution for that?
- Can shaders be mapped based on names? E.g. I have shader graphs for skin, eyes, hair, etc. If those don't come across, can I define a mapping rule instead?
- Is texture tiling supported? At the moment I use tile offsets on the face texture for blush, etc. Do I need to redo all of that as well?
- Would cloth simulation come across? I am using the MagicaCloth 2 extension in Unity.
- Do hair bones come across? (I am using fairly low-quality models.)
- Are Timelines brought across? (Do I have to redo all the animation sequencing again?)
- Unity has a Sequences package designed to support a hierarchy of Timeline objects for animation sequencing. It adds complexity as it enables/disables objects based on the currently selected sequence. If an object is disabled, will it be brought across? E.g. a sequence may add extra objects to the scene just for that shot (sequence).
- Are Cinemachine cameras supported? You can define camera settings then blend between them, or have a camera track a target (follow and/or look at).
- Are animation clip (*.anim) files brought across? Or do they need to go into FBX files?
- Are camera settings brought across? Or do they all need redoing?
- Is there camera lens flare support?
- Is fog brought across?
- Are HDRP volumetric clouds supported in some way? (I could use an HDRI Sky instead.)
- For things not brought across, is it possible for me to write a script that adds them on import, to avoid redoing it all by hand? E.g. if I name assets carefully, could a script notice "oh, it has 'tree' in the name" and add wind animation support after import? Hair uses standard bone names, so could I write a script so that whatever Creator needs gets applied on each sync from Unity? Or do I have to reapply these each time a change is made? Or are they merged into the USD file as separate layers?
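To make the shader-mapping question concrete, here is the kind of name-based rule I have in mind. This is just a pure-Python sketch; the material names and the idea of matching on substrings are my own invention, not a real Connector feature:

```python
# Hypothetical name-based mapping from Unity shader-graph names to
# target materials. All names here are made-up examples.
RULES = [
    ("skin", "SkinMaterial.mdl"),
    ("eye",  "EyeMaterial.mdl"),
    ("hair", "HairMaterial.mdl"),
]

def map_material(unity_shader_name, default="OmniPBR.mdl"):
    """Return a target material for a Unity shader name, else a default."""
    name = unity_shader_name.lower()
    for substring, target in RULES:
        if substring in name:
            return target
    return default
```

So e.g. a graph called "CharacterSkinGraph" would map to the skin material, and anything unmatched would fall back to a default.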
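And for the post-import fix-up idea, here is a sketch of the logic I would want to run on each sync. Again this is pure Python with invented placeholder names (the fix-up labels are not real Omniverse operations); the real version would presumably walk the USD stage and apply the actual changes:

```python
# Hypothetical post-import pass: decide, from asset names alone, which
# fix-ups to apply after each sync. Keywords and fix-up labels are invented.
FIXUPS = {
    "tree": "attach_wind_vertex_animation",
    "bush": "attach_wind_vertex_animation",
    "hair": "bind_hair_bones",
}

def plan_fixups(prim_names):
    """Return (name, fixup) pairs for every asset whose name matches a rule."""
    plan = []
    for name in prim_names:
        lowered = name.lower()
        for keyword, fixup in FIXUPS.items():
            if keyword in lowered:
                plan.append((name, fixup))
                break  # one fix-up per asset is enough for this sketch
    return plan
```

The open question is whether such fix-ups would survive a re-sync from Unity, or whether they'd need to live in a separate USD layer so the synced layer can be overwritten freely.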
I suspect I should just stick with Unity and HDRP for rendering, even though it gives me a lot of grief; my feeling is it's probably a big job to bring everything across. Thanks for any feedback! I am happy to expand on any of the above if helpful.