Some observations and questions. Maya live link + Create

Hi! I thought I’d share some initial impressions and questions.

  1. In Maya live link, the DOF settings don’t seem to transfer if I drive the distance parameter in Maya by connecting it to a measure tool’s distance dimension output. It only works when I explicitly type in a number.
    It would be really nice to use this very standard setup for easy focus placement.

  2. Usually I use the Maya ramp node on area lights to create some interesting reflection falloff. Key to product shots with precise highlights, imo. Will this be supported at some point? A bitmap is not enough; a procedural approach is most welcome.

  3. Procedural noise: is there a way to use this with MDL? I know I can connect noises in Substance Designer, but they seem to get baked out as bitmaps?

Omniverse Create:

  1. Will DLSS be an option for the path tracer too at some point?

  2. Is it possible to export baked animation from a simulation (car, rbd, flow) in USD/VDB format, or will it be possible at some point?

  3. Will tile rendering be available at some point?

  4. For raytrace mode, how can I improve the anti-aliasing to perfection? Even with all the sliders at max I still seem to get aliasing. Is there anything I can do beyond rendering at a higher resolution and scaling down in post?

  5. Is curve / Hair rendering possible or will it be at some point?

  6. Is point / particle rendering possible?

  7. Are Cryptomatte and RGB mask AOVs on the roadmap?

  8. Will trace sets be implemented for explicit reflection/refraction linking?
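
On question 4: until the renderer itself resolves the aliasing, the render-high-and-downscale fallback mentioned there is plain supersampling. A minimal, hypothetical illustration (pure Python, not an Omniverse API) of why it softens edges — each k×k block of the oversized render is box-filtered down to one output pixel:

```python
# Supersampling fallback: render at k x the target resolution, then
# average each k x k block down to one output pixel (box filter).

def downsample(pixels, k):
    """pixels: 2D list of grayscale values rendered at k x the target size."""
    h, w = len(pixels), len(pixels[0])
    out = []
    for y in range(0, h, k):
        row = []
        for x in range(0, w, k):
            block = [pixels[y + dy][x + dx] for dy in range(k) for dx in range(k)]
            row.append(sum(block) / (k * k))
        out.append(row)
    return out

# A hard diagonal black/white edge rendered at 2x the target size...
hi_res = [
    [0,   0,   0,   255],
    [0,   0,   255, 255],
    [0,   255, 255, 255],
    [255, 255, 255, 255],
]

# ...comes out with intermediate gray values along the edge,
# i.e. the aliased staircase is averaged into a smoother gradient.
print(downsample(hi_res, 2))  # → [[0.0, 191.25], [191.25, 255.0]]
```

This is exactly what "render at higher resolution and scale down in post" does; a region render (see further down the thread) would make iterating on this much cheaper.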

I couldn’t find the info, but is versioning supported per user? Is it possible to roll back if a team member applies an unwanted change to a USD file that several people are working on concurrently?

Thanks in advance! Sorry for all the questions; the whole system design of Omniverse is a lot to take in quickly.
I see great promise in this system, especially once the niggles are ironed out. I really like the collaboration aspects. Looking forward to Houdini live link + support for many Hydra delegates.



Hi @hampan2,

Thank you for trying Omniverse. I’ll try and answer what questions I can.

  1. I’ll investigate and file a ticket regarding the Maya DOF live link.
  2/3. Regarding Ramp and other native Maya shaders: Maya support is continually improving, and shading is something we’re working on. Generally, we have two approaches to dealing with native materials. We can translate them from the source renderer to MDL, or we can parameter-map them from the source to a similar MDL shader/material. Right now we’re exploring this with 3ds Max and V-Ray, but Maya and other applications are planned.

You have many good suggestions, many of which are on our list. Omniverse is still being developed so expect more capabilities in the new year.

If you have a specific anti-aliasing problem and are able to share the scene, it’ll help us to answer your question more specifically.

Thanks for the feedback,

Hi and thanks for the reply.

Regarding 2/3, I don’t care if they’re native or not; I’m not super happy with Maya’s noises anyway.
And perhaps a ramp could be implemented and exposed in MDL?

I’m more after procedural shading nodes overall.


Hi @hampan2,

I was looking at the DOF question. You’re correct that Focus Distance does not link in Omniverse. This is possibly because there are both native DOF and Arnold DOF controls. We’ll have to review this a bit more to decide on the best mapping going forward.

Regarding procedural shading nodes, we’re working on a shading graph editor which will allow you to build material graphs.


Thanks Frankie! The shading graph editor sounds promising. Is there a solution today for procedural noises, ramps, etc.? Some workaround / tool?

Beyond the shading graph editor, the coolest thing would of course be the Omniverse Hydra delegate being available in Houdini Solaris, with the procedural nodes available as material nodes there.

Thanks for the response. I’m stoked about what this can become; I’m just looking for ways to start using it beyond testing.


Your option today is to use texture-mapped area lights. It’s not ideal, but it should get you what you need.

Thanks for the clarification. Yes, it technically works, but not practically for serious product viz shots where highlights are 90% of the expression. (Basically, the real-time feedback of adjusting ramps at the pixel level at high resolution is most of the work.)
Looking forward to the procedural aspects of future Omniverse :)

On that note, there is another super important factor for me: the ability to focus on a small area of a high-res render to see the effects of precision lighting.

Two ideas:

  1. Region render (definitely the preferred option)
  2. Support for overscan and 2D pan (a less preferred way for this specific workflow, but it can work by setting the total render resolution low while still getting the feedback needed at high resolution)

Of course, these are not general requirements, but they are key for product viz, especially on small-scale products.