Can we use a ray-traced renderer in Omniverse, and get direct access to rays like in OptiX?

Hello,
We have some sensor simulators that were prototyped in OptiX, but we would like to take advantage of the Omniverse simulated world. Do we have access to raw rays in the Omniverse RTX renderers?

For context: https://forums.developer.nvidia.com/t/what-options-are-there-for-a-simulated-world-with-optix-omniverse-etc/201781/4

The simple answer is no.
The render engines inside Omniverse are based on DXR and Vulkan Ray Tracing, respectively, and are not programmable at that level.

There are sensor simulations in Omniverse products which shoot the sensor rays for you and let you work with the results, but I do not know whether the sensor simulation itself is programmable, should you need that. Maybe explain your project requirements a little more.

I’ll leave that to the Omniverse simulation experts to answer.

Here is a related thread in case you are wondering whether it's possible to use OptiX as an Omniverse renderer. The answer there is also no; that doesn't exist:
https://forums.developer.nvidia.com/t/omniverse-support/181413
and there is no support for custom render delegates at this time:
https://forums.developer.nvidia.com/t/hydra-delegates/163828/3

Hello again. Here are a few requirements:

  1. Be able to control the angular distribution of the rays, so we can cast more rays at certain angles, etc. (see the sketch below).
  2. At each closest intersection, retrieve the range, the triangle normal, and the material.

And the camera will be moving in a world with moving mesh objects.
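For reference, this is roughly the shape of our current OptiX prototype. It is only a minimal sketch; SensorParams, HitGroupData, and the payload layout are placeholders for this example, not an existing Omniverse or Isaac Sim API:

```cuda
// Minimal OptiX 7 device-code sketch of the sensor pass described above.
#include <optix.h>
#include <cuda_runtime.h>

struct SensorParams
{
    OptixTraversableHandle handle;  // scene acceleration structure
    float3* origins;                // per-ray origins (world space)
    float3* directions;             // per-ray directions, precomputed with any angular density
    float*  ranges;                 // output: hit distance per ray (-1 on miss)
    float3* normals;                // output: geometric normal per ray
    int*    materialIds;            // output: material index per ray
};

struct HitGroupData
{
    int materialId;                 // stored per hit group in the SBT
};

extern "C" __constant__ SensorParams params;

extern "C" __global__ void __raygen__sensor()
{
    const unsigned int i = optixGetLaunchIndex().x;

    // Payload registers carry the results back from the closest-hit program.
    unsigned int pRange = __float_as_uint(-1.0f);   // -1 encodes "miss"
    unsigned int pNx = 0, pNy = 0, pNz = 0, pMat = 0;

    optixTrace(params.handle,
               params.origins[i], params.directions[i],
               0.0f, 1e16f, 0.0f,                   // tmin, tmax, ray time
               OptixVisibilityMask(255), OPTIX_RAY_FLAG_NONE,
               0, 1, 0,                             // SBT offset, stride, miss index
               pRange, pNx, pNy, pNz, pMat);

    params.ranges[i]      = __uint_as_float(pRange);
    params.normals[i]     = make_float3(__uint_as_float(pNx),
                                        __uint_as_float(pNy),
                                        __uint_as_float(pNz));
    params.materialIds[i] = static_cast<int>(pMat);
}

extern "C" __global__ void __closesthit__sensor()
{
    // Range is simply the ray parameter at the accepted closest hit.
    optixSetPayload_0(__float_as_uint(optixGetRayTmax()));

    // Geometric normal from the triangle vertices. The GAS must be built
    // with OPTIX_BUILD_FLAG_ALLOW_RANDOM_VERTEX_ACCESS for this fetch.
    float3 v[3];
    optixGetTriangleVertexData(optixGetGASTraversableHandle(),
                               optixGetPrimitiveIndex(),
                               optixGetSbtGASIndex(), 0.0f, v);
    const float3 e0 = make_float3(v[1].x - v[0].x, v[1].y - v[0].y, v[1].z - v[0].z);
    const float3 e1 = make_float3(v[2].x - v[0].x, v[2].y - v[0].y, v[2].z - v[0].z);
    float3 n = make_float3(e0.y * e1.z - e0.z * e1.y,
                           e0.z * e1.x - e0.x * e1.z,
                           e0.x * e1.y - e0.y * e1.x);
    const float inv = rsqrtf(n.x * n.x + n.y * n.y + n.z * n.z);
    optixSetPayload_1(__float_as_uint(n.x * inv));
    optixSetPayload_2(__float_as_uint(n.y * inv));
    optixSetPayload_3(__float_as_uint(n.z * inv));

    // Material id attached to the hit group's SBT record.
    const HitGroupData* data =
        reinterpret_cast<const HitGroupData*>(optixGetSbtDataPointer());
    optixSetPayload_4(static_cast<unsigned int>(data->materialId));
}
```

The point is that the ray generation program gives us full control over where every ray goes and what we read back per hit, which is the level of access we are asking about.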

Hi!

I am looking for something similar, using the Iray renderer or RTX path tracing in Omniverse Isaac Sim. In particular, I am looking for:

  • Configuring the primary ray directions for each pixel, to simulate lens distortion.
  • Configuring the distribution of multiple primary-ray samples per pixel to simulate depth of field, e.g. sampling the ray origin within the lens aperture (see the sketch after this list). This might already be done internally, but I don't know how the depth-of-field rendering works.
  • Similarly, and in addition, sampling the primary ray destination around the focus point to simulate the lens MTF or optical aberrations.
  • Configuring the temporal sampling of the rays for each pixel to simulate rolling-shutter effects.
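For concreteness, this is the kind of primary-ray generation code I mean. It is only a sketch with assumed names (LensCamera, makeCameraRay, the k1/k2 coefficients); none of this is an existing Omniverse interface:

```cuda
// CUDA sketch of a thin-lens camera with simple radial distortion.
#include <cuda_runtime.h>

struct LensCamera
{
    float3 origin;          // lens center (world space)
    float3 right, up, fwd;  // orthonormal camera basis
    float  tanHalfFovX;
    float  tanHalfFovY;
    float  apertureRadius;  // 0 gives a pinhole camera (no depth of field)
    float  focusDistance;   // distance to the plane in perfect focus
    float  k1, k2;          // radial distortion coefficients (Brown model)
};

__device__ float3 f3Add(float3 a, float3 b) { return make_float3(a.x + b.x, a.y + b.y, a.z + b.z); }
__device__ float3 f3Sub(float3 a, float3 b) { return make_float3(a.x - b.x, a.y - b.y, a.z - b.z); }
__device__ float3 f3Scale(float3 a, float s) { return make_float3(a.x * s, a.y * s, a.z * s); }
__device__ float  f3Dot(float3 a, float3 b)  { return a.x * b.x + a.y * b.y + a.z * b.z; }
__device__ float3 f3Normalize(float3 a)      { return f3Scale(a, rsqrtf(f3Dot(a, a))); }

// Build one primary-ray sample for pixel (px, py). (u1, u2) are uniform
// random numbers in [0, 1) used to pick a point on the lens aperture.
__device__ void makeCameraRay(const LensCamera& cam,
                              int px, int py, int width, int height,
                              float u1, float u2,
                              float3* rayOrigin, float3* rayDir)
{
    // Pixel center in normalized device coordinates [-1, 1].
    float x = 2.0f * (px + 0.5f) / width  - 1.0f;
    float y = 2.0f * (py + 0.5f) / height - 1.0f;

    // Radial lens distortion: warp the sample position before building the ray.
    const float r2   = x * x + y * y;
    const float warp = 1.0f + cam.k1 * r2 + cam.k2 * r2 * r2;
    x *= warp;
    y *= warp;

    // Ideal pinhole direction through the (distorted) pixel.
    const float3 pinholeDir = f3Normalize(
        f3Add(f3Add(f3Scale(cam.right, x * cam.tanHalfFovX),
                    f3Scale(cam.up,    y * cam.tanHalfFovY)),
              cam.fwd));

    // Thin-lens depth of field: every sample of this pixel must pass through
    // the point where the pinhole ray crosses the focus plane.
    const float  tFocus     = cam.focusDistance / f3Dot(pinholeDir, cam.fwd);
    const float3 focusPoint = f3Add(cam.origin, f3Scale(pinholeDir, tFocus));

    // Uniform sample on the aperture disk (polar mapping).
    const float r   = cam.apertureRadius * sqrtf(u1);
    const float phi = 6.2831853f * u2;
    const float3 lensOffset = f3Add(f3Scale(cam.right, r * cosf(phi)),
                                    f3Scale(cam.up,    r * sinf(phi)));

    *rayOrigin = f3Add(cam.origin, lensOffset);
    *rayDir    = f3Normalize(f3Sub(focusPoint, *rayOrigin));
}
```

Rolling shutter would slot in the same way, by giving each pixel row its own ray time against a motion-blurred scene. All of these effects live in the primary-ray generation step, which is exactly the part that is not exposed.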

Can this be done somehow in Omniverse, or will it be possible in the future? I think some of these might already be done in DRIVE Sim, but I am not sure.

Thank you very much!
Arne

I will create a feature request ticket so that the RTX team knows that this feature has been requested. Currently, Isaac Sim does not provide any additional APIs for the renderer.

A feature request ticket was generated from this post. OM-62780: Can we use a RayTraced renderer in Omniverse, and get direct access to rays like in OptiX?