Can we use a ray-traced renderer in Omniverse and get direct access to rays like in OptiX?

Hello,
We have some sensor simulators that have been prototyped in OptiX, but would like to take advantage of the omniverse simulated world. Do we have access to raw rays in the omniverse RTX renders?

For context: https://forums.developer.nvidia.com/t/what-options-are-there-for-a-simulated-world-with-optix-omniverse-etc/201781/4

The simple answer is No.
The render engines inside Omniverse are based on DXR and Vulkan Ray Tracing, respectively, and are not programmable at that level.

There are sensor simulations in Omniverse products which shoot the sensor rays for you and let you work with the results, but I do not know whether the sensor simulation itself is programmable, should you need that. Maybe explain your project requirements a little more.

I’ll leave that to the Omniverse simulation experts to answer.

Related thread, in case you wonder whether it's possible to use OptiX as an Omniverse renderer (also no, that doesn't exist):
https://forums.developer.nvidia.com/t/omniverse-support/181413
and there is no support for custom render delegates at this time:
https://forums.developer.nvidia.com/t/hydra-delegates/163828/3

Hello again, well, here are a few requirements:

  1. Be able to control the angular distribution of the rays, so we can have more rays at certain angles, etc. (see the sketch below).
  2. At each closest intersection, get the range, triangle normal, and material.

And the camera will be moving in a world with moving mesh objects.
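
To make these requirements concrete, here is a minimal, self-contained C++ sketch. It is independent of Omniverse, OptiX, DXR, and Vulkan, so all names, the band edges, and the weighting scheme are illustrative assumptions only: it draws ray directions from a non-uniform angular distribution (requirement 1) and defines the per-hit record of range, triangle normal, and material that each traced ray would fill (requirement 2).

```cpp
// Standalone sketch (no Omniverse/OptiX/DXR/Vulkan API): biased ray
// distribution plus the per-hit data the requirements ask for.
// All names and numbers are hypothetical, for illustration only.
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

struct Vec3 { float x, y, z; };

// Requirement 2: data gathered at the closest intersection of each ray.
struct HitResult {
    float range;      // distance from ray origin to the closest hit
    Vec3  normal;     // geometric normal of the hit triangle
    int   materialId; // index into whatever material table the sensor uses
};

// Requirement 1: more rays at certain angles. Elevation is drawn from a
// piecewise-constant density, so one band (e.g. near the horizon) gets a
// larger share of the ray budget than the outer bands.
std::vector<Vec3> makeRayDirections(int count, unsigned seed = 42) {
    std::mt19937 rng(seed);
    const float kTwoPi     = 6.2831853f;
    const float edges[4]   = {-0.5f, -0.1f, 0.1f, 0.5f}; // elevation bands, radians
    const float weights[3] = {1.0f, 3.0f, 1.0f};         // middle band gets 3x density
    std::discrete_distribution<int> band(weights, weights + 3);
    std::uniform_real_distribution<float> uni(0.0f, 1.0f);

    std::vector<Vec3> dirs;
    dirs.reserve(count);
    for (int i = 0; i < count; ++i) {
        int   b         = band(rng);
        float elevation = edges[b] + uni(rng) * (edges[b + 1] - edges[b]);
        float azimuth   = uni(rng) * kTwoPi;
        dirs.push_back({std::cos(elevation) * std::cos(azimuth),
                        std::cos(elevation) * std::sin(azimuth),
                        std::sin(elevation)});
    }
    return dirs;
}

int main() {
    auto dirs = makeRayDirections(10000);
    // In an OptiX prototype these directions would feed the ray-generation
    // program, and each traced ray would fill one HitResult from the
    // closest-hit program's payload.
    std::printf("generated %zu ray directions\n", dirs.size());
}
```

The piecewise-constant elevation bands are just one way to bias the ray budget; any importance function over angle could slot into the same place, which is exactly the level of control the question is asking for and which the Omniverse RTX renderers do not expose.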