Is OptiX useful only for rendered scenes, or can it be used for real-world applications like medical imaging?
OptiX is a general-purpose, high-level ray-casting SDK, not a renderer.
If you have a visualization problem which can be solved by shooting rays, you can most likely implement it using OptiX.
For examples about what developers are doing with OptiX, take a look at the GPU Technology Conference presentations by searching for “OptiX” here:
If this is about the usual medical visualization of density volume grids, there isn’t always a need for a full-blown ray tracer optimized to find ray-geometry intersections. Some solutions simply involve ray marching, which can also be done efficiently with native compute APIs. If there is geometry in the scene as well, that could use a ray tracer. Which solution is the best fit depends on the use case.
Are there any resources to understand the difference between ray-tracers and ray marching? Thank you.
A simple web search for “medical image visualization with ray marching” turns up enough articles and papers on the first page to explain these methods, even a wiki page.
I want to study light propagation through a crystal using ray tracing. Can OptiX work on a real crystal, or does it only work on rendered objects?
You can use OptiX to simulate light propagation through a physics-based model of a crystal. (And it doesn’t need to be light; you can compute any type of radiation that you can simulate and model using your own code, such as sound, heat, gamma rays, etc.) You’ll be responsible for modeling the crystal and its properties, and for modeling your light propagation and its interaction with the interfaces between media, i.e. crystal vs. air. You are not required to render a picture with OptiX; you can design what you want the inputs and outputs of your simulation to look like.