OptiX is a general-purpose GPU ray-casting SDK. It is not a renderer and has no notion of material properties at all; any behavior that depends on material data is implemented by you as the developer.
If you know how to implement a simulation by shooting rays, then it should also be possible to implement it with OptiX.
Rays in OptiX do not travel at any “speed”. Fundamentally they just determine which geometric primitive you intersect at what distance, or whether you missed. Everything that happens on such a hit or miss event is defined by your implementation of the respective GPU device programs OptiX calls for each case.
Since a ray tracer easily tells you how far a ray traveled inside a medium, you can also determine the time it would have taken to travel that distance inside whatever material and track that information along each ray path. That is all yours to calculate.
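To make that concrete, here is a minimal sketch of the bookkeeping (plain Python, nothing OptiX-specific; the `travel_time` function and the segment representation are just made-up names for illustration). Each path segment contributes distance times n over c, since the speed inside a medium with refractive index n is c / n:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def travel_time(segments):
    """Accumulate time of flight along a ray path.

    `segments` is a list of (distance_m, refractive_index) pairs,
    one per path segment between surface events; the propagation
    speed inside each medium is c / n.
    """
    return sum(distance * n / C for distance, n in segments)

# Example: 10 m in air (n ~ 1.0), then 0.5 m in glass (n ~ 1.5)
t = travel_time([(10.0, 1.0), (0.5, 1.5)])
```

In an OptiX program you would carry the accumulated time in the per-ray payload and add one such term at every closest-hit event, using the reported hit distance.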
The same is true for calculating reflections on, or refraction through, geometric surfaces separating volumes with different properties.
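The refraction direction at such a boundary is just Snell's law; a small standalone sketch (again plain Python, not OptiX code, with an illustrative `refract` helper):

```python
import math

def refract(direction, normal, n1, n2):
    """Refract a unit `direction` through an interface between media
    with refractive indices n1 and n2 (Snell's law in vector form).

    `normal` is the unit surface normal pointing against the incoming
    direction. Returns None on total internal reflection, in which
    case you would shoot a reflection ray instead.
    """
    eta = n1 / n2
    cos_i = -sum(a * b for a, b in zip(normal, direction))
    sin_t2 = eta * eta * (1.0 - cos_i * cos_i)
    if sin_t2 > 1.0:
        return None  # total internal reflection
    cos_t = math.sqrt(1.0 - sin_t2)
    k = eta * cos_i - cos_t
    return tuple(eta * d + k * n for d, n in zip(direction, normal))
```

You would evaluate this inside the closest-hit program for the boundary surface and continue the path in the returned direction.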
You could also do ray marching: step in discrete time intervals and calculate the ray distance depending on the surrounding medium until you hit something. (Not recommended unless you’re dealing with heterogeneous media.)
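The idea of the marching variant, as a sketch (the refractive-index gradient and the function names are invented purely for illustration): per time step dt, the ray advances by the local speed c / n(x) times dt, which is exactly what makes this approach relevant for heterogeneous media.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def index_at(x):
    """Assumed refractive-index field: a simple linear gradient
    along the ray, standing in for any heterogeneous medium."""
    return 1.0 + 0.1 * x

def march_distance(dt, steps):
    """Advance a ray for `steps` fixed time steps of `dt` seconds and
    return the distance traveled. A real implementation would also
    test for intersections after each step and stop on a hit."""
    x = 0.0
    for _ in range(steps):
        x += (C / index_at(x)) * dt  # local speed times time step
    return x
```

In a homogeneous medium this degenerates to distance = (c / n) * dt * steps, which is why a single analytic hit distance is the better tool there.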
Simulations of the time it takes to travel from a transmitter to a receiver have been implemented with ray tracing in OptiX before.
NVIDIA actually has a product based on that: https://news.developer.nvidia.com/vrworks-audio-dials-up-the-immersion-with-rtx-acceleration/
Here are two related posts explaining ideas on how to go about that with some Monte Carlo integration.
(Maybe disregard my idea of tracking the cone angle; that should work even without it, by considering the full hemisphere above a hit point.)
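For the hemisphere idea, the sampling and the Monte Carlo estimator look roughly like this (plain-Python sketch with invented helper names, not code from those posts): draw uniform directions over the hemisphere above the hit point, evaluate the quantity of interest per direction, and divide by the sampling pdf of 1 / (2*pi).

```python
import math
import random

def sample_hemisphere(u1, u2):
    """Map two uniform numbers in [0, 1) to a direction on the unit
    hemisphere around +Z, uniform in solid angle (pdf = 1 / (2*pi))."""
    z = u1                                  # cos(theta), uniform in [0, 1)
    r = math.sqrt(max(0.0, 1.0 - z * z))    # sin(theta)
    phi = 2.0 * math.pi * u2
    return (r * math.cos(phi), r * math.sin(phi), z)

def integrate_hemisphere(f, n, rng):
    """Monte Carlo estimate of the integral of f over the hemisphere:
    average f(direction) / pdf over n uniform samples."""
    total = 0.0
    for _ in range(n):
        d = sample_hemisphere(rng.random(), rng.random())
        total += f(d) * 2.0 * math.pi  # divide by pdf = 1 / (2*pi)
    return total / n

# Sanity check: the integral of cos(theta) over the hemisphere is pi.
estimate = integrate_hemisphere(lambda d: d[2], 10_000, random.Random(0))
```

In a channel-characterization setup, f(d) would be whatever contribution a path launched in direction d delivers to the receiver; each sampled direction becomes one ray shot with optixTrace, and the frame transform from +Z to the actual surface normal is left out here for brevity.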
Related post: https://devtalk.nvidia.com/default/topic/1036110/optix/wireless-channel-characterization-using-optix-as-a-ray-tracing-engine/post/5272000