Objects appearing in the wrong order after scaling

I recently hit this as well when writing a custom geometric primitive intersection in OptiX 7, and I assume OptiX 6 works the same way.
Though I’m pretty sure OptiX 5 and earlier always normalized the ray direction.

  • The intersection program works in object coordinates. More precisely, in OptiX < 7 the rtCurrentRay semantic variable is in object coordinate space.
  • The values returned by optixGetObjectRayOrigin() and optixGetObjectRayDirection() are the world-space ray transformed by the inverse of the current transformation matrix. That means if there is a scaling transform above the custom geometry, the object-space ray direction is not normalized. (You also have access to optixGetWorldRayOrigin() and optixGetWorldRayDirection(), but they may be more expensive to fetch inside the intersection program because of the required transformation.)
  • optixGetRayTmin(), optixGetRayTmax(), and the intersection distance in optixReportIntersection(tHit, …) are not transformed at all. That means they are all in world coordinates, because they are set by optixTrace(), normally inside the raygen and closest-hit programs.

Scaling the tHit by the inverse of the object-space ray direction length is actually the right thing to do if you calculated the intersection distance against a normalized ray direction beforehand.
I needed to do that because I built an ortho-normal basis from that ray direction, which requires a normalized vector. After normalizing, I hit the wrong-intersection-distance issue, which completely broke my lighting because all the shadow rays started at the wrong position.

The calculation fails with the given sphere intersection program when the ray direction vector is unnormalized, because that code is optimized for the a == 1.0 case in the quadratic formula, which only holds for a normalized direction vector.
See this post: https://devtalk.nvidia.com/default/topic/1030431/intersection_sphere-in-sphere-cu/