Using scale transforms correctly

I tried to use an OptiX transform to scale a triangle mesh, but the result isn’t correct if I chain it with other transforms like rotations.

Initially, I was scaling my meshes by multiplying their vertex coordinates by a float value (figure A). Using the transform matrix at first gave a similar result (figure B). However, if I rotate the mesh after scaling it with a transform, it appears inverted on the Y axis (figure C). Rotating it after scaling the mesh by multiplying its vertex coordinates gives me the correct result (figure D), though.
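For context, here is roughly how I compose and set the matrix on the host. This is a minimal sketch rather than my exact code; the context and geometry_group handles, the scale factor, and the angle are placeholders:

#include <optixu/optixpp_namespace.h>
#include <optixu/optixu_matrix_namespace.h>

// Sketch of my host-side setup; scale factor and angle are just examples.
optix::Transform make_scaled_rotated(optix::Context context,
                                     optix::GeometryGroup geometry_group)
{
    optix::Matrix4x4 S = optix::Matrix4x4::scale(optix::make_float3(2.0f));
    optix::Matrix4x4 R = optix::Matrix4x4::rotate(
        0.5f * M_PIf, optix::make_float3(0.0f, 1.0f, 0.0f));  // about Y
    optix::Matrix4x4 M = R * S;  // scale first, then rotate (right to left)
    optix::Matrix4x4 M_inv = M.inverse();

    optix::Transform xform = context->createTransform();
    xform->setChild(geometry_group);
    xform->setMatrix(false /*row-major*/, M.getData(), M_inv.getData());
    return xform;
}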

I’m using optix::Matrix4x4 matrices to set my transforms on the host side, and I use the following rtTransform* functions in my intersection programs:

// ...
float3 hit_point = ray.origin + t * ray.direction;
// Transform the object-space hit point into world space.
hit_point = rtTransformPoint(RT_OBJECT_TO_WORLD, hit_point);
hit_rec.p = hit_point;

// ...

// rtTransformNormal should use the inverse transpose of the object-to-world
// matrix, so it accounts for (non-uniform) scale; renormalize afterwards.
hit_rec.normal = optix::normalize(rtTransformNormal(RT_OBJECT_TO_WORLD, normal));

// ...

I know we need to correct the normals when scaling with matrix operations, but I believe rtTransformNormal does that. Is that right? Am I missing something?
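For reference, my understanding is that rtTransformNormal multiplies by the inverse transpose of the object-to-world matrix, i.e. something equivalent to this host-side sketch (the function name is mine, and M stands in for the transform's matrix):

#include <optixu/optixu_math_namespace.h>
#include <optixu/optixu_matrix_namespace.h>

// My assumption of what rtTransformNormal(RT_OBJECT_TO_WORLD, n) computes.
float3 transform_normal(const optix::Matrix4x4& M, const float3& n)
{
    // The inverse transpose cancels non-uniform scale, keeping the
    // transformed normal perpendicular to the surface.
    const optix::Matrix4x4 inv_t = M.inverse().transpose();
    const float4 nw = inv_t * optix::make_float4(n, 0.0f);  // w = 0: direction
    return optix::normalize(optix::make_float3(nw));
}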

OK, that was a massive slip-up on my end. My Y axis appeared inverted because I was rotating about the wrong axis, X rather than Y. The transforms are working fine :)
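For anyone who hits the same thing, the fix was literally just the rotation axis, e.g.:

#include <optixu/optixu_math_namespace.h>
#include <optixu/optixu_matrix_namespace.h>

const float angle = 0.5f * M_PIf;  // example angle
// What I had: rotation about X, which produced the apparent Y inversion.
optix::Matrix4x4 wrong = optix::Matrix4x4::rotate(angle, optix::make_float3(1.0f, 0.0f, 0.0f));
// What I meant: rotation about Y.
optix::Matrix4x4 right = optix::Matrix4x4::rotate(angle, optix::make_float3(0.0f, 1.0f, 0.0f));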

Thanks for posting back with the correction. By the way, if you think you’ve found an OptiX bug in the future, it will get attention more quickly if you can reproduce it by tweaking an SDK sample such as optixMeshViewer or optixSpherePP. That lets us quickly build and test it on our end.