Incorrect rtIntersectionDistance value in closest hit program after calling rtIgnoreIntersection()

For my use case I need to ignore some specific intersections. The rtIntersectionDistance value inside my closest_hit program is incorrect when rtIgnoreIntersection() is called on some intersections inside the any_hit program. This happens only when the intersected objects are scaled through their transform. The example below just showcases the bug and does not represent the real processing. Here is the device code (a sketch of the host-side scene setup with the scaling transform follows it):

#include <optix_world.h>

struct PerRayData_radiance {
    int depth;
};

// Scene and launch inputs.
rtDeclareVariable(rtObject, top_object, , );
rtDeclareVariable(uint, launch_index, rtLaunchIndex, );
rtDeclareVariable(float, minimum_distance, , );
rtDeclareVariable(float, maximum_distance, , );
rtBuffer<float3, 1> origin_buffer;
rtBuffer<float3, 1> direction_buffer;

// Per-ray payload and current intersection distance.
rtDeclareVariable(PerRayData_radiance, prd_radiance, rtPayload, );
rtDeclareVariable(float, t_hit, rtIntersectionDistance, );

RT_PROGRAM void ray_program() {
    optix::Ray ray = optix::make_Ray(origin_buffer[launch_index], direction_buffer[launch_index],
                                     0, minimum_distance, maximum_distance);
    PerRayData_radiance prd;
    prd.depth = 0;

    rtTrace(top_object, ray, prd);
}

RT_PROGRAM void closest_hit( void ) {
    printf("closest_hit. t_hit: %f\n", t_hit);
}

RT_PROGRAM void any_hit( void ) {
    printf("any_hit. t_hit: %f, depth: %d\n", t_hit, prd_radiance.depth);

    // Accept at most one intersection per ray, ignore the rest.
    ++prd_radiance.depth;
    if (prd_radiance.depth > 1) {
        rtIgnoreIntersection();
    }
}
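
For reference, here is a hedged sketch of the kind of host-side setup that reproduces the issue for me. The PTX path, the "bounds"/"intersect" program names and the scale factor are placeholders, not my actual values; the important part is the non-identity scale in the Transform matrix:

#include <optixu/optixpp_namespace.h>

void build_scene( optix::Context context, const char* ptx_path )
{
    // Placeholder geometry: any geometry with bounding box and intersection
    // programs will do; the program names here are assumptions.
    optix::Geometry geometry = context->createGeometry();
    geometry->setPrimitiveCount( 1u );
    geometry->setBoundingBoxProgram( context->createProgramFromPTXFile( ptx_path, "bounds" ) );
    geometry->setIntersectionProgram( context->createProgramFromPTXFile( ptx_path, "intersect" ) );

    optix::Material material = context->createMaterial();
    material->setClosestHitProgram( 0, context->createProgramFromPTXFile( ptx_path, "closest_hit" ) );
    material->setAnyHitProgram( 0, context->createProgramFromPTXFile( ptx_path, "any_hit" ) );

    optix::GeometryInstance instance = context->createGeometryInstance( geometry, &material, &material + 1 );
    optix::GeometryGroup group = context->createGeometryGroup();
    group->addChild( instance );
    group->setAcceleration( context->createAcceleration( "Trbvh" ) );

    // Non-identity scale on the transform: this is what triggers the wrong
    // rtIntersectionDistance in the closest hit program once
    // rtIgnoreIntersection() is used in the any hit program.
    const float s = 2.0f;
    const float matrix[16] = { s,    0.0f, 0.0f, 0.0f,
                               0.0f, s,    0.0f, 0.0f,
                               0.0f, 0.0f, s,    0.0f,
                               0.0f, 0.0f, 0.0f, 1.0f };
    optix::Transform transform = context->createTransform();
    transform->setMatrix( false, matrix, NULL );
    transform->setChild( group );

    optix::Group top = context->createGroup();
    top->addChild( transform );
    top->setAcceleration( context->createAcceleration( "Trbvh" ) );
    context["top_object"]->set( top );
}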

When I omit the rtIgnoreIntersection() call, everything works as expected; that is, the smallest distance gets printed inside the closest hit program. However, when the rtIgnoreIntersection() call is present and several intersections are tested inside the any hit program, abnormal values are printed in the closest hit program whenever the transform of the intersected objects has a non-identity scale.
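
To make the mismatch explicit, one possible diagnostic (the extra min_accepted_t payload field below is my addition for illustration, not part of the real code) is to record the smallest accepted distance from the any hit program and compare it against rtIntersectionDistance in the closest hit program:

struct PerRayData_radiance {
    int   depth;
    float min_accepted_t;   // initialized to RT_DEFAULT_MAX in the ray generation program
};

rtDeclareVariable(PerRayData_radiance, prd_radiance, rtPayload, );
rtDeclareVariable(float, t_hit, rtIntersectionDistance, );

RT_PROGRAM void any_hit( void ) {
    ++prd_radiance.depth;
    if (prd_radiance.depth > 1) {
        rtIgnoreIntersection();   // terminates this program, the lines below are not reached
    }
    // Only reached for accepted intersections: remember the smallest distance.
    prd_radiance.min_accepted_t = fminf(prd_radiance.min_accepted_t, t_hit);
}

RT_PROGRAM void closest_hit( void ) {
    // With a scaled transform these two values disagree once
    // rtIgnoreIntersection() has been used.
    printf("closest_hit. t_hit: %f, smallest accepted t: %f\n", t_hit, prd_radiance.min_accepted_t);
}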