Can I use optixTrace() to trace non-normalized scene coordinate rays?

I am building a small OptiX-based test application in which I want to shoot a set of predefined rays from an origin in the scene and check whether they actually hit the scene geometry, which has this form:

    const std::vector<float3> vertices =
    {
        { -0.5f, -0.5f, 0.0f },
        {  0.5f, -0.5f, 0.0f },
        {  0.0f,  0.5f, 0.0f },
        // ...
    };

The ray destination coordinates (scene coordinates, not normalized) have this form:

    const static double rayDestinations[10000][3] =
    {
        { -0.88871406,       0.4593381547,      0.00015 },
        {  0.344705048422,  -0.98711013621,     0.00025 },
        {  0.379915392598,   0.9221173794,      0.00035 },
        { -0.904980575489,  -0.4254647762,      0.00045 },
        { -0.1521858,       -0.8818035,         0.445   },
        // ...
        {  0.71069432,       0.54358462,        0.445   },
        { -0.890590706519,   0.079434099,       0.44785 },
        {  0.6031359545,    -0.659975472,       0.44795 },
        {  0.001037894335,   0.89855311,        0.44805 },
        { -0.6046512877,    -0.6551593026,      0.4485  },
        // ...
        { -0.02804462274,    0.017699385,       0.99945 },
        {  0.007890064872,  -0.0289403710094,   0.99955 },
        {  0.0121099887,     0.0235207539794,   0.99965 },
        { -0.0209751941736, -0.00774459355806,  0.99975 },
        {  0.0160328889884, -0.00655163877853,  0.99985 },
        {  0.0099999922,     0.0,               0.99995 }
    };
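
On the host side I convert the double values to float3 and upload them before the launch, roughly like this (sketch only; the params.rayCoords field name follows the device snippet below, and error checking is omitted):

    // Convert the double destinations to float3 and copy them to the device for the launch.
    std::vector<float3> rayCoords( 10000 );
    for ( size_t i = 0; i < rayCoords.size(); ++i )
    {
        rayCoords[i] = make_float3( static_cast<float>( rayDestinations[i][0] ),
                                    static_cast<float>( rayDestinations[i][1] ),
                                    static_cast<float>( rayDestinations[i][2] ) );
    }

    float3* d_rayCoords = nullptr;
    cudaMalloc( reinterpret_cast<void**>( &d_rayCoords ), rayCoords.size() * sizeof( float3 ) );
    cudaMemcpy( d_rayCoords, rayCoords.data(),
                rayCoords.size() * sizeof( float3 ), cudaMemcpyHostToDevice );

    params.rayCoords = d_rayCoords;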

I have set up a one-row launch grid of length 10000, and in the .cu file I access each ray destination and build the ray_direction like this:

    float3 rayCoord;
    rayCoord.x = params.rayCoords[idx.x].x;
    rayCoord.y = params.rayCoords[idx.x].y;
    rayCoord.z = params.rayCoords[idx.x].z;

    float3 ray_origin = params.origin;

    float3 ray_direction = rayCoord - ray_origin;

Then in __raygen__rg() I run optixTrace:

    optixTrace(
        params.handle,
        ray_origin,
        ray_direction,
        0.0f,                // Min intersection distance
        1e16f,               // Max intersection distance
        0.0f,                // rayTime -- used for motion blur
        OptixVisibilityMask(255), // Specify always visible
        OPTIX_RAY_FLAG_DISABLE_ANYHIT | OPTIX_RAY_FLAG_DISABLE_CLOSESTHIT | OPTIX_RAY_FLAG_TERMINATE_ON_FIRST_HIT,
        0,                   // SBT offset   -- See SBT discussion
        1,                   // SBT stride   -- See SBT discussion
        0,                   // missSBTIndex -- See SBT discussion
        payloadMissed
    );
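
The payloadMissed value is how I read back whether a ray left the scene without hitting anything. Roughly it works like this (sketch only; the params.results buffer and the __miss__ms program name are placeholders):

    // In __raygen__rg(), before the trace: assume "hit" until the miss program says otherwise.
    unsigned int payloadMissed = 0u;

    // ... optixTrace( ..., payloadMissed ) as above ...

    // After the trace, store the result for this launch index (placeholder buffer).
    params.results[idx.x] = payloadMissed;   // 0 = hit, 1 = missed

    // The miss program only flags rays that left the scene without hitting geometry.
    extern "C" __global__ void __miss__ms()
    {
        optixSetPayload_0( 1u );
    }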

Since I don't want to render a 10000x1 image, but simply check whether any of the rays hit the geometry in my scene:
does optixTrace() in this form actually do what I intend? All coordinates are scene coordinates; do I have to transform or normalize ray_origin and ray_direction?

I am asking because when I move the origin point to various locations in the scene I get some unexpected miss/hit values, and I wonder whether I have to treat the ray coordinates differently.
Thanks!

In principle the ray direction doesn't need to be normalized, but normalizing it is highly recommended because unnormalized direction vectors open a can of worms.

So are you saying you get different hit/miss results when using
ray_direction = (rayCoord - ray_origin);
versus
ray_direction = normalize(rayCoord - ray_origin); ?

  1. Are your ray destination coordinates converted to float?
  2. Is rayCoord != ray_origin true for all data?
    Otherwise you get invalid rays with null vectors or NaN values from the normalize().
    (Enable OptiX exceptions and an exception program and check whether there are any invalid rays.)
  3. Is length(rayCoord - ray_origin) well below the ray tmax distance of 1e16f?
  4. Now the kicker: Is the length maybe too short?
    That is, is length(rayCoord - ray_origin) * 1e16f too small to reach your geometry for some rays? (See the sketch after this list.)
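
Putting that together, the direction setup in your ray generation program could guard against degenerate rays and normalize explicitly. This is only a sketch following your snippets; length() and the float3 operators are assumed to come from the SDK's vector math helpers (e.g. sutil/vec_math.h):

    // Sketch of the direction setup inside __raygen__rg().
    const uint3  idx        = optixGetLaunchIndex();
    const float3 rayCoord   = params.rayCoords[idx.x];
    const float3 ray_origin = params.origin;

    float3      ray_direction = rayCoord - ray_origin;
    const float dist          = length( ray_direction );

    unsigned int payloadMissed = 0u;

    if ( dist > 0.0f )   // guards against rayCoord == ray_origin (null direction, NaN after normalize)
    {
        ray_direction = ray_direction / dist;   // explicit normalization

        optixTrace( params.handle,
                    ray_origin,
                    ray_direction,
                    0.0f,      // tmin
                    1e16f,     // tmax; with a normalized direction this is a world-space distance
                    0.0f,      // rayTime
                    OptixVisibilityMask( 255 ),
                    OPTIX_RAY_FLAG_DISABLE_ANYHIT | OPTIX_RAY_FLAG_DISABLE_CLOSESTHIT | OPTIX_RAY_FLAG_TERMINATE_ON_FIRST_HIT,
                    0, 1, 0,   // SBT offset, SBT stride, missSBTIndex
                    payloadMissed );
    }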

Just always normalize the ray direction; if that still produces incorrect results, the problem is somewhere else.