Implementation of a projector

Hi,

I was wondering if there is an example of how to implement a projector in the OptiX environment. I have already taken a look at the OptiX examples from the SDK and the advanced samples.
Is there an approach for implementing a projector (direct lighting with image projection)?

Thanks for your answer.

What do you mean by “image projection”? Something we would call a “beamer” in Germany? So inside your scene, something like a virtual beamer which projects an image onto a virtual wall?

In the samples all rays are shot from the camera (that is called path tracing).

That is actually a pretty straightforward use of linear algebra. All of the below is dry coding; I haven’t tried it myself.

Assuming you have a rectangular texture which should be projected by a point light.
This light is a singular light (infinitely small) so it’s only usable in direct lighting.

Prerequisites for a projector light definition (see the struct sketch after this list):

  • A position of the light in world space.
  • A left-handed orthonormal basis which orients the projected texture in the world. (The orientation can also be expressed by a quaternion on the host.)
    Let’s name the vectors U, V, W and all three are normalized.
    W is the direction from the light position to the center of the rectangular texture.
    U and V span the infinite plane on which the texture lies.
    (Same layout as the pinhole camera in my OptiX introduction examples, but normalized vectors to simplify the following calculations.)
  • Because the orthonormal basis is used for convenience, you need two more positive float variables uMax and vMax which define the half-extents of the texture rectangle (the extents of its upper left quadrant) on the projection plane at distance 1.0f along W from the light position. With these you can scale the projection as you like. (That could also be done with the UV vectors, but let’s keep it simple.)
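
To make those prerequisites concrete, here is a minimal sketch of how such a projector light definition could look as a plain struct. The type and member names (ProjectorLight, cosCone, etc.) are my own placeholders, not from any SDK header; the struct just collects the values described above, plus the pre-calculated corner cosine used for the spot cone check further down.

#include <cuda_runtime.h> // float3, cudaTextureObject_t

// Hypothetical projector light definition; all names are placeholders.
struct ProjectorLight
{
  float3 position;   // light position in world space
  float3 U, V, W;    // normalized left-handed orthonormal basis;
                     // W points from the light to the center of the image
  float  uMax, vMax; // half-extents of the image rectangle on the projection
                     // plane at distance 1.0f along W
  float  cosCone;    // pre-calculated cosine of the angle between W and the
                     // direction to a corner of the rectangle, e.g.
                     // dot(W, normalize(W + uMax * U + vMax * V)),
                     // used for the early-exit spot cone check
  float3 intensity;  // light color times intensity
  cudaTextureObject_t texture; // the image to be projected
};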

Now sampling that projector light can be done this way (a device code sketch follows the list):

  • Calculate the vector from light position to surface point to be lit in world coordinates (same as for any positional light source).
  • Normalize that vector, let’s call it L_world.
  • Here you should already check for early exits to improve performance.
    For example you can do a spot light cone check: if the angle between W and L_world is bigger than the angle between W and the normalized world vector to the corner of the texture on the projection plane, the texture is not sampled and the light is black. That’s just a comparison of two cosine values from dot products; the first is calculated at runtime, while the one to the corner can be pre-calculated.
  • If the vector is inside the cone enclosing the texture rectangle, project that vector L_world into the UVW orthonormal basis of the light orientation (dot product with each of the UVW vectors).
    The resulting normalized vector L_local is the direction from light position to surface hit point in local light coordinates (where the light position is the origin).
  • Now calculate the intersection of the L_local vector on the projection plane at the tip of W which is at z == 1.0f in local light coordinates.
    That means: solve t * L_local.z = 1.0f <==> t = 1.0f / L_local.z
    L_local.z cannot be 0.0f and must be positive due to the previous early exit check with the spot cone.
  • Now calculate the intersection point on the projection plane:
    float u = t * L_local.x;
    float v = t * L_local.y;
  • If (fabsf(u) <= uMax && fabsf(v) <= vMax) then the intersection point is inside the texture rectangle area, else the light is black.
  • Calculate the normalized texture lookup coordinates
    u = (u / uMax) * 0.5f + 0.5f; // In range [0.0f, 1.0f]
    v = (v / vMax) * 0.5f + 0.5f; // In range [0.0f, 1.0f]
  • Fetch the texture color at that (u, v) coordinate and multiply with your light intensity.
  • Use that intensity with the proper squared distance attenuation and the L_world vector for the lighting calculations on the surface as needed.
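
Putting the steps above together, a device-side sampling function could look roughly like this. This is untested sketch code following the list, assuming OptiX 7 style CUDA code, the hypothetical ProjectorLight struct from the earlier sketch, and the float3 helpers (dot, length, operators, make_float3 overloads) from the SDK’s sutil/vec_math.h header; the function and parameter names are my own.

// Returns the projected light color with squared distance attenuation,
// or black if the surface point lies outside the projection.
__forceinline__ __device__ float3 sampleProjectorLight(const ProjectorLight& light,
                                                       const float3 surfacePoint,
                                                       float3* wi,       // out: normalized direction surface -> light
                                                       float*  distance) // out: e.g. for the shadow ray length
{
  // Vector from the light position to the surface point in world space.
  const float3 toSurface = surfacePoint - light.position;
  const float  dist      = length(toSurface);
  const float3 L_world   = toSurface / dist; // normalized

  *wi       = -L_world;
  *distance = dist;

  // Early exit: spot cone check against the pre-calculated corner cosine.
  const float cosTheta = dot(light.W, L_world);
  if (cosTheta < light.cosCone)
    return make_float3(0.0f);

  // Project L_world into the local UVW basis of the light (dot with each axis).
  const float3 L_local = make_float3(dot(light.U, L_world),
                                     dot(light.V, L_world),
                                     cosTheta); // == dot(light.W, L_world)

  // Intersect with the projection plane at z == 1.0f in local light coordinates.
  // L_local.z is positive here thanks to the cone check above.
  const float t = 1.0f / L_local.z;
  const float u = t * L_local.x;
  const float v = t * L_local.y;

  if (fabsf(u) > light.uMax || fabsf(v) > light.vMax)
    return make_float3(0.0f); // outside the texture rectangle

  // Normalized texture lookup coordinates in [0.0f, 1.0f].
  const float s = (u / light.uMax) * 0.5f + 0.5f;
  const float r = (v / light.vMax) * 0.5f + 0.5f;

  const float4 texel = tex2D<float4>(light.texture, s, r);

  // Projected texture color times light intensity with squared distance falloff.
  return light.intensity * make_float3(texel) * (1.0f / (dist * dist));
}

The returned color then goes into the usual direct lighting estimate on the surface, together with a visibility (shadow) ray along *wi up to *distance.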

That’s all.

Thanks a lot for your detailed answer.

Everything worked fine for me.