Shadow program exception

I am adding a shadow effect to my existing OptiX renderer, but the program triggers the exception 0x3FC (stack overflow) and shows the bad_color on the screen. My closest_hit program calls the following function:

__device__ inline float3 phongModel(float3 point, float3 dir, float3 world_normal, float3 LightPosition)
{
    float3 result = make_float3(0.0f, 0.0f, 0.0f);
    float3 color  = make_float3(1.0f, 1.0f, 1.0f);

    // Ambient term.
    result += 0.2f * color;

    float3 L = normalize(LightPosition - point);
    float NdotL = dot(world_normal, L);
    if (NdotL > 0.0f) {
        // Cast a shadow ray towards the light.
        PerRayData_shadow shadow_prd;
        shadow_prd.attenuation = 1.0f;
        Ray shadow_ray(point, L, shadow_ray_type,
                       /*scene_epsilon*/ 1.e-4f, length(L));
        rtTrace(top_shadower, shadow_ray, shadow_prd);

        float light_attenuation = shadow_prd.attenuation;
        if (light_attenuation > 0.0f) {
            // Diffuse term.
            result += 0.6f * color * NdotL;

            // Specular term.
            float3 H = normalize(L - dir);
            float NdotH = dot(world_normal, H);
            if (NdotH > 0.0f)
                result += 0.2f * color * powf(NdotH, specular_exp);
        }
    }
    return result;
}

It works well without shadows, i.e. without the line “rtTrace(top_shadower, shadow_ray, shadow_prd);”. By printing out some messages, I found that the program never gets beyond this line. Can anyone give me a clue what the bug is?


My system configuration:
LinuxMint 17.1 (rebecca) 64bit.
NVIDIA driver version 346.46.
Nvidia GeForce GTX 760
OptiX 3.8.0
CUDA 7.0
C++ compiler: gcc 4.8.2 (Ubuntu 4.8.2-19ubuntu1)

You need to increase your stack size in order to handle the extra shadow ray. See the documentation for rtContextSetStackSize.

That works! Thank you very much!
A couple more questions:

  1. How large a stack size should I use? I am currently doubling it with the following code:
RTsize org_stack_size = m_context->getStackSize();
m_context->setStackSize(org_stack_size * 2);
  2. The lighting is noisy, with many black dots on the originally smooth surface. Is this a numerical error or a sampling issue?

Predicting the exact size you need is difficult because OptiX does a lot of code optimization internally. However, as long as you’ve got it at a working value, you’re fine.

It could be either. Try increasing your scene epsilon to check for numerical error. I also notice that your t_max is set to 1, because the length of a normalized vector is always 1. Depending on the size of your scene, a longer ray could reduce sampling error.

1.) You should keep your stack size as small as is sufficient for your needs. It consumes GPU memory, and smaller is often faster.

2.) That’s most likely shadow acne from self-intersections due to floating-point precision issues.
That means the shadow ray hits the same primitive it started from, so the surface shadows itself.
With a pinhole camera, the artifacts normally look like rings on flat surfaces. For fun, try setting scene_epsilon to 0.0f.

There are multiple solutions to overcome this. The simplest is to increase the scene_epsilon value, which is used as the ray’s “tmin” parameter to offset the start point away from the origin.

The word “scene” in that name is intentional because that value is very scene size dependent. You normally need to increase that value for scenes with bigger coordinate sizes and when you’re far away from the viewed objects.

If this happens with an area light geometry, make sure you’re not actually considering an area light hit as a shadow blocker. You will either need to shorten the shadow ray distance a little or ignore area light geometry hits inside the shadow ray’s any_hit program if you don’t require light source geometries themselves to throw shadows.
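For context, an OptiX 3.x shadow any_hit program along those lines might look like the sketch below. The is_light_geometry variable is a hypothetical per-material flag you would declare yourself, not an OptiX built-in:

```cpp
rtDeclareVariable(PerRayData_shadow, prd_shadow, rtPayload, );
rtDeclareVariable(int, is_light_geometry, , );  // hypothetical per-material flag

RT_PROGRAM void any_hit_shadow()
{
    if (is_light_geometry) {
        // Don't let the area light's own geometry block the shadow ray.
        rtIgnoreIntersection();
    } else {
        // Opaque occluder: mark the point as shadowed and stop traversal.
        prd_shadow.attenuation = 0.0f;
        rtTerminateRay();
    }
}
```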

Edit: nljones is right. Take length(L) for the ray’s tmax value before you normalize L to calculate NdotL.

Self-intersection avoidance with primitive IDs would be another option, but that’s a little involved and needs some tricks to work reliably with instances. Change the scene_epsilon first. ;-)

You are both right. Increasing the scene_epsilon to 1.e-2f solves the problem. My triangle mesh comes from the marching cubes algorithm, so the triangles all have a size close to 1, and 1.e-4f might be too small.

Yes, the length(L) is definitely another bug. I fixed it before posting, but forgot to update the posted code.

Thank you both nljones and Detlef Roettger for the very quick and detailed reply!