Hello all,
I’ve been working on a simple OptiX 7.2 ray tracer with support for two types of primitives. I’m running into a strange shading bug where my sphere primitives are shaded differently depending on where the camera is positioned. I suspected my sphere intersection program, but I’ve tried a few other intersection code samples with no luck.
I’m new to OptiX, so I’m posting to see if someone can spot something I’m missing or haven’t considered in how I compute the hit point from my custom intersection program.
The sphere is a reflective surface: the closest-hit program computes a reflected ray from the incoming ray direction and the calculated normal (a short sketch of that reflection step is included after the normal calculation below).
Here is how I’m calculating the hit point in the closest-hit program. The raygen program, which builds the primary rays from the camera position, was taken from a sample program.
const vec3f d = optixGetWorldRayDirection();
const vec3f e = optixGetWorldRayOrigin();
// Attribute 0 holds the hit t reported by my intersection program;
// __uint_as_float reinterprets the raw 32-bit attribute back into a float.
const float t = __uint_as_float(optixGetAttribute_0());
const vec3f hitPoint = e + d * t;
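For context, my intersection program passes the hit t to the closest-hit program through attribute 0, which is what the code above unpacks. Below is a simplified sketch of that side, not my exact code: the SphereData record layout is made up just to keep the sketch self-contained, I’m reusing the same vec3f helpers as in the snippets above, and I’m assuming no instance transforms, so world and object space coincide.

#include <optix.h>

// Hypothetical hit-group record layout, just for this sketch.
struct SphereData { vec3f center; float radius; };

extern "C" __global__ void __intersection__sphere()
{
    const SphereData& sphere = *(const SphereData*)optixGetSbtDataPointer();

    // No instance transforms in my scene, so world and object rays coincide.
    const vec3f o = optixGetWorldRayOrigin();
    const vec3f d = optixGetWorldRayDirection();

    // Standard ray/sphere quadratic: |o + t*d - center|^2 = radius^2
    const vec3f oc   = o - sphere.center;
    const float a    = dot(d, d);
    const float b    = 2.0f * dot(oc, d);
    const float c    = dot(oc, oc) - sphere.radius * sphere.radius;
    const float disc = b * b - 4.0f * a * c;
    if (disc < 0.0f)
        return;

    const float sq = sqrtf(disc);
    const float t0 = (-b - sq) / (2.0f * a);
    const float t1 = (-b + sq) / (2.0f * a);
    const float t  = (t0 > optixGetRayTmin()) ? t0 : t1;  // nearest valid root

    if (t > optixGetRayTmin() && t < optixGetRayTmax())
    {
        // Attributes are raw 32-bit values, so the hit t is packed with
        // __float_as_uint here and unpacked with __uint_as_float above.
        optixReportIntersection(t, 0 /*hitKind*/, __float_as_uint(t));
    }
}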
Here is how I’m calculating my sphere normal based upon the hitPoint:
vec3f norm = normalize(hitPoint - center);
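The reflected direction is then just the standard reflection formula. Roughly, this is what my closest-hit program does next with the d and norm values from above (the shading and the trace of the secondary ray are omitted here):

// Reflect the incoming direction about the surface normal:
// r = d - 2 * dot(d, n) * n   (norm is unit length after the normalize above)
const vec3f reflectedDir = d - 2.0f * dot(d, norm) * norm;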
Any thoughts at all would be greatly appreciated! This is my first experience with OptiX, so I’m hoping to get more comfortable with it over time. Let me know if there are any parts of the code I should elaborate on.
Thanks!