I am a beginner in OptiX and I am a bit confused by the RTX support.
I have a computer running Windows 10 with an RTX 2080 Ti (driver 418.81), and I would like to make sure that the OptiX samples I compiled (with OptiX 6.0) take advantage of the RT Cores.
So, I modified the samples by adding the following code before the context is created:
int RTX = true; // try once with true and then with false to see the performance difference
if (rtGlobalSetAttribute(RT_GLOBAL_ATTRIBUTE_ENABLE_RTX, sizeof(RTX), &RTX) != RT_SUCCESS)
    printf("Error setting RTX mode.\n");
else
    printf("OptiX RTX execution mode is %s.\n", (RTX) ? "on" : "off");
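For context, here is a minimal sketch of how I have ordered the calls in the modified samples (assuming the OptiX 6.0 C API from optix.h, and assuming the attribute has to be set before the first context is created; variable names and the trimmed-down structure are just for illustration, not the actual sample code):

#include <optix.h>
#include <stdio.h>

int main()
{
    // Set the RTX execution mode attribute before any context exists.
    int rtx = 1; // flip to 0 for the comparison run
    if (rtGlobalSetAttribute(RT_GLOBAL_ATTRIBUTE_ENABLE_RTX, sizeof(rtx), &rtx) != RT_SUCCESS)
        printf("Error setting RTX mode.\n");
    else
        printf("OptiX RTX execution mode is %s.\n", rtx ? "on" : "off");

    // The context is only created after the attribute call.
    RTcontext context = 0;
    rtContextCreate(&context);

    /* ... the sample's scene setup, launch, and display loop ... */

    rtContextDestroy(context);
    return 0;
}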
For example, I ran the Path Tracer once with RTX = true and once with RTX = false. I expected to see a massive difference in performance, but both runs stay at ~16 fps (I raised the number of samples per pixel to 4) while slightly rotating the camera.
So my question is: if I disabled the RTX “support” in one case and got the same performance as in the other, does that mean the RT Cores are either always used or never used?
Can you clarify what is going on here?