No suggestions, other than a similar error on my end. I was curious about the performance of an OptiX code my co-worker has been working on, using the newer 313.09 drivers and OptiX 3.0 under 64-bit Ubuntu. Like you, my code has run fine with no issues on various other cards (GTX 285, GT 430, GT 440, GTX 680) and OptiX versions (2.1.1, 2.6).
Also, I have another NVIDIA GPU (GT 640) installed along with the K20c, which parallels your case.
With 313.09 and OptiX 3.0 installed, code built against the OptiX 2.6 SDK actually takes slightly longer with the GT 640/K20c combination than with the GT 640 alone:
GT 640 alone: ~1.34 sec
GT 640 + K20c: ~1.6 sec
That alone was a bit odd.
When I try the same code with the OptiX 3.0 SDK, whether I have the GT 640 alone or both the GT 640 and K20c installed, the error I get is:
OptiX Error: Invalid value (Details: Function "RTresult _rtContextCompile(RTcontext_api*)" caught exception: Error locating PTX Symbol: __constant828, [1049181])
with the line that causes the error being:
RT_CHECK_ERROR( rtContextCompile( context ) );
I also tested OptiX 3.0 with the 304.64 drivers and the same issue persists.
I also tried the same OptiX 3.0 w/ 304.64 driver configuration with the GTX 285, and it works fine; in fact, it comes out to a ~1.6x or ~3.7x speedup over the previous OptiX 2.6 version, depending on which version of the code I'm running.
Strangely enough, the problem isn’t present on all Kepler GPUs:
I tried the OptiX 3.0 w/ 304.64 drivers along with a GTX 680 (GK104), and that configuration is able to execute my code just fine. Based on all of that investigation, it seems that for me the culprit is that OptiX 3.0 does not like my GK107-based GT 640.
I do not have a test system with an integrated video card, so I cannot isolate whether this issue is also present when only the K20c is installed. If anyone over at NVIDIA can further debug this issue, it would be appreciated!
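One possible way to isolate the K20c without an integrated video card would be to restrict the OptiX context to a single device via the C API. Below is a minimal sketch, not tested on this configuration; it assumes the OptiX 3.0 SDK headers, and the device index used for the K20c is a placeholder that should be confirmed against the printed device list:

```c
/* Sketch: enumerate OptiX-visible devices, then restrict the context
 * to a single GPU (e.g. only the K20c) before compiling, so the
 * failing card can be ruled in or out without physically removing it.
 * Requires the OptiX 3.0 SDK (optix_host.h). */
#include <optix_host.h>
#include <stdio.h>

int main(void)
{
    RTcontext    context;
    unsigned int count = 0;
    int          i;

    /* List the devices OptiX can see, with their ordinals. */
    rtDeviceGetDeviceCount(&count);
    for (i = 0; i < (int)count; ++i) {
        char name[256];
        rtDeviceGetAttribute(i, RT_DEVICE_ATTRIBUTE_NAME,
                             sizeof(name), name);
        printf("OptiX device %d: %s\n", i, name);
    }

    rtContextCreate(&context);

    /* Placeholder index: replace with whichever ordinal printed
     * as "Tesla K20c" above. */
    int k20c_index = 0;
    rtContextSetDevices(context, 1, &k20c_index);

    /* ... set up programs/buffers, then rtContextCompile(context)
     *     as before, to see if the error still occurs K20c-only ... */

    rtContextDestroy(context);
    return 0;
}
```

If the PTX symbol error still appears with only the K20c selected, that would point away from the GT 640 as the sole trigger.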