createTextureSamplerFromGLImage exception - Linux

I’m running into an issue where creating a texture sampler from a GL_TEXTURE_2D (createTextureSamplerFromGLImage) is throwing an exception, but only on Linux.

terminate called after throwing an instance of 'optix::Exception'
  what():  Invalid value (Details: Function "RTresult _rtTextureSamplerCreateFromGLImage(RTcontext, unsigned int, RTgltarget, RTtexturesampler_api**)" detected error: Not a valid OpenGL texture)

I have tried using OptiX 5.0.0, as well as going back to 4.1.1 and 3.9.1, but all throw the same exception, including the pre-compiled sample included with the 3.9.1 SDK distribution. This happens on Ubuntu 16.04 and 17.10 with the 384.111, 387.23 and 390.12 driver versions. TextureSampler creation in general works as expected, and other GL interop (createBufferFromGLBO) also works fine, the only issue seems to be with createTextureSamplerFromGLImage.

Running the simplified code below reproduces the issue on Linux, but runs and exits cleanly on OSX (378.05.05.25f04 driver), and Windows 10 (390.65 driver).

Any insights would be greatly appreciated.

#include <optixu/optixpp.h>
#include <GLFW/glfw3.h>

int main() {

    // GLFW must be initialized and the GL context made current
    // before creating GL objects or the OptiX context.
    glfwInit();
    auto window = glfwCreateWindow(640, 480, "gl_texture", nullptr, nullptr);
    glfwMakeContextCurrent(window);

    auto context = optix::Context::create();
    GLuint texture_id;
    glGenTextures(1, &texture_id);
    glBindTexture(GL_TEXTURE_2D, texture_id);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 512, 512, 0, GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
    glBindTexture(GL_TEXTURE_2D, 0);

    auto sampler = context->createTextureSamplerFromGLImage(texture_id, RT_TARGET_GL_TEXTURE_2D);
    return 0;
}

I have the same problem on my 16.04 Ubuntu system with the 384.111 driver. Can anyone from NVIDIA help please?

Thanks for the report. I’ve filed a bug against OptiX to have this checked.
Could you please add information about the installed graphics boards in case that matters for the repro?

Thanks Detlef. My machine is a Gigabyte P57 laptop with the GeForce GTX 1060 graphics card.

I should have updated the thread earlier, but for me it turned out to be an issue with OptiX using the mesa libGL rather than the NVIDIA one. Even though my application links against the NVIDIA .so, and that is what is in the rpath, mesa on Ubuntu symlinks into /usr/lib/x86_64-linux-gnu, so OptiX (via dlopen, I'm assuming) later picks up the wrong library during the createTextureSamplerFromGLImage call. Explicitly setting LD_LIBRARY_PATH gets around the issue, but is less than ideal.
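For anyone hitting the same thing, this is roughly the check and workaround I used. The binary name `gl_texture` is a placeholder for your application, and the NVIDIA library directory below matches the 384 driver on Ubuntu; adjust both for your setup:

```shell
# Inspect which libGL the application resolves at load time;
# on an affected machine this points at the mesa symlink under
# /usr/lib/x86_64-linux-gnu instead of the NVIDIA driver.
# ("gl_texture" is a placeholder for your binary.)
ldd ./gl_texture | grep -i libgl || true

# Workaround: put the NVIDIA driver directory ahead of the mesa
# symlinks in the search path (directory name matches the 384
# driver packaging on Ubuntu).
export LD_LIBRARY_PATH=/usr/lib/nvidia-384:$LD_LIBRARY_PATH
echo "$LD_LIBRARY_PATH"
```

Note this only changes the search order for the current shell; it doesn't fix whatever path dlopen is resolving internally.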

Ok, so you consider this solved?

You mentioned in the initial post that other OpenGL interop functions like createBufferFromGLBO() worked, which shouldn't be the case if the underlying OpenGL implementation was the wrong one.

Confirmed, setting LD_LIBRARY_PATH to /usr/lib/nvidia-384 resolved this problem. I'm now hitting a different problem, so if I can't find a way around it I'll start another thread :)

I agree Detlef, and I wasn't able to sort out why the GLBO functions worked as expected while the GLImage call failed. Running strace -e openat on a simple test app, I could see that without the GLImage call the mesa library was never loaded, but with that call it was. So there still seems to be something spurious happening with the loading, but since I was able to work around it, I haven't revisited the root cause.
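For reference, this is the kind of trace I ran (the binary name `./gl_texture` is again a placeholder for the repro app):

```shell
# Trace file opens while the repro runs; -f follows any threads
# the driver spawns. Capture the libGL-related lines so they can
# be inspected afterwards. ("./gl_texture" is a placeholder.)
trace_output=$(strace -f -e trace=openat ./gl_texture 2>&1 | grep -i 'libgl')

# On an affected machine the trace shows a mesa libGL under
# /usr/lib/x86_64-linux-gnu being opened only when the
# createTextureSamplerFromGLImage call is in the code path.
echo "${trace_output:-no libGL opens traced}"
```

Diffing the trace of a build with and without the createTextureSamplerFromGLImage call is what made the extra library load visible.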