Nvidia+GLX+OpenGL vertex buffer object

Hello.

I would like to use VBOs (vertex buffer objects) in OpenGL in an X11/Linux context: the OpenGL program runs on a Linux PC without a graphics card ("PC-client"), which communicates over the network with another PC where the Xorg server is running with the GLX extension ("PC-server").

The "PC-server", responsible for drawing, has a recent NVIDIA card (OpenGL 4). "PC-client" sends its X11/GLX/OpenGL requests to that display thanks to an export DISPLAY="PC-server:0".
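For reference, here is a minimal sketch of the client side of that setup (not my actual program): it opens the display named in DISPLAY and asks the remote server which GLX version it advertises. The file name and compile line are just assumptions.

/* remote_glx.c -- minimal sketch, compile with: gcc remote_glx.c -lX11 -lGL */
#include <stdio.h>
#include <stdlib.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main(void)
{
    /* NULL means "use the DISPLAY environment variable", e.g. PC-server:0 */
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) {
        fprintf(stderr, "Cannot open display %s\n", getenv("DISPLAY"));
        return 1;
    }

    int major = 0, minor = 0;
    if (glXQueryVersion(dpy, &major, &minor))
        printf("GLX version reported by the server: %d.%d\n", major, minor);

    XCloseDisplay(dpy);
    return 0;
}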

According to the GLX 1.4 documentation on the OpenGL website, we cannot use more than OpenGL 1.3 with GLX 1.4. However, through extensions it becomes possible to use newer features (such as VBOs via the GL_ARB_vertex_buffer_object extension).
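To make the question concrete, this is roughly how I try to get at the extension (a sketch only, assuming a GLX context is already current; the function pointer typedefs come from GL/glext.h):

/* Check for GL_ARB_vertex_buffer_object and load its entry points
 * with glXGetProcAddressARB. Assumes a context is already current. */
#include <stdio.h>
#include <string.h>
#include <GL/glx.h>
#include <GL/glext.h>

static PFNGLGENBUFFERSARBPROC pglGenBuffersARB;
static PFNGLBINDBUFFERARBPROC pglBindBufferARB;
static PFNGLBUFFERDATAARBPROC pglBufferDataARB;

int load_vbo_extension(void)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    if (!ext || !strstr(ext, "GL_ARB_vertex_buffer_object")) {
        fprintf(stderr, "GL_ARB_vertex_buffer_object not advertised\n");
        return 0;
    }

    pglGenBuffersARB = (PFNGLGENBUFFERSARBPROC)
        glXGetProcAddressARB((const GLubyte *)"glGenBuffersARB");
    pglBindBufferARB = (PFNGLBINDBUFFERARBPROC)
        glXGetProcAddressARB((const GLubyte *)"glBindBufferARB");
    pglBufferDataARB = (PFNGLBUFFERDATAARBPROC)
        glXGetProcAddressARB((const GLubyte *)"glBufferDataARB");

    return pglGenBuffersARB && pglBindBufferARB && pglBufferDataARB;
}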

My problem is:
If my program runs on "PC-server" and the DRI extension is used, GL_ARB_vertex_buffer_object is found and can be used. But if my program runs on "PC-client" (or if I prevent OpenGL from using DRI on "PC-server"), this extension disappears.
Why? Is it due to a limitation of GLX? How can I use VBOs with a remote X11 server?
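For what it's worth, here is the kind of check I use to see whether the context is direct (DRI) or indirect, and whether the extension shows up (a sketch; dpy and ctx are assumed to come from the usual GLX setup):

/* Report direct vs. indirect rendering and whether the VBO extension
 * is advertised on the current context. */
#include <stdio.h>
#include <string.h>
#include <GL/glx.h>

void report_context(Display *dpy, GLXContext ctx)
{
    printf("Direct rendering: %s\n",
           glXIsDirect(dpy, ctx) ? "yes (DRI)" : "no (indirect)");
    printf("GL_VERSION:  %s\n", glGetString(GL_VERSION));
    printf("GL_RENDERER: %s\n", glGetString(GL_RENDERER));

    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    printf("VBO extension present: %s\n",
           (ext && strstr(ext, "GL_ARB_vertex_buffer_object")) ? "yes" : "no");
}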

Another question: on the NVIDIA website (http://developer.nvidia.com/nvidia-graphics-sdk-11), I found an OpenGL SDK, but only for Windows. Is there a version of the SDK for Linux? I wasn't able to find one.

Thank you in advance for your answers.

Hi,

I can’t help you with your problem, but I am really interested in your configuration.
So you can access the output of OpenGL/CUDA hardware that is on another computer through X? That is nice!
Can you still use CUDA/OpenGL interoperability (shared buffers) on the "PC-server", which is not the one displaying? Is it transparent, with everything working as if the display were on the same machine?

Thanks.

– pium