nvidia-387.34 + glxinfo = Error of failed request: BadValue (integer parameter out of range for operation)

Ubuntu 17.10, nvidia-387.34

$ sudo glxinfo
name of display: kitt:10.0
X Error of failed request: BadValue (integer parameter out of range for operation)
Major opcode of failed request: 149 (GLX)
Minor opcode of failed request: 24 (X_GLXCreateNewContext)
Value in failed request: 0x0
Serial number of failed request: 17
Current serial number in output stream: 18

$ nvidia-smi
Thu Dec 14 09:52:51 2017
| NVIDIA-SMI 387.34 Driver Version: 387.34 |
| GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. |
| 0 TITAN Xp Off | 00000000:0A:00.0 On | N/A |
| 23% 28C P8 9W / 250W | 133MiB / 12188MiB | 0% Default |

| Processes: GPU Memory |
| GPU PID Type Process name Usage |
| 0 3982 G /usr/lib/xorg/Xorg 59MiB |
| 0 4032 G /usr/bin/gnome-shell 71MiB |

I read a few threads about adding ‘+iglx’ to Xorg startup options, but that doesn’t fix it.


I assume you're not running it over ssh, and that display :10 is correct?
Then use
ls -l /usr/lib/libGL*
to see if there are leftover libs from an earlier install or missing/broken symlinks.
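A sketch of that check (paths are typical Ubuntu locations and may differ on your system):

```shell
# Look for leftover libGL copies from an earlier install and for
# missing or broken symlinks. On a healthy driver install,
# libGL.so.1 should resolve to the NVIDIA-provided library.
ls -l /usr/lib/libGL* /usr/lib/x86_64-linux-gnu/libGL*

# Ask the dynamic linker which libGL glxinfo will actually load:
ldd "$(command -v glxinfo)" | grep libGL
```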
Post your Xorg.10.log to get a little more info.

I am running it over ssh with forwarding X. I am trying to X display a Unity simulation over ssh. It’s not a crime.

Maybe not a crime, but it's considered bad practice depending on where the OpenGL rendering should take place. The way you're doing it now needs indirect GLX, which is considered insecure and is therefore disabled by default. It's limited to something like OpenGL 1.4 anyway.
Look into VirtualGL or just something like vnc to have better results.

I thought +iglx would work but it doesn’t, why?

Though I think you're beating a dead horse because of:
Unity OpenGL requirements: Linux (OpenGL 3.2 to OpenGL 4.5)
Indirect GLX: OpenGL 1.4

So am I. Thanks.

Look into VirtualGL, I think that’s exactly what you want.
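A rough sketch of how that would work, assuming VirtualGL is installed on the server alongside a VNC server such as TurboVNC (the binary name below is a placeholder):

```shell
# Launch OpenGL applications through vglrun so rendering happens
# on the server's GPU instead of going over indirect GLX:
vglrun glxinfo | grep "OpenGL version"

# Same idea for the actual application (placeholder name):
vglrun ./MySimulation.x86_64
```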

You've got to realize, I'm just trying to X display a research simulation on my MBP from a server a few rooms away (which is headless, or rather has an HDMI headless dongle in it).

I have gotten away with X displaying most applications (since I don't do it a lot). I just feel NMX and VNC are overkill for this.

OK, the most lightweight and simple option would be x11vnc then.
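A minimal sketch, assuming an X session is already running on the server's display :0 (hostname and user are placeholders; you may need -auth depending on your setup):

```shell
# On the server: share the existing X display over VNC,
# bound to localhost only, for a single connection:
x11vnc -display :0 -localhost -once

# On the MBP: tunnel the VNC port over ssh, then point a
# VNC viewer at localhost:5900:
ssh -L 5900:localhost:5900 user@server
```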

I’m trying to visualise a pointcloud over ssh from a Jetson TX2 host. VNC and TeamViewer have been totally stable so far for this.

ssh -X nvidiare@

works perfectly for gedit files. But I’m getting the following error for PCD file visualisation
“X Error of failed request: BadValue (integer parameter out of range for operation)”

Any suggestions?

This particular BadValue error is generated when the X server has indirect GLX support disabled, which it does by default on recent builds.

You can reenable indirect GLX by passing the “+iglx” flag on the X server’s command line or by enabling the AllowIndirectGLXProtocol option in xorg.conf.
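For the command-line route, the flag has to reach the X server through the display manager. A sketch for LightDM, if that's what you're running (the drop-in filename is hypothetical):

```
# /etc/lightdm/lightdm.conf.d/50-iglx.conf
[Seat:*]
xserver-command=X +iglx
```

With GDM (the Ubuntu 17.10 default) the server command is configured differently, so check your display manager's documentation for where to add +iglx.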

Oh duh, scrolling up I see you tried +iglx. I’m not sure why that wouldn’t be working – perhaps it wasn’t getting passed through to the X server correctly?

Please give the xorg.conf option a try instead.
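For reference, the xorg.conf variant would look something like this (a sketch; the Screen section's Identifier must match your existing config, and option placement may vary by setup):

```
# Server-side toggle, equivalent to passing +iglx:
Section "ServerFlags"
    Option "IndirectGLX" "on"
EndSection

# NVIDIA driver option, per the driver README:
Section "Screen"
    Identifier "Screen0"
    Option "AllowIndirectGLXProtocol" "on"
EndSection
```

Restart the X server after editing for the change to take effect.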