I am attempting to set up a remote server to run OpenGL on an Nvidia GeForce RTX 2080 SUPER, on Ubuntu 18.04.5 LTS. The machine has two Nvidia cards. My eventual goal is to run pybullet over ssh, but I think the issue blocking me right now is that OpenGL is running on the default Intel GPU rather than on the Nvidia GPUs.
nvidia-settings
yields
ERROR: Unable to load info from any available system
(nvidia-settings:3476): GLib-GObject-CRITICAL **: 23:49:17.890: g_object_unref: assertion ‘G_IS_OBJECT (object)’ failed
** Message: 23:49:17.894: PRIME: No offloading required. Abort
** Message: 23:49:17.894: PRIME: is it supported? no
glxinfo | grep render
yields
libGL error: No matching fbConfigs or visuals found
libGL error: failed to load driver: swrast
direct rendering: No (If you want to find out why, try setting LIBGL_DEBUG=verbose)
GLX_MESA_multithread_makecurrent, GLX_MESA_query_renderer,
OpenGL renderer string: Intel® Iris™ Plus Graphics 655
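(Side note, since the end goal is scripting over ssh: one quick way to assert from Python which GPU GLX actually landed on is to parse this renderer string. A minimal sketch, assuming glxinfo is on the PATH; the function name and keyword lists are my own, not from any library:)

```python
import re

def gpu_vendor(glxinfo_output: str) -> str:
    """Classify the active OpenGL renderer reported by `glxinfo`."""
    m = re.search(r"OpenGL renderer string:\s*(.+)", glxinfo_output)
    if m is None:
        return "unknown"
    renderer = m.group(1).lower()
    if any(k in renderer for k in ("nvidia", "geforce", "quadro")):
        return "nvidia"
    if any(k in renderer for k in ("intel", "iris")):
        return "intel"
    return "other"

# Usage over ssh (uncomment where glxinfo is available):
# import subprocess
# out = subprocess.run(["glxinfo"], capture_output=True, text=True).stdout
# print(gpu_vendor(out))
```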
glxgears
shows the gears but they are not rotating, and returns this error:
libGL error: No matching fbConfigs or visuals found
libGL error: failed to load driver: swrast
nvidia-smi
yields
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 455.38       Driver Version: 455.38       CUDA Version: 11.1     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  GeForce RTX 208…    Off  | 00000000:0B:00.0 Off |                  N/A |
| 18%   32C    P8    17W / 250W |     53MiB /  7979MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
|   1  GeForce RTX 208…    Off  | 00000000:0C:00.0 Off |                  N/A |
| 18%   30C    P8     3W / 250W |      5MiB /  7982MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|    0   N/A  N/A      1165      G   /usr/lib/xorg/Xorg                 51MiB |
|    1   N/A  N/A      1165      G   /usr/lib/xorg/Xorg                  4MiB |
+-----------------------------------------------------------------------------+
My best guess is that GLX is configured to use the Intel hardware instead of the Nvidia hardware, but I can’t figure out how to fix it. Do you have any suggestions?
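In case it helps with diagnosis, this is the kind of per-process check I understand can probe which GLX vendor is picked up (a sketch; the PRIME render offload variables assume driver-side offload support is active, which the PRIME messages above put in doubt):

```shell
# Try to force the Nvidia GLX vendor library for a single process
# (PRIME render offload, supported by recent Nvidia drivers such as 455):
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep "OpenGL renderer"

# Check which libGL / GLX vendor libraries the dynamic loader can see:
ldconfig -p | grep -iE "libgl\.so|libglx"
```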
Among many other things, I have tried reinstalling the driver with sudo apt install nvidia-driver-455, but the problem persisted.
I have also tried editing /usr/share/X11/xorg.conf.d/10-amdgpu.conf and /usr/share/X11/xorg.conf.d/10-nvidia.conf (based on this thread: nvidia-xconfig doesnt do what i want it to, nor does nvidia-settings - #91 by ikauvar), to no avail.
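For reference, my understanding is that a device section pinning Xorg to one of the Nvidia cards would look roughly like this (a sketch; the Identifier name is arbitrary, and the BusID is derived from the nvidia-smi output above, converting bus 0x0B to decimal 11 as Xorg expects):

```
Section "Device"
    Identifier "NvidiaCard0"
    Driver     "nvidia"
    VendorName "NVIDIA Corporation"
    BusID      "PCI:11:0:0"    # 00000000:0B:00.0 from nvidia-smi, hex 0B = 11
EndSection
```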
Thank you very much.
nvidia-bug-report.log.gz (423.4 KB)