OpenGL uses Mesa instead of NVIDIA on my remote machine

Hello everyone, sorry for bothering you. I know there are tons of posts like this, and I’ve tried almost everything, but it still just won’t work.

Because it’s a remote machine without a monitor, I use Xvfb as the virtual display server.
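Roughly how the virtual display is started (the screen geometry here is illustrative, not the exact command from my setup):

```shell
# Start a software-only virtual framebuffer X server on display :99
Xvfb :99 -screen 0 1280x1024x24 &
export DISPLAY=:99
# glxinfo against :99 is what reports llvmpipe below
glxinfo -B | grep -i renderer
```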

The issue is that glxinfo reports the Mesa implementation rather than NVIDIA’s.

(I’ve posted the text versions of these screenshots below if anyone needs them.)

Here’s the output of nvidia-smi

Here’s the output of inxi

Here’s the output of lshw -c display

Here’s the output of prime-select


Here’s my xorg.conf

Here are the GL-related libraries

More information that may help to track the issue

  1. I’ve used apt to uninstall many of the Mesa libraries.
  2. I first noticed the issue when trying to create an X11 context and found that it requires “swrast”.

Thanks for any advice!

nvidia-bug-report.log.gz (311.9 KB)

Here’s the text of my screen output if anyone needs it:
glxinfo -B:

name of display: :99
display: :99  screen: 0
direct rendering: Yes
Extended renderer info (GLX_MESA_query_renderer):
    Vendor: Mesa/ (0xffffffff)
    Device: llvmpipe (LLVM 13.0.1, 256 bits) (0xffffffff)
    Version: 22.0.5
    Accelerated: no
    Video memory: 31797MB
    Unified memory: no
    Preferred profile: core (0x1)
    Max core profile version: 4.5
    Max compat profile version: 4.5
    Max GLES1 profile version: 1.1
    Max GLES[23] profile version: 3.2
OpenGL vendor string: Mesa/
OpenGL renderer string: llvmpipe (LLVM 13.0.1, 256 bits)
OpenGL core profile version string: 4.5 (Core Profile) Mesa 22.0.5
OpenGL core profile shading language version string: 4.50
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile

OpenGL version string: 4.5 (Compatibility Profile) Mesa 22.0.5
OpenGL shading language version string: 4.50
OpenGL context flags: (none)
OpenGL profile mask: compatibility profile

OpenGL ES profile version string: OpenGL ES 3.2 Mesa 22.0.5
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20


Sat Nov 12 02:09:02 2022
| NVIDIA-SMI 515.65.07    Driver Version: 515.65.07    CUDA Version: 11.7     |
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|   0  NVIDIA GeForce ...  On   | 00000000:65:00.0 Off |                  N/A |
|  0%   38C    P8    33W / 350W |     19MiB / 24576MiB |      0%      Default |
|                               |                      |                  N/A |

| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|    0   N/A  N/A       781      G   /usr/lib/xorg/Xorg                  9MiB |
|    0   N/A  N/A       876      G   /usr/bin/gnome-shell                8MiB |

inxi -G

  Device-1: NVIDIA GA102 [GeForce RTX 3090] driver: nvidia v: 515.65.07
  Display: server: X.Org v: driver: X: loaded: nvidia
    unloaded: fbdev,modesetting,nouveau,vesa gpu: nvidia resolution: 640x480
  OpenGL: renderer: llvmpipe (LLVM 13.0.1 256 bits) v: 4.5 Mesa 22.0.5

lshw -c display

       description: VGA compatible controller
       product: GA102 [GeForce RTX 3090]
       vendor: NVIDIA Corporation
       physical id: 0
       bus info: pci@0000:65:00.0
       version: a1
       width: 64 bits
       clock: 33MHz
       capabilities: pm msi pciexpress vga_controller bus_master cap_list rom
       configuration: driver=nvidia latency=0
       resources: irq:67 memory:d7000000-d7ffffff memory:c0000000-cfffffff memory:d0000000-d1ffffff ioport:b000(size=128) memory:c0000-dffff
       product: EFI VGA
       physical id: 2
       logical name: /dev/fb0
       capabilities: fb
       configuration: depth=32 resolution=800,600


❯ sudo prime-select query
❯ sudo prime-select nvidia
Error: no integrated GPU detected.


# nvidia-xconfig: X configuration file generated by nvidia-xconfig
# nvidia-xconfig:  version 515.65.07

Section "ServerLayout"
    Identifier     "Layout0"
    Screen      0  "Screen0"
    InputDevice    "Keyboard0" "CoreKeyboard"
    InputDevice    "Mouse0" "CorePointer"
EndSection

Section "Files"
EndSection

Section "InputDevice"

    # generated from default
    Identifier     "Mouse0"
    Driver         "mouse"
    Option         "Protocol" "auto"
    Option         "Device" "/dev/psaux"
    Option         "Emulate3Buttons" "no"
    Option         "ZAxisMapping" "4 5"
EndSection

Section "InputDevice"

    # generated from default
    Identifier     "Keyboard0"
    Driver         "kbd"
EndSection

Section "Monitor"
    Identifier     "Monitor0"
    VendorName     "Unknown"
    ModelName      "Unknown"
    Option         "DPMS"
EndSection

Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "NVIDIA GeForce RTX 3090"
EndSection

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection
ldconfig -p | grep GL

(libc6,x86-64) => /lib/x86_64-linux-gnu/ (libc6) => /lib/i386-linux-gnu/ (libc6,x86-64) => /lib/x86_64-linux-gnu/ (libc6) => /lib/i386-linux-gnu/ (libc6,x86-64) => /lib/x86_64-linux-gnu/ (libc6) => /lib/i386-linux-gnu/ (libc6,x86-64) => /lib/x86_64-linux-gnu/ (libc6) => /lib/i386-linux-gnu/ (libc6,x86-64) => /lib/x86_64-linux-gnu/ (libc6) => /lib/i386-linux-gnu/ (libc6,x86-64) => /lib/x86_64-linux-gnu/ (libc6,x86-64) => /lib/x86_64-linux-gnu/ (libc6) => /lib/i386-linux-gnu/ (libc6,x86-64) => /lib/x86_64-linux-gnu/ (libc6) => /lib/i386-linux-gnu/ (libc6,x86-64) => /lib/x86_64-linux-gnu/ (libc6) => /lib/i386-linux-gnu/ (libc6,x86-64) => /lib/x86_64-linux-gnu/ (libc6) => /lib/i386-linux-gnu/ (libc6,x86-64) => /lib/x86_64-linux-gnu/ (libc6) => /lib/i386-linux-gnu/ (libc6,x86-64) => /lib/x86_64-linux-gnu/ (libc6) => /lib/i386-linux-gnu/ (libc6,x86-64) => /lib/x86_64-linux-gnu/ (libc6) => /lib/i386-linux-gnu/
Cache generated by: ldconfig (Ubuntu GLIBC 2.35-0ubuntu3.1) stable release version 2.35

Error when creating an X11 context:

libGL error: MESA-LOADER: failed to open swrast: /usr/lib/dri/ cannot open shared object file: No such file or directory (search paths /usr/lib/x86_64-linux-gnu/dri:\$${ORIGIN}/dri:/usr/lib/dri, suffix _dri)
libGL error: failed to load driver: swrast

You already said you’re connecting to a virtual X server, so it should be obvious you’re not using the NVIDIA GPU.
You need to either connect to the real X server using e.g. x11vnc/vnc0server, or use VirtualGL.
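With the VirtualGL route, the idea is to run the app inside the virtual/VNC session while its GL calls are redirected to the GPU-driven X server. A sketch, assuming VirtualGL is installed and the real (nvidia) X server is running on :0:

```shell
# Run a GL app through VirtualGL; -d names the 3D (GPU-backed) X display
vglrun -d :0 glxinfo -B | grep -i renderer
```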


Thanks for replying! I used to think that even though the screen/display is virtual, the values in the virtual framebuffer could still be updated by the GPU; the facts proved me wrong.
Now I’m wondering how headless rendering works in NVIDIA’s OpenGL implementation, because it seems that X11 can’t create a context without a real screen/display.
That may be beyond the scope of this issue, though.
Thanks again!
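For anyone else landing here: one headless approach is to let the nvidia X driver start with no monitor attached, instead of using Xvfb. A sketch, assuming the nvidia driver and an xorg.conf like the one above (not verified on this exact machine; NVIDIA’s EGL device platform is another, X-free, option):

```shell
# Tell the nvidia driver it may start without a connected monitor
sudo nvidia-xconfig --allow-empty-initial-configuration
# Start the real X server (GPU-backed, unlike Xvfb)
sudo X :0 &
sleep 3
# Apps pointed at :0 should then get the NVIDIA GLX implementation
DISPLAY=:0 glxinfo -B | grep -i renderer
```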
