I am running X11 on Debian Trixie/Sid (kernel 6.12.9) on my desktop with an RTX 3090 GPU and a very old AMD Ryzen Threadripper 1950X CPU.
lsb_release -a
No LSB modules are available.
Distributor ID: Debian
Description: Debian GNU/Linux trixie/sid
Release: n/a
Codename: trixie
uname -a
Linux debian 6.12.9-amd64 #1 SMP PREEMPT_DYNAMIC Debian 6.12.9-1 (2025-01-10) x86_64 GNU/Linux
echo $XDG_SESSION_TYPE
x11
I have installed Nvidia 565 drivers according to the following guides:
Nvidia driver installation: NVIDIA Driver Installation Guide
Nvidia cuda installation: 1. Introduction — Installation Guide for Linux 12.8 documentation
The gist of the installation process is:
sudo apt-get install -V nvidia-open
sudo apt-get install -V cuda-drivers
sudo apt-get install -V cuda-toolkit
My Nvidia drivers seem to be working: nvidia-smi prints normally, and here are my graphics settings:
inxi -Ga
Graphics:
Device-1: NVIDIA GA102 [GeForce RTX 3090] vendor: eVga.com. driver: nvidia
v: 565.57.01 alternate: nouveau,nvidia_drm non-free: 550.xx+ status: current
(as of 2024-09; EOL~2026-12-xx) arch: Ampere code: GAxxx
process: TSMC n7 (7nm) built: 2020-2023 pcie: gen: 1 speed: 2.5 GT/s
lanes: 8 link-max: gen: 4 speed: 16 GT/s lanes: 16 ports: active: none
off: HDMI-A-1 empty: DP-1,DP-2,DP-3 bus-ID: 42:00.0 chip-ID: 10de:2204
class-ID: 0300
Device-2: Logitech C922 Pro Stream Webcam driver: snd-usb-audio,uvcvideo
type: USB rev: 2.0 speed: 480 Mb/s lanes: 1 mode: 2.0 bus-ID: 3-4:3
chip-ID: 046d:085c class-ID: 0102 serial: DD741E8F
Display: x11 server: X.Org v: 21.1.15 with: Xwayland v: 24.1.4
compositor: gnome-shell v: 47.2 driver: X: loaded: fbdev,nouveau
unloaded: modesetting,vesa alternate: nv dri: swrast
gpu: nvidia,nvidia-nvswitch display-ID: :1 screens: 1
Screen-1: 0 s-res: 3840x2160 s-dpi: 96 s-size: 1016x572mm (40.00x22.52")
s-diag: 1166mm (45.9")
Monitor-1: HDMI-A-1 mapped: default note: disabled model: Dell S2817Q
serial: MTKT17AK960I built: 2017 res: 3840x2160 gamma: 1.2
diag: 708mm (27.9") ratio: 16:9 modes: max: 3840x2160 min: 640x480
API: EGL v: 1.5 hw: drv: nvidia platforms: device: 0 drv: nvidia device: 2
drv: swrast surfaceless: drv: nvidia x11: drv: swrast
inactive: gbm,wayland,device-1
API: OpenGL v: 4.6.0 compat-v: 4.5 vendor: mesa v: 24.3.3-1 glx-v: 1.4
direct-render: yes renderer: llvmpipe (LLVM 19.1.6 256 bits)
device-ID: ffffffff:ffffffff memory: 45.92 GiB unified: yes
The drivers also seem to provide hardware video encoding/decoding: ffmpeg lists the NVENC encoders, and accelerated decoding is confirmed by playing a video in VLC and watching nvtop:
ffmpeg -encoders 2>/dev/null | grep nvenc
V....D av1_nvenc NVIDIA NVENC av1 encoder (codec av1)
V....D h264_nvenc NVIDIA NVENC H.264 encoder (codec h264)
V....D hevc_nvenc NVIDIA NVENC hevc encoder (codec hevc)
One weird thing about my Nvidia installation: even though I only have one GPU, the drivers seem to be set up for Optimus/PRIME render offload, I think. At least, I can't launch Firefox on the Nvidia GPU without setting __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia; otherwise it defaults to software CPU rendering with the Mesa/LLVM drivers. My 3090 is the only GPU I am using, so ideally I want everything (including X11) to run on it, but I can't get that to work for some reason.
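A quick way to see which GLX client library actually gets used (a minimal sketch; it assumes the X11 and GL development headers are installed, and glx_vendor.c / glx_vendor are just placeholder names):

/* glx_vendor.c - print which GLX client vendor library is in use.
   Build: cc glx_vendor.c -o glx_vendor -lGL -lX11 */
#include <stdio.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) {
        fprintf(stderr, "cannot open display\n");
        return 1;
    }
    /* With libglvnd, __GLX_VENDOR_LIBRARY_NAME decides whether these
       strings come from Mesa or from NVIDIA's libGLX. */
    printf("GLX client vendor : %s\n", glXGetClientString(dpy, GLX_VENDOR));
    printf("GLX client version: %s\n", glXGetClientString(dpy, GLX_VERSION));
    XCloseDisplay(dpy);
    return 0;
}

Running it with and without the two environment variables should show the same Mesa-vs-NVIDIA switch that shows up in the inxi output below.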
Setting __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia also changes the inxi -Ga output: the OpenGL API is then provided by the Nvidia driver instead of Mesa/LLVM:
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia inxi -Ga
Graphics:
Device-1: NVIDIA GA102 [GeForce RTX 3090] vendor: eVga.com. driver: nvidia
v: 565.57.01 alternate: nouveau,nvidia_drm non-free: 550.xx+ status: current
(as of 2024-09; EOL~2026-12-xx) arch: Ampere code: GAxxx
process: TSMC n7 (7nm) built: 2020-2023 pcie: gen: 1 speed: 2.5 GT/s
lanes: 8 link-max: gen: 4 speed: 16 GT/s lanes: 16 ports: active: none
off: HDMI-A-1 empty: DP-1,DP-2,DP-3 bus-ID: 42:00.0 chip-ID: 10de:2204
class-ID: 0300
Device-2: Logitech C922 Pro Stream Webcam driver: snd-usb-audio,uvcvideo
type: USB rev: 2.0 speed: 480 Mb/s lanes: 1 mode: 2.0 bus-ID: 3-4:3
chip-ID: 046d:085c class-ID: 0102 serial: DD741E8F
Display: x11 server: X.Org v: 21.1.15 with: Xwayland v: 24.1.4
compositor: gnome-shell v: 47.2 driver: X: loaded: fbdev,nouveau
unloaded: modesetting,vesa alternate: nv gpu: nvidia,nvidia-nvswitch
display-ID: :1 screens: 1
Screen-1: 0 s-res: 3840x2160 s-dpi: 96 s-size: 1016x572mm (40.00x22.52")
s-diag: 1166mm (45.9")
Monitor-1: HDMI-A-1 mapped: default note: disabled model: Dell S2817Q
serial: MTKT17AK960I built: 2017 res: 3840x2160 gamma: 1.2
diag: 708mm (27.9") ratio: 16:9 modes: max: 3840x2160 min: 640x480
API: EGL v: 1.5 hw: drv: nvidia platforms: device: 0 drv: nvidia device: 2
drv: swrast surfaceless: drv: nvidia x11: drv: swrast
inactive: gbm,wayland,device-1
API: OpenGL v: 4.6.0 compat-v: 4.5 vendor: nvidia mesa v: 565.57.01
glx-v: 1.4 direct-render: yes renderer: NVIDIA GeForce RTX 3090/PCIe/SSE2
memory: 23.44 GiB
Anyway, I was trying to get hardware-accelerated video decoding working in Firefox. I installed nvidia-vaapi-driver v0.0.13, but I am still having performance problems. Launching Firefox with the recommended about:config settings and some additional environment variables I found online gives an EGL error and a VA-API error:
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia __VK_LAYER_NV_optimus=NVIDIA_only __EGL_VENDOR_LIBRARY_FILENAMES=/usr/share/glvnd/egl_vendor.d/10_nvidia.json LIBVA_DRIVER_NAME=nvidia LIBVA_DEVICE=/dev/dri/renderD128 VDPAU_DRIVER=nvidia NVD_BACKEND=direct NVD_GPU="/dev/dri/renderD128" MOZ_ENABLE_WAYLAND=0 MOZ_DISABLE_RDD_SANDBOX=1 MOZ_DRM_DEVICE=/dev/dri/renderD128 NVD_LOG=1 firefox
[GFX1-]: glxtest: libEGL no display
[GFX1-]: vaapitest: ERROR
[GFX1-]: vaapitest: VA-API test failed: failed to open renderDeviceFD.
So I don't think VA-API is working correctly. When I watch a video in Firefox it is choppy, and while nvtop does show Firefox running on the Nvidia GPU, it shows no dec (accelerated video decoding) utilization, which it does when I watch a local video with VLC. Hardware decoding was the entire purpose of installing nvidia-vaapi-driver.
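(For reference, the about:config settings I mean are the ones recommended in the nvidia-vaapi-driver README — if I remember correctly, chiefly media.ffmpeg.vaapi.enabled = true and media.rdd-ffmpeg.enabled = true — so the Firefox pref side should already be covered. The failure below happens in libva itself, independent of Firefox.)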
So I installed vainfo and tried to debug the VA-API error, but vainfo doesn't work either. It picks up nvidia-vaapi-driver, which tries to open a file and then fails to initialize:
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia NVD_LOG=1 LIBVA_DRIVER_NAME=nvidia VDPAU_DRIVER=nvidia vainfo
Trying display: wayland
Trying display: x11
libva info: VA-API version 1.22.0
libva error: vaGetDriverNames() failed with unknown libva error
libva info: User environment variable requested driver 'nvidia'
libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri/nvidia_drv_video.so
libva info: Found init function __vaDriverInit_1_0
45673.974193197 [68826-68826] ../src/vabackend.c:2187 __vaDriverInit_1_0 Initialising NVIDIA VA-API Driver: 10
45673.974200250 [68826-68826] ../src/vabackend.c:2196 __vaDriverInit_1_0 Now have 0 (0 max) instances
45673.974210940 [68826-68826] ../src/vabackend.c:2222 __vaDriverInit_1_0 Selecting Direct backend
45674.006006810 [68826-68826] ../src/direct/direct-export-buf.c: 68 direct_initExporter Searching for GPU: 0 0 128
45674.006151956 [68826-68826] ../src/direct/direct-export-buf.c: 72 direct_initExporter Unable to find NVIDIA GPU 0
45674.006161053 [68826-68826] ../src/vabackend.c:2247 __vaDriverInit_1_0 Exporter failed
libva error: /usr/lib/x86_64-linux-gnu/dri/nvidia_drv_video.so init failed
libva info: va_openDriver() returns 1
vaInitialize failed with error code 1 (operation failed),exit
I modified the source code of direct-export-buf.c:72 (which belongs to the nvidia-vaapi-driver project) to dig further, and can confirm a few things:
- It tries to open /dev/dri/renderD128, which is the correct path, and the file does exist.
- stat() on it succeeds and returns: st_dev=6, st_ino=944, st_mode=8624, st_uid=0, st_gid=105, st_rdev=57984.
- When open() fails with fd == -1, errno is 22 (EINVAL, Invalid argument).
- Note that this is not a permission issue. I confirmed with getfacl that my user has permission to read and write these files, and if it were a permission issue, errno would be EACCES (Permission denied) instead.
Here are the file properties of /dev/dri/renderD128:
ls -last /dev/dri/renderD128
0 crw-rw----+ 1 root render 226, 128 Jan 14 04:39 /dev/dri/renderD128
I searched online for the Invalid argument error and found this link: c - Possible reasons of linux open call returning EINVAL - Stack Overflow
I believe what is happening is that the /dev/dri/renderD128 device node is created with some special, nonstandard behavior, and the glibc shipped with Debian 13 (kernel 6.12.9-amd64) is not able to open() it. So even though the /dev/dri/* character device files exist and can be stat()ed, they cannot be opened via the C function open().
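To make this reproducible outside the driver, here is roughly the kind of minimal test I used (a sketch reconstructed after the fact, not the actual patch to direct-export-buf.c; the O_RDWR | O_CLOEXEC flags are an assumption about what the driver passes, and open_test.c is just a placeholder name):

/* open_test.c - stat() and open() a DRM node and report errno.
   Build: cc open_test.c -o open_test */
#include <errno.h>
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/stat.h>
#include <sys/sysmacros.h>   /* major(), minor() */
#include <unistd.h>

int main(int argc, char **argv)
{
    /* Defaults to the render node; can also be pointed at /dev/dri/card0. */
    const char *path = (argc > 1) ? argv[1] : "/dev/dri/renderD128";
    struct stat st;

    if (stat(path, &st) != 0) {
        fprintf(stderr, "stat(%s): %s\n", path, strerror(errno));
        return 1;
    }
    /* st_mode 8624 decimal == 020660 octal: a character device, mode 0660.
       st_rdev 57984 == major 226, minor 128, matching the ls output above. */
    printf("%s: mode %o, rdev %u:%u, uid %u, gid %u\n",
           path, st.st_mode, major(st.st_rdev), minor(st.st_rdev),
           st.st_uid, st.st_gid);

    int fd = open(path, O_RDWR | O_CLOEXEC);
    if (fd < 0) {
        fprintf(stderr, "open(%s): errno %d (%s)\n", path, errno, strerror(errno));
        return 1;
    }
    printf("open(%s) succeeded, fd = %d\n", path, fd);
    close(fd);
    return 0;
}

The stat() part shows a perfectly ordinary character device (major 226, minor 128, mode 0660), while open() is the call that comes back with errno 22 in the driver. The same test can be pointed at /dev/dri/card0, which is the node systemd-logind fails to take in the Xorg log below.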
Note that I also couldn't get X11 to start on the Nvidia driver and GPU (it is currently running on my CPU, which is slow). Here is the Xorg log:
cat /var/log/Xorg.0.log
[ 11.084] (--) Log file renamed from "/var/log/Xorg.pid-1538.log" to "/var/log/Xorg.0.log"
[ 11.085]
X.Org X Server 1.21.1.15
X Protocol Version 11, Revision 0
[ 11.085] Current Operating System: Linux debian 6.12.9-amd64 #1 SMP PREEMPT_DYNAMIC Debian 6.12.9-1 (2025-01-10) x86_64
[ 11.085] Kernel command line: BOOT_IMAGE=/boot/vmlinuz-6.12.9-amd64 root=UUID=b426f26c-1c7f-4d2a-a22e-ebf8b5d70339 ro nvidia-drm.modeset=1 pcie_aspm=off pcie_port_pm=off quiet
[ 11.085] xorg-server 2:21.1.15-2 (https://www.debian.org/support)
[ 11.085] Current version of pixman: 0.44.0
[ 11.085] Before reporting problems, check http://wiki.x.org
to make sure that you have the latest version.
[ 11.085] Markers: (--) probed, (**) from config file, (==) default setting,
(++) from command line, (!!) notice, (II) informational,
(WW) warning, (EE) error, (NI) not implemented, (??) unknown.
[ 11.085] (==) Log file: "/var/log/Xorg.0.log", Time: Tue Jan 14 00:42:51 2025
[ 11.085] (==) Using config file: "/etc/X11/xorg.conf"
[ 11.085] (==) Using config directory: "/etc/X11/xorg.conf.d"
[ 11.085] (==) Using system config directory "/usr/share/X11/xorg.conf.d"
[ 11.085] (==) ServerLayout "layout"
[ 11.085] (**) |-->Screen "nvidia" (0)
[ 11.085] (**) | |-->Monitor "<default monitor>"
[ 11.086] (**) | |-->Device "nvidia"
[ 11.086] (==) No monitor specified for screen "nvidia".
Using a default monitor configuration.
[ 11.086] (**) Allowing byte-swapped clients
[ 11.086] (==) Automatically adding devices
[ 11.086] (==) Automatically enabling devices
[ 11.086] (==) Automatically adding GPU devices
[ 11.086] (==) Automatically binding GPU devices
[ 11.086] (==) Max clients allowed: 256, resource mask: 0x1fffff
[ 11.086] (WW) The directory "/usr/share/fonts/X11/cyrillic" does not exist.
[ 11.086] Entry deleted from font path.
[ 11.086] (==) FontPath set to:
/usr/share/fonts/X11/misc,
/usr/share/fonts/X11/100dpi/:unscaled,
/usr/share/fonts/X11/75dpi/:unscaled,
/usr/share/fonts/X11/Type1,
/usr/share/fonts/X11/100dpi,
/usr/share/fonts/X11/75dpi,
built-ins
[ 11.086] (==) ModulePath set to "/usr/lib/xorg/modules"
[ 11.086] (II) The server relies on udev to provide the list of input devices.
If no devices become available, reconfigure udev or disable AutoAddDevices.
[ 11.086] (II) Loader magic: 0x557cfcca2f20
[ 11.086] (II) Module ABI versions:
[ 11.086] X.Org ANSI C Emulation: 0.4
[ 11.086] X.Org Video Driver: 25.2
[ 11.086] X.Org XInput driver : 24.4
[ 11.086] X.Org Server Extension : 10.0
[ 11.087] (++) using VT number 1
[ 11.088] (II) systemd-logind: took control of session /org/freedesktop/login1/session/c6
[ 11.090] (II) xfree86: Adding drm device (/dev/dri/card0)
[ 11.090] (II) Platform probe for /sys/devices/pci0000:40/0000:40:01.3/0000:42:00.0/drm/card0
[ 11.091] (EE) systemd-logind: failed to take device /dev/dri/card0: Invalid argument
[ 11.102] (--) PCI:*(66@0:0:0) 10de:2204:3842:3987 rev 161, Mem @ 0x9e000000/16777216, 0x80000000/268435456, 0x90000000/33554432, I/O @ 0x00003000/128, BIOS @ 0x????????/131072
[ 11.102] (II) LoadModule: "glx"
[ 11.102] (II) Loading /usr/lib/xorg/modules/extensions/libglx.so
[ 11.103] (II) Module glx: vendor="X.Org Foundation"
[ 11.103] compiled for 1.21.1.15, module version = 1.0.0
[ 11.103] ABI class: X.Org Server Extension, version 10.0
[ 11.103] (II) LoadModule: "nvidia"
[ 11.103] (II) Loading /usr/lib/xorg/modules/drivers/nvidia_drv.so
[ 11.104] (II) Module nvidia: vendor="NVIDIA Corporation"
[ 11.104] compiled for 1.6.99.901, module version = 1.0.0
[ 11.104] Module class: X.Org Video Driver
[ 11.104] (II) NVIDIA dlloader X Driver 565.57.01 Thu Oct 10 12:05:50 UTC 2024
[ 11.104] (II) NVIDIA Unified Driver for all Supported NVIDIA GPUs
[ 11.104] (EE) No devices detected.
[ 11.104] (EE)
Fatal server error:
[ 11.104] (EE) no screens found(EE)
[ 11.104] (EE)
Please consult the The X.Org Foundation support
at http://wiki.x.org
for help.
[ 11.104] (EE) Please also check the log file at "/var/log/Xorg.0.log" for additional information.
[ 11.104] (EE)
[ 11.201] (EE) Server terminated with error (1). Closing log file.
So to sum up:
- /dev/dri/renderD128 and /dev/dri/card0 are both character device files (I'm assuming created by the Nvidia drivers).
- nvidia-vaapi-driver fails because it cannot open /dev/dri/renderD128 with C's open(), which returns an Invalid argument (EINVAL) error.
- X11 fails because it cannot open /dev/dri/card0, also returning an Invalid argument error.
- Based on the Stack Overflow link above, it seems open() would return EINVAL only if the character device files were created with some property that the standard C library does not support (possibly some async file feature). (Note: this is not a file permission issue, as confirmed by getfacl.)
As a result, some GPU applications work (I can watch VLC with GPU-accelerated decoding, and I can train neural networks via CUDA), but clients like X11 and nvidia-vaapi-driver fail because they cannot open the /dev/dri/* character device files.
I would like X11 to run entirely on my Nvidia GPU with the Nvidia drivers. I would also like nvidia-vaapi-driver working so that Firefox uses GPU-accelerated video decoding; right now videos are very choppy.
What is the best way to proceed with fixing these issues? Is there some other intended way to open these character device files with the 565 drivers, instead of C's standard open()? Perhaps through some Nvidia library?
nvidia-bug-report.log.gz (1.2 MB)