I have a Gen 1 ThinkPad X1 Extreme, which has a GTX 1050 Ti and an Intel UHD 630. I have the proprietary NVIDIA drivers installed. Some things seem to work, but others don't. This is the output from nvidia-smi:
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 530.41.03              Driver Version: 530.41.03    CUDA Version: 12.1     |
|-----------------------------------------+----------------------+----------------------+
| GPU  Name                  Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf            Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                                         |                      |               MIG M. |
|=========================================+======================+======================|
|   0  NVIDIA GeForce GTX 1050 T...    Off| 00000000:01:00.0 Off |                  N/A |
| N/A   40C    P8               N/A /  N/A|      6MiB /  4096MiB |      0%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+

+---------------------------------------------------------------------------------------+
| Processes:                                                                            |
|  GPU   GI   CI        PID   Type   Process name                            GPU Memory |
|        ID   ID                                                             Usage      |
|=======================================================================================|
|    0   N/A  N/A      4029      G   /usr/lib/xorg/Xorg                            4MiB |
+---------------------------------------------------------------------------------------+
I can use the NVIDIA card by running glxinfo through prime-run, which shows that OpenGL offloading works:
ash@sparky:~$ prime-run glxinfo | head -n 10
name of display: :0
display: :0 screen: 0
direct rendering: Yes
server glx vendor string: NVIDIA Corporation
server glx version string: 1.4
server glx extensions:
GLX_ARB_context_flush_control, GLX_ARB_create_context,
GLX_ARB_create_context_no_error, GLX_ARB_create_context_profile,
GLX_ARB_create_context_robustness, GLX_ARB_fbconfig_float,
GLX_ARB_multisample, GLX_EXT_buffer_age,
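For context, as far as I understand, prime-run is just a thin wrapper script that sets NVIDIA's render-offload environment variables before launching the program, roughly like this (the exact script may differ per distro):

#!/bin/sh
# Ask the GLX and Vulkan loaders to offload rendering to the NVIDIA GPU
__NV_PRIME_RENDER_OFFLOAD=1 \
__GLX_VENDOR_LIBRARY_NAME=nvidia \
__VK_LAYER_NV_optimus=NVIDIA_only \
exec "$@"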
I can decode video in mpv with --hwdec=nvdec, so NVDEC hardware acceleration works fine.
vkcube also works without any problem on the NVIDIA GPU.
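For reference, these are the kinds of invocations that work (the file name is just a placeholder):

# NVDEC decode via mpv (file name is just an example)
mpv --hwdec=nvdec video.mkv
# Vulkan test offloaded to the NVIDIA GPU
prime-run vkcube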
The issue is that I cannot get VDPAU to work, and I can't figure out what I'm missing. vdpauinfo returns the following:
vdpauinfo
display: :0 screen: 0
Failed to open VDPAU backend libvdpau_va_gl.so: cannot open shared object file: No such file or directory
Error creating VDPAU device: 1
Note: I removed libvdpau_va_gl, because otherwise the VA-GL translation layer kicks in and VDPAU ends up running on the Intel GPU.
Am I missing something obvious here? Should vdpauinfo automatically detect the NVIDIA card?
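In case it's relevant: as far as I understand, libvdpau picks its backend from the X driver name unless it's overridden, so these are the checks I'd expect to matter (the library path is Debian-style and may differ on other distros):

# List the VDPAU backend drivers that are actually installed
ls /usr/lib/x86_64-linux-gnu/vdpau/
ldconfig -p | grep vdpau

# Force libvdpau to load the NVIDIA backend instead of autodetecting
VDPAU_DRIVER=nvidia prime-run vdpauinfo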
Also I noticed that nvidia-settings looks like this:
nvidia-settings
(nvidia-settings:37109): GLib-GObject-CRITICAL **: 08:55:08.689: g_object_unref: assertion 'G_IS_OBJECT (object)' failed
** Message: 08:55:08.751: PRIME: Requires offloading
** Message: 08:55:08.751: PRIME: is it supported? yes
** Message: 08:55:08.784: PRIME: Usage: /usr/bin/prime-select nvidia|intel|on-demand|query
** Message: 08:55:08.784: PRIME: on-demand mode: "1"
** Message: 08:55:08.784: PRIME: is "on-demand" mode supported? yes
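For completeness, prime-select confirms the current profile (matching the on-demand mode shown in the log above):

# Reports the current PRIME profile; "on-demand" in my case
prime-select query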
This is what the GUI looks like (screenshot attached): just a summary of the card.
Whereas if I switch the computer to discrete graphics, I get all the usual NVIDIA settings pages I'd expect, not just the summary. Is it normal to only see the card summary when using PRIME?
So, as you can see, I'm very confused. Hardware acceleration and direct rendering both work, but VDPAU doesn't, so I can't use certain tools I want to use. Any help would be greatly appreciated.
Here is the bug report:
nvidia-bug-report.log (951.9 KB)