Debian trixie on Lenovo Legion 15ACH6H. AMD iGPU + NVIDIA RTX 3060: RTX fails with driver 545.29.02

Hello,
On my laptop, a Legion 15ACH6H with an AMD Ryzen 7 5800H CPU and an NVIDIA RTX 3060 GPU, 3D rendering on the dGPU fails with the 545.29.02 drivers.
It fails in the dual-GPU configuration; with the AMD GPU disabled, everything seems OK.

The fault depends on the application. With vkcube, I get:
Selected GPU 0: NVIDIA GeForce RTX 3060 Laptop GPU, type: DiscreteGpu
vkcube: ./cube/cube.c:1409: demo_prepare_buffers: Assertion `!err' failed.
[1] 12695 IOT instruction (core dumped) vkcube
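The `!err' assertion only says that some Vulkan call inside demo_prepare_buffers returned a non-success VkResult. To get more detail out of the loader I can run, assuming Debian’s vulkan-validationlayers package is installed:

# ask the Vulkan loader itself to log errors and warnings
VK_LOADER_DEBUG=error,warn vkcube
# or enable the Khronos validation layer, which usually names the exact failing call
VK_INSTANCE_LAYERS=VK_LAYER_KHRONOS_validation vkcube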

When launching glxinfo:
name of display: :0
X Error of failed request: BadAlloc (insufficient resources for operation)
Major opcode of failed request: 149 (GLX)
Minor opcode of failed request: 5 (X_GLXMakeCurrent)
Serial number of failed request: 0
Current serial number in output stream: 31
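For GL, the vendor library can also be forced per process; a quick probe of which GPU answers (using the render-offload variables the NVIDIA driver documents):

# print only the renderer string; the RTX 3060 should show up if offload works
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep "OpenGL renderer"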

nvidia-bug-report.log.gz (1.4 MB)
nvidia-bug-report-without-igpu.log.gz (1.5 MB)

I’ve attached two nvidia-bug-reports: one with both GPUs enabled, one with the iGPU disabled.

Slightly messy. Did you copy over the OS from another laptop with an Intel CPU? You are now running a Wayland session but were also running an Xorg session previously. For Xorg, please delete
/etc/X11/xorg.conf.d/20-displaylink.conf
/etc/X11/xorg.conf.d/nvidia.conf

There doesn’t seem to be an AMD Vulkan implementation for your iGPU installed; the iGPU will be the primary device when running Wayland, with the NVIDIA GPU accessible through offloading in Xwayland.
Does running
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia vkcube
work?
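If not, it may be worth checking which Vulkan ICDs the loader can actually see; a minimal check, using standard Debian package names:

# list the ICD manifests the Vulkan loader will pick up
ls /usr/share/vulkan/icd.d/ /etc/vulkan/icd.d/
# the AMD (RADV) Vulkan driver ships in mesa-vulkan-drivers on Debian
sudo apt install mesa-vulkan-drivers
# vulkaninfo (from vulkan-tools) should then list both GPUs
vulkaninfo --summary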

Hello,
Yes, it might be quite messy :)
I’ve been using Linux since kernel 1.1/1.3 (in 1994), so I’m used to doing some “tuning”.
The system was installed from scratch (on Debian 12) and then upgraded to Debian 13 (trixie), so it was not “copied” from an Intel system.

I’ve tried a lot of approaches to get the AMD iGPU working well alongside the NVIDIA dGPU, but the best solution I’ve found was to force Wayland instead of X.org.
With the 535 driver branch everything is fine, but with 545 I get the bug described above.
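As a stop-gap I could pin apt to the 535 branch; a minimal sketch, assuming 535 packages are still available in the configured repositories (the file name is arbitrary):

# keep all nvidia packages on the 535 branch
sudo tee /etc/apt/preferences.d/nvidia-535 <<'EOF'
Package: *nvidia*
Pin: version 535*
Pin-Priority: 1001
EOF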

I’ve removed the X.org config files listed and rebooted: no change. As said, it works fine with the iGPU disabled.
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia vkcube fails in exactly the same way.

Forcing vkcube onto the AMD iGPU (vkcube --gpu_number 1, as the NVIDIA GPU is number 0) works fine in the dual-GPU configuration.
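The GPU numbering matches what the loader reports, so it can be cross-checked (vulkaninfo comes from the vulkan-tools package; the index order is not guaranteed to be stable):

# list the enumerated devices; the order here is what --gpu_number selects from
vulkaninfo --summary | grep -i deviceName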

I’ve tried with “vkcube-wayland”, which forces the Wayland implementation of vkcube.
Same result, with a slightly different error:

__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia vkcube-wayland --gpu_number 0
Selected GPU 0: NVIDIA GeForce RTX 3060 Laptop GPU, type: DiscreteGpu
vkcube-wayland: ./cube/cube.c:4039: demo_init_vk_swapchain: Assertion `!err' failed.
[1]    9881 IOT instruction (core dumped)  __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia vkcube-wayland