PRIME render offload doesn't work at all with an AMD GPU

Alright, so I installed Arch Linux on my PC with an RX 570 and an RTX 3060. The monitor is plugged into the AMD card, which is used by default for most applications, but I want to use the RTX for games on Steam. So I want to offload the rendering of Steam apps to the Nvidia GPU, but keep the video output on the AMD card. I tried:

  • prime-run
  • __NV_PRIME_RENDER_OFFLOAD_PROVIDER=NVIDIA-G0 __GLX_VENDOR_LIBRARY_NAME=nvidia
  • __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia

The thing is, even with these options all of the apps still run on the AMD card. I used nvtop to measure GPU usage, and on the Nvidia card it’s 0% and the memory usage is almost 0 too (6 MB).

So, is there any way to use one card for specific apps and the main one for everything else (browser, desktop, etc.)?

nvidia-bug-report.log.gz (282.9 KB)

I can’t see anything wrong in the logs, so it’s a bit odd that it doesn’t work.
Please post the output of
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep vendor
to check for GLX availability.
Also, you can run
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxgears
and run nvidia-smi in another terminal window to check whether glxgears appears in the process list.
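
For example, something like this in two terminals (just a sketch; the one-second refresh interval is arbitrary):

# terminal 1: render glxgears on the Nvidia GPU
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxgears

# terminal 2: refresh nvidia-smi every second and look for glxgears in the process list
watch -n 1 nvidia-smi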

For __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep vendor it’s:

server glx vendor string: NVIDIA Corporation
client glx vendor string: NVIDIA Corporation

And when I use __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxgears, the process is visible in nvidia-smi, and in nvtop there is a spike from 0% to around 15% usage on the RTX, with about 2 MB of memory used by the process.
The weird thing is that if I use __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia with %command% for, for example, CS:GO in Steam, there is a visible process in nvidia-smi and it takes the same amount of memory as glxgears, but this time there is no spike in usage and all the processing shows up on the Radeon. I tried a few different apps, but the result is the same in Steam.

EDIT: At this point I don’t know whether it’s an issue with the driver itself or maybe with Steam.

I don’t think you can control those variables with games in Steam… Steam forks a process rather than spawning a command shell to do it, IIRC. So processes which take flags will receive them via “int main(arglist…)”, but env vars will go nowhere.

You can launch Steam itself with the env vars set. I also want to say that when I last experimented with this (about 6 months ago) there was an update pending for Steam to use render offload automatically in a smarter way.
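
Something like this, for example (a sketch; it assumes you start Steam from a terminal or can edit its launcher entry):

# launch Steam with the offload variables exported; the games it spawns inherit them
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia steam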

This works very well in Steam, setting
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia %command%
in the game’s launch options, like you already did. I have been using this ever since render offload started working, and I don’t think I’m the only one.
Maybe this is the same bug some users reported:
https://forums.developer.nvidia.com/t/performance-regression-on-prime-system-with-520-and-later-driver-series/241178
Did you try downgrading the driver?
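
If you want to test that on Arch, a rough sketch (assuming the older packages are still in your pacman cache; the exact file names will differ):

# downgrade the driver packages from the local package cache, then reboot
sudo pacman -U /var/cache/pacman/pkg/nvidia-utils-515*.pkg.tar.zst /var/cache/pacman/pkg/nvidia-515*.pkg.tar.zst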

I haven’t tried gamescope. The problem is quite serious. The latest working version is 515, which is not supported in many distributions, since it is not a stable branch.

  1. Install the necessary software: You need to have the Nvidia proprietary driver and the PRIME synchronization software installed on your system. You can install them by following the instructions provided by your Linux distribution’s documentation.
  2. Set up PRIME synchronization: You need to configure the PRIME synchronization to offload the rendering to the Nvidia GPU. To do this, open a terminal and type the following command:

xrandr --listproviders

This command will list the available display providers on your system. Take note of the provider numbers of your AMD and Nvidia GPUs. Then, type the following command to set the Nvidia GPU as the offload renderer:

xrandr --setprovideroffloadsink <nvidia_provider_number> <amd_provider_number>

Replace <nvidia_provider_number> and <amd_provider_number> with the provider numbers of your Nvidia and AMD GPUs, respectively.
  3. Launch the application: To launch the application on the Nvidia GPU, use the prime-run command followed by the application’s command. For example:

prime-run steam

This will launch Steam on the Nvidia GPU while keeping the video output on the AMD card.
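
Putting steps 2 and 3 together, a minimal sketch (the provider name NVIDIA-G0 and index 0 for the AMD provider are assumptions; use whatever xrandr --listproviders actually reports on your system):

# make the Nvidia GPU a render offload source for the AMD provider
xrandr --setprovideroffloadsink NVIDIA-G0 0
# then launch Steam offloaded to the Nvidia GPU
prime-run steam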

Proton uses Vulkan, as does the latest CS:GO. Perhaps the problem is there.
I advise you to set this globally:

VK_LOADER_DRIVERS_SELECT="nvidia*"

There is also a variable from Nvidia, __VK_LAYER_NV_optimus=NVIDIA_only

You can set it too, although the previous one is enough.
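
One way to set them globally is a line each in /etc/environment; for a single game, the same variables can instead go in front of %command% in the Steam launch options, for example (just a sketch, using the values above):

VK_LOADER_DRIVERS_SELECT="nvidia*" __VK_LAYER_NV_optimus=NVIDIA_only %command%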