Alright, so I installed Arch Linux on my PC with an RX 570 and an RTX 3060. The monitor is plugged into the AMD card, which is used by default for most applications, but I want to use the RTX for games on Steam. So I want to offload the rendering of Steam apps to the Nvidia GPU while keeping the video output on the AMD card. I tried:
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia
The thing is, even with these variables set, all of the apps still run on the AMD card. I used nvtop to measure GPU usage, and on the Nvidia card it’s 0% and the memory usage is almost 0 too (6 MB).
So, is there any way to use one card for specific apps and the main one for everything else (browser, desktop, etc.)?
I can’t see anything wrong in the logs so it’s a bit odd it doesn’t work.
Please post the output of
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep vendor
to check for GLX availability.
Also, you can run
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxgears
then run nvidia-smi in another terminal window and check whether glxgears appears in the process list.
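The vendor check above can also be scripted. A minimal sketch, run here against an illustrative sample of the output (the vendor strings below are what a working offload setup typically prints, not captured from the machine in question):

```shell
# Illustrative sample of `glxinfo | grep vendor` when offload is active
# (assumed output, not taken from the poster's machine):
sample='server glx vendor string: NVIDIA Corporation
client glx vendor string: NVIDIA Corporation
OpenGL vendor string: NVIDIA Corporation'

# If the OpenGL vendor line still reports AMD/Mesa instead of NVIDIA,
# render offload is not active for that process.
if printf '%s\n' "$sample" | grep -q '^OpenGL vendor string: NVIDIA'; then
    status=ok
else
    status=not-active
fi
echo "$status"   # prints "ok" for the sample above
```

On a real system you would pipe the live glxinfo output into the same grep instead of the sample string.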
And when I use __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxgears, the process is visible in nvidia-smi, and in nvtop there is a spike from 0% to around 15% usage on the RTX, with 2 MB of memory used by the process.
The weird thing is that if I put __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia in front of %command% in Steam for, for example, CS:GO, the process is visible in nvidia-smi and takes the same amount of memory as glxgears, but this time there is no spike in usage and all the processing shows up on the Radeon. I tried a few different apps, but the result is the same in Steam.
EDIT: At this point I don’t know whether it’s an issue with the driver itself or with Steam.
I don’t think you can control those variables with games in Steam… Steam forks a process; it doesn’t spawn a command shell to do it, IIRC. So processes which take flags will receive them via “int main(arglist…)”, but env vars will go nowhere.
You can launch Steam itself with the env vars set. I also want to say that when I last experimented with this (about six months ago) there was an update pending for Steam to use render offload automatically in a smarter way.
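For example, exporting the variables before starting Steam, so every game process it forks inherits them (a sketch; the two variable names are the standard PRIME render-offload ones):

```shell
# Export the offload variables, then start Steam; forked game processes
# inherit the environment of the Steam process:
export __NV_PRIME_RENDER_OFFLOAD=1
export __GLX_VENDOR_LIBRARY_NAME=nvidia
# steam   # <- uncomment on a real system
echo "$__NV_PRIME_RENDER_OFFLOAD $__GLX_VENDOR_LIBRARY_NAME"   # prints "1 nvidia"
```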
I haven’t tried gamescope. The problem is quite serious: the latest working driver version is 515, which is not packaged in many distributions, since it is not a stable branch.
1. Install the necessary software: You need the Nvidia proprietary driver and PRIME render-offload support installed on your system. You can install them by following your Linux distribution’s documentation.
2. Set up PRIME offload: You need to configure PRIME to offload rendering to the Nvidia GPU. To do this, open a terminal and type the following command:
xrandr --listproviders
This command lists the available display providers on your system. Take note of the provider numbers of your Nvidia and AMD GPUs. Then type the following command to set the Nvidia GPU as the offload source:
xrandr --setprovideroffloadsink <nvidia_provider_number> <amd_provider_number>
Replace <nvidia_provider_number> and <amd_provider_number> with the provider numbers of your Nvidia and AMD GPUs, respectively.
3. Launch the application: To launch the application on the Nvidia GPU, use the prime-run command followed by the application’s command. For example:
prime-run steam
This will launch Steam on the Nvidia GPU while keeping the video output on the AMD card.
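For reference, prime-run is just a small wrapper around those same environment variables. A sketch of roughly what it does (the real script shipped in Arch’s nvidia-prime package also sets __NV_PRIME_RENDER_OFFLOAD_PROVIDER and a Vulkan layer variable):

```shell
# Minimal stand-in for prime-run: set the PRIME offload variables for one
# command only, instead of exporting them for the whole session.
prime_run() {
    __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia "$@"
}

# On a real system:  prime_run glxgears
# Here we just show that the wrapped command sees the variables:
prime_run sh -c 'echo "$__GLX_VENDOR_LIBRARY_NAME"'   # prints "nvidia"
```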