Prime-run: Cannot create window (X_GLXCreateContext)

Hello, I am trying to run a game called “Unnamed SDVX Clone” using prime-run ./usc-game, but I get the following error:

[22:34:19][Info] Starting task "Application Setup"
[22:34:19][Info] Version: 0.4.0
[22:34:19][Info] Git commit: 2021-03-29_cb259d0
[22:34:19][Info] The locale was changed from C to C
[22:34:19][Info] Starting task "Font library initialization"
[22:34:19][Info] Finished task "Font library initialization" in  25 ms
[22:34:19][Info] Starting task "Creating Window"
X Error of failed request:  BadValue (integer parameter out of range for operation)
  Major opcode of failed request:  151 (GLX)
  Minor opcode of failed request:  3 (X_GLXCreateContext)
  Value in failed request:  0x0
  Serial number of failed request:  88
  Current serial number in output stream:  89

uname -a: Linux gentbox 5.11.17-gentoo-x86_64 #8 SMP Sun Jun 6 14:50:17 EEST 2021 x86_64 AMD Ryzen 5 4600H with Radeon Graphics AuthenticAMD GNU/Linux
GPU: NVIDIA GeForce GTX 1650 Ti Mobile

I also attached nvidia-bug-report.log.gz
How can I fix this issue?

In Xorg.0.log, it doesn’t look like the X server is even attempting to load the NVIDIA driver, but I’m not sure why, since the right autoconfig files are there.

In Xorg.0.log.old, it does load the driver and it seems like it should be working.

In the failing configuration, what output do you get from xrandr --listproviders? It should show a provider named NVIDIA-G0 if the server loaded the correct PRIME configuration.

This is what I get:

Providers: number : 1
Provider 0: id: 0x54 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload crtcs: 4 outputs: 1 associated providers: 0 name:Unknown AMD Radeon GPU @ pci:0000:05:00.0

When I run lsmod | grep nv I get this:

$ lsmod | grep nv
nvidia_drm             57344  1
nvidia_modeset       1142784  1 nvidia_drm
nvidia              34459648  1 nvidia_modeset
drm_kms_helper        282624  2 amdgpu,nvidia_drm
drm                   663552  14 gpu_sched,drm_kms_helper,amdgpu,drm_ttm_helper,nvidia_drm,ttm
i2c_core              102400  10 i2c_designware_platform,videodev,i2c_hid,i2c_designware_core,drm_kms_helper,i2c_algo_bit,nvidia,amdgpu,i2c_piix4,drm
nvme                   49152  1
nvme_core             126976  2 nvme
t10_pi                 16384  1 nvme_core

Yeah, the X server isn’t loading the NVIDIA driver. Please try removing /etc/X11/xorg.conf and anything in /etc/X11/xorg.conf.d – the autoconfig files in /usr/share/X11 are supposed to take care of setting up a PRIME render offload configuration automatically.

If that still doesn’t help, please try removing the nvidia-drm.modeset=1 kernel parameter to see if that’s interfering with the autoconfig somehow.

I removed the configs and now I get this:

[23:01:59][Info] Starting task "Application Setup"
[23:01:59][Info] Version: 0.4.0
[23:01:59][Info] Git commit: 2021-03-29_cb259d0
[23:01:59][Info] The locale was changed from C to C
[23:01:59][Info] Starting task "Font library initialization"
[23:01:59][Info] Finished task "Font library initialization" in  28 ms
[23:01:59][Info] Starting task "Creating Window"
[23:01:59][Warning] No joysticks found
[23:01:59][Info] Finished task "Creating Window" in  131 ms
[23:01:59][Info] Starting task "Audio Init"
[23:01:59][Info] Audio driver [0]: pulseaudio
[23:01:59][Info] Audio driver [1]: alsa
[23:01:59][Info] Audio driver [2]: disk
[23:01:59][Info] Audio driver [3]: dummy
[23:01:59][Info] Using audio driver: pulseaudio
[23:01:59][Info] Audio device [0]: Dummy Output
[23:01:59][Info] Finished task "Audio Init" in  3 ms
[23:01:59][Info] Starting task "GL Init"
[23:01:59][Error] Failed to create OpenGL context: Invalid window
[23:01:59][Error] Failed to create OpenGL context
[23:01:59][Info] Finished task "GL Init" in  0 ms
[23:01:59][Info] Finished task "Application Setup" in  168 ms
[23:01:59][Info] Starting task "Application Cleanup"
[23:01:59][Info] Finished task "Application Cleanup" in  88 ms

xrandr --listproviders now also shows the NVIDIA GPU.
Stepping through the game’s code, that “Invalid window” error is reported by SDL_GetError(), so the problem seems to be in the game itself, although the game does run on the integrated graphics.

Do more basic apps such as glxgears work with prime-run in this configuration? If so, it does seem like the app is doing something weird.

It does. I’ll try a git pull; maybe they fixed it.

Is that the game’s source repository? I can try to find some time to give this a try and see if I can tell whether the driver or the game is doing something wrong.

Yes, it is.

Thanks for taking time out of your day to help me with this!

I spent some time stepping through this and the problem seems to be in the glXChooseVisual request that SDL makes here:

(gdb) bt
#0  glXChooseVisual (dpy=0x555556176f00, screen=0, attrib_list=0x7fffffffdbc0)
#1  0x00007ffff7f1259c in X11_GL_GetVisual (_this=0x555556176110, display=0x555556176f00, screen=0) at /home/aaron/git/SDL/src/video/x11/SDL_x11opengl.c:610
#2  0x00007ffff7f11779 in X11_GL_InitExtensions (_this=0x555556176110) at /home/aaron/git/SDL/src/video/x11/SDL_x11opengl.c:346
#3  0x00007ffff7f11425 in X11_GL_LoadLibrary (_this=0x555556176110, path=0x7ffff7f790a2 "") at /home/aaron/git/SDL/src/video/x11/SDL_x11opengl.c:238
#4  0x00007ffff7e06e34 in SDL_GL_LoadLibrary_REAL (path=0x0) at /home/aaron/git/SDL/src/video/SDL_video.c:3086
#5  0x00007ffff7e019c2 in SDL_CreateWindow_REAL (title=0x7fffffffdf20 "USC-Game", x=805240832, y=805240832, w=1280, h=720, flags=34) at /home/aaron/git/SDL/src/video/SDL_video.c:1500
#6  0x00007ffff7d11917 in SDL_CreateWindow (a=0x7fffffffdf20 "USC-Game", b=805240832, c=805240832, d=1280, e=720, f=34) at /home/aaron/git/SDL/src/dynapi/SDL_dynapi_procs.h:542
#7  0x00005555558bd73f in Graphics::Window_Impl::Window_Impl (this=0x5555561715f0, outer=..., size=..., sampleCount=2 '\002') at ../Graphics/src/Window.cpp:78
#8  0x00005555558bc49d in Graphics::Window::Window (this=0x555556170e90, size=..., samplecount=2 '\002') at ../Graphics/src/Window.cpp:614
#9  0x00005555555957c2 in Application::m_Init (this=0x555556084170) at ../Main/src/Application.cpp:886
#10 0x00005555555916bb in Application::Run (this=0x555556084170) at ../Main/src/Application.cpp:117
#11 0x000055555572b554 in main (argc=1, argv=0x7fffffffe3c8) at ../Main/src/Main.cpp:23

In the attrib list for this particular call, it’s asking for a visual with multisample:

(gdb) x/17wd attrib_list
0x7fffffffdbc0:	4	8	3	9
0x7fffffffdbd0:	3	10	2	11
0x7fffffffdbe0:	8	5	12	16
0x7fffffffdbf0:	13	2	100001	2
0x7fffffffdc00:	0

(that decodes as RGBA, red >= 3, green >= 3, blue >= 2, alpha >= 8, double buffered, depth >= 16, stencil >= 2, samples >= 2)

Unfortunately, when running in render offload mode, the NVIDIA GLX driver has to piggyback on the X11 visuals provided by the host X screen, because it’s not able to inject its own visuals into a foreign X screen’s visual list. To make that more likely to work, the NVIDIA driver creates many GLXFBConfigs that share a single X visual, minimizing the number of visuals required from the host screen. In this particular case, the GLXFBConfigs with multisample share visuals with the ones without multisample, and the ones without multisample are the ones presented via the legacy GLX 1.2 glXChooseVisual function.

To make this code more reliable, SDL should use the GLX 1.3 glXChooseFBConfig function when it’s available.

So from what I understand, this isn’t a problem with the game or the driver themselves, but with SDL. Surely there’s a bug report for this. But this happens with OpenGL, right? What if Vulkan were used instead? Would it cause the same problem?

I don’t think Vulkan would run into this but I’m not familiar with how SDL sets up Vulkan.

I have a patch to SDL that makes it try glXChooseFBConfig if available and fall back to glXChooseVisual if it’s not, which I’ll get cleaned up and ready for review. I confirmed that it fixes the problem with Unnamed SDVX Clone.

(Side note: I hadn’t heard of SDVX before. This game is crazy!)

Nice work! You should make a pull request on GitHub when you can, or attach the .patch file so I can apply it myself. Thanks for everything!

Sorry for the delay. Pull request here: x11: Use glXChooseFBConfig when available in X11_GL_GetVisual by aaronp24 · Pull Request #4440 · libsdl-org/SDL · GitHub


Nice! I will compile your fork as soon as possible!

Did you ever get a chance to try this? It would be good to know if it doesn’t work before Icculus looks at it. :)