Vulkan not working (SOLVED)

Fedora 36 with Gnome 42.2, everything up to date
Nvidia’s latest proprietary drivers (515.57)
Nvidia Geforce GTX 745 and Intel Core i5-6400 (no integrated graphics)

The problem first occurred when launching Harry Potter and the Goblet of Fire through Lutris (flatpak). I get a terrible framerate, and launching Lutris (flatpak) from a terminal prints these errors:

libEGL warning: DRI3: Screen seems not DRI3 capable
libEGL warning: DRI2: failed to authenticate
libEGL warning: DRI3: Screen seems not DRI3 capable
libEGL warning: DRI2: failed to authenticate
Error: couldn't find RGB GLX visual or fbconfig
2022-07-09 12:29:21,250: Invalid glxinfo received

And after the "Startup complete" line (2022-07-09 12:29:21,644):

2022-07-09 12:31:05,686: Unable to load libGLX_nvidia.so.0
2022-07-09 12:31:05,686: Unable to locate libGLX_nvidia

As soon as I run the game, I get this warning while it continues to run at a terrible framerate:

    WARNING: lavapipe is not a conformant vulkan implementation, testing use only.

This makes it seem like my Vulkan drivers are messed up, so I tried to run OMORI through Lutris (RPM); it instantly throws these errors and refuses to start:

[Screenshot from 2022-07-09 12-23-30]
[Second screenshot]

Trying to launch anyway gives me the normal Gnome application crash message (“Oops! We’re sorry, it looks like OMORI crashed…”)

Here’s the output of glxinfo -B

    name of display: :0
    display: :0  screen: 0
    direct rendering: Yes
    Memory info (GL_NVX_gpu_memory_info):
        Dedicated video memory: 4096 MB
        Total available memory: 4096 MB
        Currently available dedicated video memory: 3883 MB
    OpenGL vendor string: NVIDIA Corporation
    OpenGL renderer string: NVIDIA GeForce GTX 745/PCIe/SSE2
    OpenGL core profile version string: 4.6.0 NVIDIA 515.57
    OpenGL core profile shading language version string: 4.60 NVIDIA
    OpenGL core profile context flags: (none)
    OpenGL core profile profile mask: core profile
    
    OpenGL version string: 4.6.0 NVIDIA 515.57
    OpenGL shading language version string: 4.60 NVIDIA
    OpenGL context flags: (none)
    OpenGL profile mask: (none)
    
    OpenGL ES profile version string: OpenGL ES 3.2 NVIDIA 515.57
    OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20

Here’s the output of vulkaninfo

    Cannot create Vulkan instance.
    This problem is often caused by a faulty installation of the Vulkan driver or attempting to use a GPU that does not support Vulkan.
    ERROR at /builddir/build/BUILD/Vulkan-Tools-sdk-1.3.204.0/vulkaninfo/vulkaninfo.h:649:vkCreateInstance failed with ERROR_INCOMPATIBLE_DRIVER

lspci -k shows that my GPU is using the “nvidia” driver like it’s supposed to.

I checked through /usr/share/vulkan/icd.d/ and /etc/vulkan/icd.d/ and both of them were empty. Could this be the problem? How could I get my graphics thingies working?

EDIT: SOLVED

Creating a new file at /usr/share/vulkan/icd.d/nvidia_icd.json with these contents fixed the issue:

{
    "file_format_version" : "1.0.0",
    "ICD": {
        "library_path": "libGLX_nvidia.so.0",
        "api_version" : "1.3.204"
    }
}

I pulled that API version straight from the vulkaninfo error text, and it worked.
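For anyone hitting the same problem, the manifest above can also be generated with a short script instead of typed by hand. This is just a sketch: the file name and target directory come from this thread, and the api_version here is the one from my vulkaninfo error text, which you should adjust to match your own driver.

```python
import json

# Vulkan ICD manifest that points the loader at the NVIDIA GLX driver library.
# "api_version" should match the Vulkan version your installed driver supports;
# 1.3.204 is the value taken from the vulkaninfo error text in this thread.
MANIFEST = {
    "file_format_version": "1.0.0",
    "ICD": {
        "library_path": "libGLX_nvidia.so.0",
        "api_version": "1.3.204",
    },
}


def write_manifest(path: str) -> None:
    """Write the ICD manifest to the given path as pretty-printed JSON."""
    with open(path, "w") as f:
        json.dump(MANIFEST, f, indent=4)


# Usage (needs root for the system directory):
# write_manifest("/usr/share/vulkan/icd.d/nvidia_icd.json")
```

After writing the file, rerunning vulkaninfo should pick up the NVIDIA ICD without any reboot.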


Looks like your NVIDIA drivers lacked something in the first place. I did nothing and vulkaninfo works just fine here:

$ vulkaninfo 
WARNING: lavapipe is not a conformant vulkan implementation, testing use only.
==========
VULKANINFO
==========

Vulkan Instance Version: 1.3.204


Instance Extensions: count = 19
===============================
	VK_EXT_acquire_drm_display             : extension revision 1
	VK_EXT_acquire_xlib_display            : extension revision 1
	VK_EXT_debug_report                    : extension revision 10
	VK_EXT_debug_utils                     : extension revision 2
	VK_EXT_direct_mode_display             : extension revision 1
	VK_EXT_display_surface_counter         : extension revision 1
	VK_KHR_device_group_creation           : extension revision 1
	VK_KHR_display                         : extension revision 23
	VK_KHR_external_fence_capabilities     : extension revision 1
	VK_KHR_external_memory_capabilities    : extension revision 1
	VK_KHR_external_semaphore_capabilities : extension revision 1
	VK_KHR_get_display_properties2         : extension revision 1
	VK_KHR_get_physical_device_properties2 : extension revision 2
	VK_KHR_get_surface_capabilities2       : extension revision 1
	VK_KHR_surface                         : extension revision 25
	VK_KHR_surface_protected_capabilities  : extension revision 1
	VK_KHR_wayland_surface                 : extension revision 6
	VK_KHR_xcb_surface                     : extension revision 6
	VK_KHR_xlib_surface                    : extension revision 6

Layers: count = 5
=================
VK_LAYER_KHRONOS_validation (Khronos Validation Layer) Vulkan version 1.3.204, layer version 1:
	Layer Extensions: count = 3
		VK_EXT_debug_report        : extension revision 9
		VK_EXT_debug_utils         : extension revision 1
		VK_EXT_validation_features : extension revision 2
	Devices: count = 2
		GPU id = 0 (llvmpipe (LLVM 14.0.0, 256 bits))
		Layer-Device Extensions: count = 3
			VK_EXT_debug_marker     : extension revision 4
			VK_EXT_tooling_info     : extension revision 1
			VK_EXT_validation_cache : extension revision 1

		GPU id = 1 (NVIDIA GeForce GTX 1660 Ti)
		Layer-Device Extensions: count = 3
			VK_EXT_debug_marker     : extension revision 4
			VK_EXT_tooling_info     : extension revision 1
			VK_EXT_validation_cache : extension revision 1

Yeah, indeed, you should have had this file:

$ cat /etc/vulkan/icd.d/nvidia_icd.json
{
    "file_format_version" : "1.0.0",
    "ICD": {
        "library_path": "libGLX_nvidia.so.0",
        "api_version" : "1.3.205"
    }
}

If you have this file and you still need one in /usr/share/vulkan/icd.d, it means the Flatpak doesn’t work correctly. That could warrant a bug report.

It did start working after I added that file with api_version 1.3.204, but after an update I had to change it to 1.3.205. I hope it auto-updates, or I might have to deal with this again in the future.
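If editing the version by hand after every driver update gets old, a small helper could patch the field in place. This is a hypothetical sketch, not part of any driver package: you still have to supply the new version yourself (e.g. from the vulkaninfo error text), the script only saves you from editing the JSON manually.

```python
import json


def bump_api_version(path: str, new_version: str) -> None:
    """Rewrite the api_version field of a Vulkan ICD manifest in place.

    path: manifest file, e.g. /usr/share/vulkan/icd.d/nvidia_icd.json
    new_version: Vulkan API version string, e.g. "1.3.205"
    """
    with open(path) as f:
        manifest = json.load(f)
    manifest["ICD"]["api_version"] = new_version
    with open(path, "w") as f:
        json.dump(manifest, f, indent=4)


# Usage (needs root for the system directory):
# bump_api_version("/usr/share/vulkan/icd.d/nvidia_icd.json", "1.3.205")
```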