Character models are sometimes invisible in The Witcher game

Original bug report is here:

It seems like a bug in the NVIDIA drivers, because the game renders fine with other graphics cards (Intel or AMD).

Here is an example screenshot:

I think all affected users are on the 331.38 driver. Did you test with 331.89?

I was almost sure the update wouldn’t solve this issue, but I installed version 331.89 anyway. The bug still occurs.

I’m having this on 343.13 + Wine (CSMT git) too, but over the years it has come and gone without a clear pattern. At times I thought it depended on the VC runtimes, but I have no idea; sometimes, seemingly by pure luck, it just works fine.

Users are saying it happens with Bumblebee and nvidia-prime. Can it also be reproduced with just a single NVIDIA GPU and driver, without Bumblebee or nvidia-prime? Please also attach an nvidia-bug-report log.

For me it happens on a single NVIDIA GPU, over the years on a GTS 250, a GTX 460 and now a GTX 660 Ti.
I launched the game about 40 times today without any pattern emerging.
Basically you run it and maybe you get lucky. If not, maybe you run it again with GLXOSD and then it works, or not. Oh, I have some texture replacements (DDS files), maybe that’s it: move them away and it works, this must be it. Restart the PC, start the game, and it’s broken again. Put the texture replacements back, start the game, and now it works fine. Restart the PC, and it doesn’t. Rinse and repeat.

Attached 2 log files:

  1. rebooted, ran the game with wine witcher.exe; game broken, generated report
  2. read the bug report for about 5 minutes, then started the game with WINEDEBUG=-all glxosd wine ./witcher.exe 2>&1 (suppress Wine debug output, use GLXOSD for an FPS display); game fixed, generated report
    [This file was removed because it was flagged as potentially malicious] (204 KB)
    nvidia-bug-report-w1-textureissuefixed.log.gz (204 KB)

Internally we are tracking this issue as bug 200040802. We will first try to duplicate it and investigate whether it is really an NVIDIA driver bug or not.

Thanks for the logs. I don’t have a standalone NVIDIA card; I can only use mine through Bumblebee/PRIME.

Is there any news about this bug?

I’ve been experiencing this bug in native Steam games too (not only the Witcher). I’ve tested both NVIDIA drivers 343.22 and 340.46; the regression occurred only on 343.22. I’ve also tested this on Ubuntu 14.04 and Manjaro 0.8.10.

Manjaro 0.8.10
Linux 3.14.21-1-MANJARO (x86_64)
Bumblebee 3.2.1
Intel HD 4000
Nvidia GT 730M

Tested using the primus backend.

Star Conflict



As you can see, only certain models are affected.

What are your reasons for believing that this is the same problem? Do both games use the same or a similar engine?
Please open a separate topic to discuss this specific issue.

Sorry for the confusion. I must have bumped an old thread by mistake. The engine used in this game is completely different; it’s an in-house engine. I’ll try to get more information about this bug and post a new topic once I’ve confirmed it with the developers.

Although we have managed to reproduce the problem internally a few times, we’re still struggling to find a reliable pattern of actions/configuration options to reproduce the problem easily. If you find out new information about how to more reliably reproduce the problem, please share it.

As mentioned in the Wine bug, it happens all the time for people using primus/optirun. The bug also occurs when running the application directly on the NVIDIA card (with a display attached directly to the card’s output), and with an nvidia-prime setup, but much more rarely in both cases.


You need a dual-GPU machine where the NVIDIA card renders through Bumblebee in order to trigger the bug every time. Otherwise it is hit or miss.

We have investigated this issue and found the cause of the problem.
An OpenGL vertex shader (generated by Wine based on a D3D shader in the Witcher) is created with the following uniform:
uniform vec4 vs_c[256];

The shader then dereferences vs_c[x], with “x” a dynamically computed value that sometimes exceeds 255. Reading outside the bounds of a uniform array in GLSL is undefined behavior, which means the implementation is allowed to do whatever it likes. On NVIDIA GPUs, the value returned will sometimes be 0.0, and sometimes something else.
Since this value is used to transform the bones of the characters (to animate them), an unpredictable value can result in some vertices landing offscreen and even entire pieces of geometry disappearing.

The problem is not an NVIDIA bug, since the application is asking the GPU to do something that is undefined in the specification. We have been working with Wine on the investigation of this issue, and have also communicated our findings to the developers of the game.
A patch on the Wine bug tracker appears to work around the issue.
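To illustrate the failure mode, here is a minimal GLSL sketch of the out-of-bounds pattern and a clamp-style workaround. This is a hypothetical shader written for this post, not the actual Wine-generated code; the index computation and the use of three consecutive rows per bone are simplified assumptions.

```glsl
#version 120

// Constant register file, as in the Wine-generated shader.
uniform vec4 vs_c[256];

attribute vec4 position;
attribute float boneId;   // hypothetical per-vertex bone selector

void main() {
    // Dynamically computed index: three vec4 rows per bone matrix.
    // If this ever exceeds 253, reading vs_c[idx + 2] is out of
    // bounds, and GLSL leaves the result undefined: the driver may
    // return 0.0, stale data, or anything else.
    int idx = int(boneId) * 3;

    // Workaround in the spirit of clamping the index: keep every
    // access within the declared bounds of the array (idx + 2 <= 255).
    idx = clamp(idx, 0, 253);

    vec4 row0 = vs_c[idx];
    vec4 row1 = vs_c[idx + 1];
    vec4 row2 = vs_c[idx + 2];

    // Apply the (assumed 3x4) bone transform to the vertex position.
    gl_Position = vec4(dot(position, row0),
                       dot(position, row1),
                       dot(position, row2),
                       1.0);
}
```

With an unclamped index the vertex can be transformed by garbage data, which matches the reported symptoms: bones collapse or fly offscreen, so whole character models appear invisible.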

Over a year has passed and the bug is still present with the Bumblebee driver, running the Witcher through Wine/PlayOnLinux in primusrun mode.

Is there any manual fix we can apply? With the Intel integrated card there is no issue; it only occurs when the NVIDIA card is in use.