And apparently with no working DSC?
It’s a Samsung RU8000 65”, which again has absolutely no problem outputting 4K 60Hz from my AMD mini PC.
This is kinda off topic, so it might be a good idea to move it elsewhere.
> And apparently with no working DSC?
> It’s a Samsung RU8000 65”, which again has absolutely no problem outputting 4K 60Hz from my AMD mini PC.
> This is kinda off topic, so it might be a good idea to move it elsewhere.
Mehhh, it’s a ghost town over here lol
Is “Input Signal Plus” enabled on the HDMI port?
I can check later, but I see no reason why that setting would change behavior based on GPU vendor.
That setting is what enables the full HDMI bandwidth of the TV. I’m not saying it’s the cure for your discrepancy, but since your TV is capable of 4K 60Hz 4:4:4 or 1440p 120Hz 4:4:4, you need to have that option enabled.
My theory is that if “Input Signal Plus” is disabled on the TV, your AMD mini PC is outputting 4K 60Hz 4:2:2, while the NVIDIA card is outputting 4K 30Hz 4:4:4. That makes sense to me.
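To put rough numbers on that theory, here’s a back-of-envelope sketch (my own illustration, not anything from Samsung’s or HDMI’s docs; it ignores blanking intervals and link-encoding overhead). The point is that 4K 60Hz 4:2:2 and 4K 30Hz 4:4:4 both need far less raw bandwidth than full 4K 60Hz 4:4:4, so either is a plausible fallback on a bandwidth-limited port:

```python
# Rough uncompressed video data-rate estimate (active pixels only,
# ignoring blanking intervals and TMDS/FRL encoding overhead).
def data_rate_gbps(width, height, fps, bits_per_channel=8, chroma="4:4:4"):
    # Effective bits per pixel for common chroma subsampling modes:
    # 4:4:4 keeps all three channels, 4:2:2 halves chroma horizontally,
    # 4:2:0 halves it in both directions.
    channels = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * fps * channels * bits_per_channel / 1e9

print(round(data_rate_gbps(3840, 2160, 60, 8, "4:4:4"), 1))  # ~11.9 Gbps
print(round(data_rate_gbps(3840, 2160, 60, 8, "4:2:2"), 1))  # ~8.0 Gbps
print(round(data_rate_gbps(3840, 2160, 30, 8, "4:4:4"), 1))  # ~6.0 Gbps
```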
Hi,
Tried comparing the 580 and 590 drivers on a 7800X3D + RTX 4080.
It looks like there’s some regression.
Sorry the screenshots aren’t 1:1, but you get the picture…
Alan Wake 2 — 1440p, DLSS Balanced, all settings max, path tracing set to Low.
580
590
Same for Black Myth: Wukong — 1440p, DLSS Quality, all settings max, path tracing set to High.
580
590
Interesting, I might try this to confirm.
I suggest you post this in the respective driver’s feedback thread: 590 release feedback & discussion
And best to make the feedback clear and quick to understand, something like:
Major performance regression on Alan Wake 2
580: 62 fps
590: 36 fps
Then include settings, attach screenshots, etc. Ideally, a bug report would be good to include, too.
NVIDIA people (and others) are trying to filter the useful info out of a mess of spam posts, so posting in the most clear and concise way will increase the odds of this getting picked up and fixed sooner. :)
Instead of fixing the bad performance they make it twice as bad. What a shitshow…
Just be glad it isn’t worse like GTK being broken for the third or fourth time now.
That is really bad lmao. I thought it would improve, even just by a little; well, I guess another bug got introduced instead lmao. Time to set up a Windows VM to play games, I guess.
>NVIDIA people (and others) are trying to filter the useful info out of a mess of spam posts, so posting in the most clear and concise way will increase the odds of this getting picked up and fixed sooner. :)
dude, I don’t get paid a single cent for this, yet I’m perfectly able to keep up with the bug reports. Have you not noticed the latest driver release is a “BETA”, yet we still have zero feedback from the devs?
Have you not noticed how roughly 80% of bug reports are basically ignored by them, and even when they finally give your issue a second of attention, you have to post reminders every few days to get any news about it (when they even bother replying)?
The fact that it took them a WHOLE year to simply answer this thread (which has constantly been at the very top of this forum) doesn’t ring a bell?
While I’m no less frustrated with NVIDIA’s attitude, sending a proper bug report in the proper thread, as @catt suggested, seems to be the only way to have a non-zero chance of it being noticed by NVIDIA engineers (as opposed to posting in this thread).
Is it exactly the same scene? Because GPU usage approximately reflects the gap between the libraries: 82% vs 98%, also a ~20% gap.
That said, the reason this is happening is already known and was detailed by Faith Ekstrand from Collabora at XDC2025. It will require a rewrite of Vulkan extensions. Separately, she is working on the NVK and Nova kernel drivers. So hopefully the gap will be filled at some point.
> Is it exactly the same scene? Because GPU usage approximately reflects the gap between the libraries: 82% vs 98%, also a ~20% gap.
Yes?
Embark seems to have majorly broken something in the last few updates, because FPS and GPU utilization absolutely tank whenever you play a real game, but these results are similar to my 4060’s percentage difference between the two APIs.
I’m not convinced DX11 performance in general is right, either.
> That said, the reason this is happening is already known and was detailed by Faith Ekstrand from Collabora at XDC2025. It will require a rewrite of Vulkan extensions. Separately, she is working on the NVK and Nova kernel drivers. So hopefully the gap will be filled at some point.
What will happen first, DX12 performance being fixed or the year of the Linux desktop?
People are talking at serious conferences about actively working on the new Vulkan spec, and they’ve announced specific estimates (March or April, if I recall), so it’s unlikely not to happen sometime next year.
I think you’re also too pessimistic about the Linux desktop: not because distros / desktop-environment projects are doing a good job (please kill me), but because at this point it seems like M$ is actively trying to decrease its OS’s market share, and not many people can afford Apple’s overpriced sh*t in this economy ;-]
Is it just me, or is performance already better on 590 vs 580? The release notes list “The NVIDIA 590 Linux driver has also improved the performance for re-creating Vulkan swapchains, which can become noticeable with stuttering when resizing Vulkan application windows that in turn is now resolved.” and I’m wondering if this is part of it. My games feel smoother, but I’d need to benchmark again to be sure it’s not placebo. That said, I’m not the first to report it.
What distro are you using?
Recreating swapchains is not something that happens regularly during rendering, only as a reaction to window changes (resizing, in some cases alt-tabbing, color space changes, etc.), so that specifically wouldn’t help with overall performance.
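To illustrate why recreation is an exceptional path rather than a per-frame cost, here’s a schematic sketch (plain Python standing in for the real Vulkan calls; `PresentResult` and `render_loop` are my own illustrative names) of the usual pattern, where a renderer only rebuilds its swapchain when a present call reports that the surface changed (the `VK_ERROR_OUT_OF_DATE_KHR` / `VK_SUBOPTIMAL_KHR` handling pattern):

```python
from enum import Enum, auto

class PresentResult(Enum):
    SUCCESS = auto()
    SUBOPTIMAL = auto()   # e.g. window resized, but the image is still usable
    OUT_OF_DATE = auto()  # surface changed; the swapchain must be rebuilt

def render_loop(present_results):
    """Count swapchain recreations for a stream of per-frame present
    results, mirroring a typical Vulkan renderer's present handling."""
    recreations = 0
    for result in present_results:
        if result in (PresentResult.OUT_OF_DATE, PresentResult.SUBOPTIMAL):
            recreations += 1  # rebuild the swapchain, then keep rendering
    return recreations

# Hundreds of normal frames, one resize event: recreation is rare, so
# speeding it up helps resize stutter, not steady-state FPS.
frames = [PresentResult.SUCCESS] * 300 + [PresentResult.OUT_OF_DATE]
print(render_loop(frames))  # 1
```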