Wayland Support for nvidia-settings?

So please take this in the most genuinely positive way possible, because that’s how I mean it:

You have NO business owning a fanless RTX 4060. Honestly? You have no business buying an RTX 4060, period, especially in 2024, especially when we're already at the crossover point where over 50% of new AAA releases need more than 8GB of VRAM to be PLAYABLE at 1080p. But forget all that for a minute.

If you're invested in this whole thing because you're trying to upgrade your build, and that upgrade is to include a PASSIVELY COOLED RTX 4060, understand that no such product exists on the market; you could only obtain one through a custom shop or by doing it yourself. Which means you'll be paying 4070 Super prices in the streets for an RTX 3060 in the sheets, since a 4060 is basically a 3060, and a 4060 Ti is actually worse than a 3060 Ti. Every single hardware analyst and reviewer with any respect in the community regularly makes it a point to remind their audience that buying a ROG Strix 4070 is objectively clown behavior, because for a modest price increase you could buy a TUF Gaming 4070 Ti Super and get a full tier of better performance. These are also the same people who called the 4060 a waste of sand, a sobriquet usually reserved for the most embarrassing products of the decade.

So first, sort out your priorities and your grasp of the real world as it currently stands. Then, based on your honest plans for what you'll do most with this computer, decide which of the motherboard, CPU, RAM, PSU, case, cooler, and storage actually need upgrading (the motherboard and CPU are definitely not parts you can leave behind to age into decrepitude).

Then, take whatever games you need to play, and measure their performance at 1080p at the refresh rate you plan to use.

I mean, to be honest, your "upgrade" plans included a passively cooled 4060, so let's not kid ourselves: you're doing zero machine learning or any other AI tasks requiring CUDA/cuDNN/PyTorch, etc. By far the best choice for you is going to be a 7800X3D and a 7800 XT GPU (or a 7900 XT when it goes on fire sale).

Even if the bright red flashing sign screaming "THE 4060 IS OBJECTIVELY A BAD PRODUCT AND YOU SHOULDN'T BUY IT NO MATTER HOW YOU COOL IT" somehow managed to r/whoosh you completely, here I am coming out and saying it. And everyone else has said it too. Gamers Nexus were furious and called it a waste of sand, Hardware Unboxed said no one should buy it, and even LTT didn't bother having Linus read the review; Bell and James read it and flat out said, "we're just as tired of this shit as you guys are."

I don't know what GPU you have right now, but no matter what you upgrade to, you will not have a silent build. You can't. The only way to get one is to spend 7950X + 3090 prices on a 14600 and a 6650 XT, because all the money goes to a futile war against noise and you give up performance.

Before I go, lest you think I hate Nvidia or something: look who started this thread. Yeah, me, because I wanted to be able to fully control, in Wayland, the 3090 I'd camped outside Micro Center for 26 hours before launch day to get.

This sounds horrifying.

Wow, dead giveaway of what a fantastic cakewalk of a life you've had, if you think that's anything more than a mild annoyance.

Do you even know what the dude is talking about? What it actually entails?

CTRL+ALT+F4, log in, startx, and in the tiled window type gwe, hit Enter, then CTRL+ALT+F2 to get back (if you're on Plasma; I believe GNOME is CTRL+ALT+F7 or F1 or whatever they need to feel special this week).
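Spelled out, it amounts to something like this (a rough sketch; the VT numbers vary by distro, and gwe is GreenWithEnvy, which you'd need installed):

```bash
# Ctrl+Alt+F4 to drop to a free virtual terminal, log in there, then:
startx    # spin up a throwaway X session alongside your Wayland one
gwe &     # launch GreenWithEnvy in it to set fan curves / clocks
# Ctrl+Alt+F2 to jump back to Plasma (GNOME usually lives on F1 or F7)
```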

It is ACTUALLY impossible to measure any noticeable resource usage from doing this, let alone an actual loss in ANY category.

I used Wayland with my 3090 for about a month straight before going back to X (Wayland isn't even a complete display server, so I don't know why we're all acting like it is), and I had ZERO issues with thermals. I never even had to think about it, because of stuff like, y'know… scripting? Autostart? And this was well over a year ago, back when Nvidia couldn't even be argued to be approaching parity with AMD on Wayland.
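The "autostart" part, for the record, is about three lines (a sketch; it assumes GWE's --hide-window flag, which starts it minimized, and that you've set it to apply your profile on launch):

```bash
# one-time setup: XDG autostart entry so GWE launches minimized at login
mkdir -p ~/.config/autostart
cat > ~/.config/autostart/gwe.desktop <<'EOF'
[Desktop Entry]
Type=Application
Name=GreenWithEnvy
Exec=gwe --hide-window
EOF
```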

What WAS horrifying was buying a 5600 XT on launch morning in January 2020. When it arrived, I instantly threw it in and kicked myself as I realized neither Pop!_OS nor Ubuntu would boot: brand-new AMD hardware plus static distros means it wouldn't have mattered if god had cured Canonical's god complex and AMD's compulsion to shoot themselves in the foot every time they have a chance to take GPU dominance. But oh well, I run Arch anyway and have for 6 years, so I booted that up, and thank god I was more of a tinkerer back then, because I had to use TKG's repos for mesa-git, a kernel forced to the latest release candidate at all times, several -git and master builds of other packages, plus manually copying the current upstream repository of the files that make up the linux-firmware package.

Then Doom Eternal came out about a month and a half later, and I was ecstatic. I sat in the Proton GitHub thread, and nothing. Nvidia users could play the game better than on Windows (a fact still true today: DE outruns Windows on Linux when using NV GPUs). On AMD there was nothing, and then after a little while, a miracle! AMDVLK was working? At literally 35% of Windows performance. RADV didn't catch up until that fall.

But they were distracted, because of dreaded issue #892 at gitlab.freedesktop.org/drm/amd/issues/892, which caused outrageously high numbers of Linux users with RDNA 1 GPUs to face several-times-daily full driver resets ending in a hard crash. This wasn't during heavy gaming, and it DEFINITELY wasn't confined to some specific game that the geniuses behind RADV (because AMD don't develop one line of it) could have chased down. The problem was confirmed Linux-only. Confirmed not due to individual hardware defects. Confirmed to happen on all 4 desktop RDNA 1 models existing at the time: the 5500 XT, 5600 XT, 5700, and 5700 XT.

By June I'd become so desperate I hoped maybe I'd luck out: I sold my 5600 XT on eBay to someone who'd have no issues with it on Windows, and when my 5700 XT showed up, it was the exact same. This was 2.5-3 months before the expected announcement of Ampere, with RDNA 2 close behind, and I'd been around long enough to know which sources knew what was up. I even said on r/Nvidia in June, "Obviously the flagship 3090/3080 Ti is going to destroy the flagship RDNA 2 in ray tracing, but Big Navi will win in rasterization." It did. And I knew it would. And after spending half a decade on Linux using ONLY AMD GPUs, thanks to the non-stop cacophony of misinformation and exaggeration (and straight lies), I STILL drove 3.5 hours at 4 AM, 29 hours before launch day, making me 4th in line, and we didn't find out until late afternoon, after the last truck of the day had run, that they'd gotten a total of 10 cards for the next morning's launch, with 150 people around the block.

I drove home, installed the NV drivers, ripped out that shit 5700 XT, and installed the 3090, which worked perfectly because, as I'd just found out, I had day-one drivers. And overclocking. And fan control. Shit AMD makes you wait 9 months or a year for.

If ten seconds of absent-minded effort is “horrifying” to you, you should REALLY be on a console.

I was being sarcastic. Please calm down.

you have a problem


holy autism/fanboyism


In the time that has passed since this thread was started, you people could have forked nvidia-settings and done this yourselves.

Impressive comment :)
FWIW, I finished my fanless RTX 4060 upgrade this weekend and I'm really happy with it. The Linux PC is mainly for photography (darktable), not a gaming rig. The result is fantastic: about 5x the performance of the good old 1650 I had before in image-processing workloads. The machine is built in a very old HDPLEX H5 case with the GPU heatsink kit; 8 heatpipes connect the copper block to the side of the chassis, which essentially acts as the heatsink. 0dB. I tested it with gaming workloads too, of course: solid 1080p gaming at 70 °C. At idle, 34 °C and 2-3 watts.
I power-limited the card to 90 W, undervolted it, and lowered the boost clock a bit to protect the chip.
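If anyone wants to replicate that part, the power cap and clock limit are plain nvidia-smi; a sketch, with the exact numbers being what I settled on for my card rather than anything universal:

```bash
sudo nvidia-smi -pm 1           # persistence mode, so the driver keeps the settings loaded
sudo nvidia-smi -pl 90          # cap board power at 90 W
sudo nvidia-smi -lgc 210,2250   # pin the graphics clock range to tame boost
# undo the clock lock with: sudo nvidia-smi -rgc
# (a true undervolt typically means clock-offset tricks in nvidia-settings, which still wants X)
```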
The 4060 is ideal for this use case, as it's the lowest-power chip of the Ada series but still decently powerful all things considered. I agree it should be cheaper for what it is, but that's a wish more than anything else; it's still well worth it.
The only thing that is still not nice is the lack of full Wayland support in the Nvidia driver. They've made progress, but it's still not there. Time for Nvidia to get their act together on this; it's been too many years now.

Someone lock this thread. Please.

With X11 you had a single server (Xorg, to be precise), so nvidia-settings could work everywhere, all the time.

With Wayland you’ve got two dozen incompatible Wayland servers with completely different configuration systems.

Good luck porting this application to Wayland.

Yes, thank you for reiterating the common knowledge that Linux is not and never will be a stable platform. We can lock this thread now.
