Why do we need to rely on X to control fan speeds?

Who in their right mind decided that I cannot change the fan speeds of my graphics cards unless I have an X server running on every one of them, with a monitor attached? Is this a joke?

I have multiple GeForce cards in a Linux system and I use them for compute workloads; Blender is one that comes to mind. I want to be able to control ALL of my graphics cards without having to screw around with an X config file. I used to be able to create multiple X screens and use a virtual display to control all of my GeForce cards, but the latest driver and Xorg updates have stopped me from doing that.

Can someone give me some insight into how I can achieve fan control on all of my GPUs? Why is there no option to control fan speeds through a sysfs device? Why can I control the LED on all of my GeForce cards, yet I cannot control the fan speeds unless an X server is running on all of them?

For years this has been a constant issue with NVIDIA consumer graphics cards. I have spent literally hours bodging together Xorg config files just to get a basic feature working, and the latest driver and Xorg updates have completely broken this.

This is what my X config looked like before the update that took away the ability to control multiple GPUs:

# nvidia-settings: X configuration file generated by nvidia-settings
# nvidia-settings:  version 440.44

Section "ServerLayout"
    Identifier     "Layout0"
    Screen      0  "Screen0" 0 0
    Screen      1  "Screen1" RightOf "Screen0"
    InputDevice    "Keyboard0" "CoreKeyboard"
    InputDevice    "Mouse0" "CorePointer"
#    Option         "Xinerama" "0"
EndSection

Section "Files"
EndSection

Section "Module"
    Load           "dbe"
    Load           "extmod"
    Load           "type1"
    Load           "freetype"
    Load           "glx"
EndSection

Section "InputDevice"
    # generated from default
    Identifier     "Mouse0"
    Driver         "mouse"
    Option         "Protocol" "auto"
    Option         "Device" "/dev/psaux"
    Option         "Emulate3Buttons" "no"
    Option         "ZAxisMapping" "4 5"
EndSection

Section "InputDevice"
    # generated from default
    Identifier     "Keyboard0"
    Driver         "kbd"
EndSection

Section "Monitor"
    # HorizSync source: edid, VertRefresh source: edid
    Identifier     "Monitor0"
    VendorName     "Unknown"
    ModelName      "LG Electronics LG HDR QHD"
    HorizSync       230.0 - 230.0
    VertRefresh     120.0 - 144.0
    Option         "DPMS"
EndSection

Section "Monitor"
    Identifier     "Monitor1"
    VendorName     "Unknown"
    ModelName      "Unknown"
    Option         "DPMS"
EndSection

Section "Device"
    Identifier     "Device0"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "TITAN Xp"
    BusID          "PCI:3:0:0"
EndSection

Section "Device"
    Identifier     "Device1"
    Driver         "nvidia"
    VendorName     "NVIDIA Corporation"
    BoardName      "GeForce GTX 980 Ti"
    BusID          "PCI:4:0:0"
EndSection

Section "Screen"
    Identifier     "Screen0"
    Device         "Device0"
    Monitor        "Monitor0"
    DefaultDepth    24
    Option         "Stereo" "0"
#    Option         "nvidiaXineramaInfoOrder" "DFP-5"
    Option         "metamodes" "nvidia-auto-select +0+0 {AllowGSYNCCompatible=On}"
#    Option         "SLI" "Off"
#    Option         "MultiGPU" "Off"
    Option         "BaseMosaic" "off"
    Option         "AllowEmptyInitialConfiguration" "True"
    Option         "Coolbits" "28"
    SubSection     "Display"
        Depth       24
    EndSubSection
EndSection

Section "Screen"
    Identifier     "Screen1"
    Device         "Device1"
    Monitor        "Monitor1"
    DefaultDepth    24
    Option         "AllowEmptyInitialConfiguration" "True"
    Option         "Coolbits" "28"
    Option         "UseDisplayDevice" "none"
    SubSection     "Display"
        Virtual     1 1
        Depth       24
    EndSubSection
EndSection
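
With that config in place (the "Coolbits" option is what unlocks manual fan control), the only way I know of to actually drive the fans is nvidia-settings, which has to talk to a running X server. A sketch of the kind of commands I mean, assuming display :0 owns both GPUs and the fan indices match the GPU order; adjust for your own setup:

```shell
# Talk to the X server that owns the GPUs (display :0 is an assumption here).
export DISPLAY=:0

# Enable manual fan control on each GPU.
# Requires Coolbits with bit 2 set in the Screen section, e.g. "4" or "28".
nvidia-settings -a "[gpu:0]/GPUFanControlState=1" \
                -a "[gpu:1]/GPUFanControlState=1"

# Set the target fan speeds as a percentage.
nvidia-settings -a "[fan:0]/GPUTargetFanSpeed=70" \
                -a "[fan:1]/GPUTargetFanSpeed=70"
```

Which is exactly my complaint: none of this works unless the X server is up and each GPU has an X screen attached to it.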

So what am I supposed to do? Use Windows from now on, just so my GPUs stop dropping clocks, or buy a rack server and Tesla cards (which have no fans of their own and rely on the server's airflow for cooling)?