X server loses a screen when switching machines on DisplayPort KVM switch

Hey Everyone,

At work, we are trying to solve a display issue where our X screens don’t consistently stay up when we switch the KVM between machines. I’ll try to draw it out below…

|Machine A|     |Machine B|
      \             /
       \           /
        \_________/
        |DPort KVM|
        |_________|
         /       \
        /         \
|Monitor 2|     |Monitor 1|

What we are trying to do is reboot Machine A and have the X server start while the KVM is pointed at Machine B. I think of it as a pseudo-headless boot. I believe I am using the ConnectedMonitor and UseDisplayDevice options correctly to force the displays through the KVM, and I am also supplying the EDID information that nvidia-settings captured for me.
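For reference, the relevant part of the config looks roughly like this (the DFP-0/DFP-1 connector names and the EDID file path are placeholders here; the exact values are in the attached xorg.conf.txt):

Section "Screen"
    Identifier "Screen0"
    Device     "Device0"
    Option     "ConnectedMonitor" "DFP-0, DFP-1"
    Option     "UseDisplayDevice" "DFP-0, DFP-1"
    Option     "CustomEDID" "DFP-0:/etc/X11/up3017-edid.bin; DFP-1:/etc/X11/up3017-edid.bin"
EndSection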

Now, the problem… If we are pointed at or away from Machine A when we boot, there is no problem and we get the exact server layout that we want (perfect!). Unfortunately, if we flip from Machine A to Machine B and then back to Machine A, we lose one of our displays (Monitor 2). Looking at the log file, the switches from A → B and B → A are recognized and the display information is reloaded, but the monitor never wakes up.

To me, it seems like an issue with the KVM hotplugging the displays. I would love to disable the hotplug events, but the documentation says you can’t do that for DP :)
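For reference, the option I was hoping to use (assuming I have the right one) is the driver’s UseHotplugEvents setting, e.g.

Option "UseHotplugEvents" "false"

in the Screen section, but as noted above, the docs say it doesn’t apply to DisplayPort.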

The workaround for now is using xrandr commands to turn the monitors off and then back on, but that isn’t a solution we would like to keep.
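For completeness, the workaround is essentially this pair of commands per affected output (the DP-2 name is just an example; the real names come from xrandr --query):

xrandr --output DP-2 --off
xrandr --output DP-2 --auto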

Please let me know if I can provide any other info. I can’t seem to find the attach button on the thread creation page, so I’ll try to add my logs in a separate reply.

System info:
OS: RHEL 7.3
Driver: NVIDIA 375.66, downloaded from the NVIDIA site
Monitors: Dell UP3017 (DisplayPort)
Graphics Card: Nvidia NVS 310
nvidia-bug-report.log.gz (85.7 KB)
Xorg.0.log (30.2 KB)
xorg.conf.txt (3.28 KB)

DisplayPort is an active protocol that requires negotiation between the source and sink devices, so it doesn’t lend itself well to the ConnectedMonitor / UseDisplayDevice / CustomEDID flow that works for other display connections. In particular, I suspect that ConnectedMonitor is forcing the driver to consider the monitor present, but when it tries to negotiate the available bandwidth, it finds that there isn’t any.

Rather than trying to fake the monitor, I would recommend going the other way and treating it as fully dynamic. That is, use the AllowEmptyInitialConfiguration option to allow Machine A’s X server to start with no connected displays, and then use a hotplug-aware desktop environment such as GNOME so that it responds to the KVM switch by dynamically enabling or disabling those displays.
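In xorg.conf terms, that is just a minimal Screen section along these lines (the Identifier/Device names here are placeholders for whatever your config already uses):

Section "Screen"
    Identifier "Screen0"
    Device     "Device0"
    Option     "AllowEmptyInitialConfiguration" "true"
EndSection

With that in place the server can start even while the KVM is pointed at Machine B, and the desktop environment can bring the displays up when the switch comes back.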