Headless and Non-Headless X-Server

I want to run an X server with one screen displayed on a physical monitor and another screen operating headless. I can get headless mode working by adding Option "UseDisplayDevice" "none" to my xorg.conf and then specifying a virtual size. The Screen section of my xorg.conf for headless mode looks like this:

Section "Screen"
    Identifier "Screen0"
    Device "Device0"
    Option "UseDisplayDevice" "none"
    SubSection "Display"
        Virtual 1920 1080
        Depth 24
    EndSubSection
EndSection

However, after I add Option "UseDisplayDevice" "none", it causes all of the X screens to become headless, and I would still like to output the other X screen to a monitor. From the driver documentation (Appendix B. X Config Options):

"Additionally, the special value "none" can be specified for the "UseDisplayDevice" option. When this value is given, any programming of the display hardware is disabled. The NVIDIA driver will not perform any mode validation or mode setting for this X screen. This is intended for use in conjunction with CUDA or in remote graphics solutions such as VNC or Hewlett Packard's Remote Graphics Software (RGS)."

It looks like setting "UseDisplayDevice" to "none" prevents me from using the physical outputs on the GPU. Is it possible to have a headless screen and use a physical monitor simultaneously? If so, how can it be configured?

When you don't use the headless option, I assume the graphics display on the primary screen.
How are you using your virtual secondary screen?

You can use the X.Org options ConnectedMonitor and CustomEDID to fake a second monitor.
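A minimal sketch of what that second Screen section might look like, assuming the output name "DFP-1" and the EDID file path are placeholders you would replace with your own (an EDID blob can be dumped from a real monitor, e.g. via nvidia-settings):

    Section "Screen"
        Identifier "Screen1"
        Device "Device1"
        # Pretend a monitor is attached to this output
        Option "ConnectedMonitor" "DFP-1"
        # Feed the driver a saved EDID so mode validation succeeds
        Option "CustomEDID" "DFP-1:/etc/X11/edid.bin"
        SubSection "Display"
            Virtual 1920 1080
            Depth 24
        EndSubSection
    EndSection

Screen0 would stay as a normal Screen section (without UseDisplayDevice "none") driving the physical monitor, with both screens listed in the ServerLayout.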

Yes, the graphics show on the primary screen when not using headless. I am rendering some OpenGL applications to the virtual secondary screen, using NvFBC to grab the framebuffer into CUDA shared memory, and then copying portions of that shared memory to another application on the primary screen. This all works when I use two GPUs, one operating completely headless and the other not. I want to do the same thing with only one GPU. A temporary workaround is to just connect another monitor in place of the virtual screen, but that wastes a display output, because that framebuffer simply gets copied to the primary screen as described above.

Would doing this use up one of the display outputs? E.g., if I have 4 DP outputs, can I use all 4 outputs and still have a fake monitor with a CustomEDID acting as my virtual screen?

I don't know, but I guess you can't drive 5 screens with it.