X Error if using xvimagesink and screen resolution changes

When using a GStreamer pipeline with xvimagesink, if the screen resolution changes (the EDID changes or the HDMI output is unplugged), the pipeline exits with an X error like the following:

X Error of failed request: BadAlloc (insufficient resources for operation)
Major opcode of failed request: 148 (XVideo)
Minor opcode of failed request: 19 ()
Serial number of failed request: 18964450
Current serial number in output stream: 18964450
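
For reference, a minimal pipeline along these lines reproduces it; videotestsrc stands in for my real source and the caps are illustrative, not the exact values from my app:

```shell
# Hypothetical minimal reproduction: render a test pattern through xvimagesink,
# then change the resolution with xrandr (or hot-plug the HDMI cable) while it runs.
gst-launch-1.0 videotestsrc ! 'video/x-raw,width=1280,height=720' ! xvimagesink
```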

I have determined that this does not happen without Compiz, but I need Compiz to remove tearing artifacts in the video.


Why is the monitor being unplugged while rendering? X needs to re-initialize if the EDID changes.

I am using xrandr to reconfigure X when the monitor changes (via a udev rule). My app needs to keep functioning if someone decides to plug in a different monitor.
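
My hot-plug handling looks roughly like the sketch below. The rule path, output name (HDMI-0), user, and script location are specific to my setup and may differ on yours:

```shell
# /etc/udev/rules.d/99-monitor-hotplug.rules (assumed path; adjust for your system):
#   ACTION=="change", SUBSYSTEM=="drm", RUN+="/usr/local/bin/hotplug-xrandr.sh"

# /usr/local/bin/hotplug-xrandr.sh
#!/bin/sh
# udev runs this as root with no X environment, so point it at the running
# X session before asking xrandr to pick the monitor's preferred mode.
export DISPLAY=:0
export XAUTHORITY=/home/user/.Xauthority   # assumed user; change as needed
xrandr --output HDMI-0 --auto
```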

Hi merwin,

Have you tried nvoverlaysink? Does this hit the error as well?

The failure does not happen with nvoverlaysink, which I can’t use because I need the output to be in a window.

The failure also does not happen with nveglglessink, but I’m seeing about 60 percent of the performance of xvimagesink, which already delivers fewer fps than I want.

Actually the problem does happen with nveglglessink, just not as often.

Does this also happen when using a non-GStreamer app? For example, running an OpenGL sample and modifying the resolution.

When using the FXAA OpenGL sample program, a resolution change via xrandr works fine, but an HDMI hot-plug causes the FXAA window to stop rendering (all black) until I resize the window, after which it recovers.

When using GStreamer with nveglglessink, a resolution change via xrandr or a hot-plug causes the output window to stop rendering (the last image remains).

When using GStreamer with xvimagesink, a resolution change via xrandr or a hot-plug causes the original BadAlloc error.
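
The resolution change I'm testing with is just a plain xrandr mode switch while the pipeline runs (the output name and mode are examples; yours may differ):

```shell
# Switch the active monitor to a different mode mid-playback.
# This alone is enough to trigger the BadAlloc with xvimagesink.
xrandr --output HDMI-0 --mode 1920x1080
```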

I would assume you need a rendering context… as soon as the screen goes away, the context becomes invalid. I doubt plugging a monitor back in would recreate the same context ID… even if it did, the span of time during which there is no context should result in an error anyway. Monitors are hot-pluggable when using HDMI, but OpenGL has no concept of hot-plug. A simple change of resolution does not destroy the context.

It seems that if you want to survive monitor hot-plug, you need to render to a virtual desktop that is always running and view it through remote desktop software. Although people think of a virtual X server and remote desktop software as running on two different computers, there is no reason they cannot both run on the same computer.
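
A rough sketch of that setup, assuming Xvfb and x11vnc are installed; the display number, screen geometry, and test pipeline are all illustrative:

```shell
# Run a virtual X server whose "screen" never goes away, render into it,
# and export it over VNC. The physical monitor can come and go freely.
Xvfb :1 -screen 0 1280x720x24 &
DISPLAY=:1 gst-launch-1.0 videotestsrc ! xvimagesink &
x11vnc -display :1 -forever &
# Then connect a VNC viewer to localhost:5900 on the same machine.
```

Note that Xvfb renders in software, so this trades the hot-plug problem for a performance cost.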