Why do I need X11 from a Docker container?

I want to run the Jetson Nano headless with Docker to gain 1-2 FPS, reduce my footprint and restart time, and have more memory available for Docker, …

  1. Running my app natively on a headless Nano (without X11), I got 11-12 FPS with OpenCV 4.3.0 and cuDNN support

  2. Running my app on the Nano with X11, I got 10-11 FPS

  3. Running my app from a Docker container with X11, I got 9-10 FPS

My question is: why can I start my app natively without X11, but when I use a Docker container I need to pass the DISPLAY variable? Can I avoid starting X11 altogether and boot with set-default multi-user.target?

The system complains that EGL support is missing. Can I use fakesink or another workaround for this?

If you’re using GStreamer/DeepStream, I find that nvoverlaysink works great without X11 running. I’m not sure if it works in a container, however. In such cases I use an RTSP solution. X11 + Docker is just a bad idea, IMO.


You can use fakesink like this:

sudo docker run -it --rm --runtime nvidia -v /tmp/.X11-unix/:/tmp/.X11-unix [container]
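Once inside such a container, a minimal headless pipeline that simply discards frames could look like the following. This is only a sketch: it assumes a CSI camera, that the NVIDIA GStreamer plugins are available in the container (e.g. an l4t base image), and that the camera caps match your sensor.

```shell
# Capture from the CSI camera and drop every frame with fakesink,
# so no X11/EGL display sink is ever needed.
gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! \
  'video/x-raw(memory:NVMM), width=1280, height=720, format=NV12, framerate=30/1' ! \
  fakesink
```

If nvarguscamerasrc fails inside the container, check whether the argus socket needs to be mounted in as well.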


Yes, this is also my opinion.


Does it mean if I have:

    "nvarguscamerasrc sensor-id=%d ! " + \
    "video/x-raw(memory:NVMM), width=1280, height=720, format=NV12, framerate=%d/1 ! " + \
    "nvvidconv flip-method=%d ! " + \
    "video/x-raw, display_width=320, display_height=320, format=BGRx ! " + \
    "videoconvert ! video/x-raw, format=BGR ! " + \

Will it work?

I am also using a CSI camera. In your command there is no argus_socket bind; can the camera even work without it? And why is the X11 directory bound if I am using fakesink? At the moment I have a lot of questions…

You can turn off X and go straight “camera ! nvoverlaysink”. My recollection is that it works fine. You can check the capabilities of each element with gst-inspect-1.0.

For optimal performance you want NVMM from the beginning to the end of your pipeline. Standard GStreamer elements like videoconvert are not accelerated. Please see the user’s guide for a list of elements you can use. If you install the DeepStream package there are even more plugins.
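A sketch of what an NVMM-end-to-end pipeline could look like, keeping buffers in NVMM all the way to nvoverlaysink (which draws directly to the display layer, no X11 required). The flip-method and resolution values here are placeholders:

```shell
# All caps carry memory:NVMM, so frames never leave NVIDIA buffer memory;
# only accelerated nv* elements touch the data.
gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! \
  'video/x-raw(memory:NVMM), width=1280, height=720, format=NV12, framerate=30/1' ! \
  nvvidconv flip-method=2 ! \
  'video/x-raw(memory:NVMM), width=640, height=480' ! \
  nvoverlaysink
```

You can confirm which caps (including memory:NVMM) an element accepts with, for example, gst-inspect-1.0 nvvidconv.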

Here is the user’s guide for reference.

Thanks a lot, appreciated!

To be clear, I’ve only tested that solution outside of a container; I’m not sure it would work inside. You can look at the DeepStream reference app for an example of how to make an RTSP sink, if you want to stream video from inside a container.
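Outside of the DeepStream reference app, one lightweight way to serve RTSP from a Jetson is the test-launch example that ships with gst-rtsp-server. This is a sketch under several assumptions: you have built the gst-rtsp-server examples yourself, and the encoder element name (nvv4l2h264enc here) matches your JetPack release; verify it with gst-inspect-1.0 first.

```shell
# Serve the camera as H.264 over RTSP; test-launch expects the payloader
# to be named pay0. The ./test-launch path is a placeholder for wherever
# you built the gst-rtsp-server example.
./test-launch "nvarguscamerasrc ! \
  video/x-raw(memory:NVMM), width=1280, height=720, format=NV12, framerate=30/1 ! \
  nvv4l2h264enc ! h264parse ! rtph264pay name=pay0 pt=96"
```

A client on the network can then play the stream from rtsp://<jetson-ip>:8554/test (the default port and mount point of the example server).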