Getting "Could not get EGL display connection" Error

• Hardware Platform (Jetson / GPU): Jetson
• DeepStream Version: 6.0.1
• JetPack Version (valid for Jetson only): 4.6.1

Hi, I applied all the suggestions in the "No protocol specified, No EGL Display, nvbufsurftransform: Could not get EGL display connection" topic, but I still get the same error.

I am trying to use a CSI camera in my DeepStream app. This error shows continually:

No EGL Display
nvbufsurftransform: Could not get EGL display connection
No protocol specified
No EGL Display
nvbufsurftransform: Could not get EGL display connection

(gst-plugin-scanner:63): GStreamer-WARNING **: 14:34:30.279: Failed to load plugin '/usr/lib/aarch64-linux-gnu/gstreamer-1.0/deepstream/': cannot open shared object file: No such file or directory
No protocol specified
No EGL Display
nvbufsurftransform: Could not get EGL display connection
No protocol specified
No EGL Display
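The repeated "No protocol specified" line comes from the X client side: the X server is refusing the connection, and the "No EGL Display" errors follow from that. A quick sanity check that can be run inside the container (assuming the standard X socket path; this only inspects the environment, it does not fix anything):

```shell
# Check what EGL/X11 would see inside the container.
OUT="DISPLAY=${DISPLAY:-<unset>}"
echo "$OUT"
# The host X socket must be visible in the container for EGL to open a display:
ls /tmp/.X11-unix 2>/dev/null || echo "X socket /tmp/.X11-unix not mounted"
```

If DISPLAY is unset or the socket directory is missing, the EGL errors above are expected regardless of the pipeline.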

Thank You.

  1. Is there a physical monitor connected to your Jetson board?
  2. Where and how did you run the application? From the Ubuntu desktop or from a remote terminal (e.g. an ssh terminal)? Inside docker or directly on the host?
  3. Which application are you running? If you are using your own application, can you provide the full pipeline?

1- There is a physical monitor connected to the Jetson, yes.

2- I am running the application in a docker container. I tried adding a timeout to the code and setting DISPLAY=:0, but it did not work. My container is created with this command:

xhost + && docker run -ti --gpus all --net=host --ipc=host --device=/dev/video0 --device=/dev/video1 --privileged=true -e NVIDIA_DRIVER_CAPABILITIES="compute,video,utility,display" -v /home/xxx/xxx/xxx:xxx -e DISPLAY=$DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix -v /home/xxx/.xxx/xxx:/xxx/***
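For reference, a minimal sketch of just the X11-related pieces of such a docker run command (the image name is a placeholder, and `xhost +local:root` on the host is a less permissive alternative to `xhost +`):

```shell
# Sketch: the X11/EGL-related flags for running a DeepStream container.
# Run "xhost +local:root" on the host first so the container's root user
# is allowed to connect to the X server ("No protocol specified" means it is not).
IMAGE="my-deepstream-image"   # placeholder image name
CMD="docker run -it --net=host \
 -e DISPLAY=$DISPLAY \
 -e NVIDIA_DRIVER_CAPABILITIES=compute,video,utility,display \
 -v /tmp/.X11-unix:/tmp/.X11-unix \
 -v $HOME/.Xauthority:/root/.Xauthority:ro \
 $IMAGE"
echo "$CMD"   # inspect before launching with: eval "$CMD"
```

Mounting the Xauthority file is only needed when the X server enforces cookie-based auth; the socket mount and DISPLAY are the essential parts.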

3- I am creating the pipeline with these functions; neither of them works:

def create_csi_bin(self, source_info):
    """nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=640,height=480,framerate=30/1' ! nvvidconv ! 'video/x-raw, format=I420' ! queue ! autovideosink"""
    source_bin = self._gst_helper._create_gst_bin(f"source-bin-{source_info.url}")

    csisrc = self._gst_helper._create_gst_elem("nvarguscamerasrc", f"csicam-{source_info.url}")
    csisrc.set_property("sensor-id", 0)

    capsfilter = self._gst_helper._create_gst_elem("capsfilter", f"capsfilter-{source_info.url}")
    capsfilter.set_property("caps", Gst.Caps.from_string("video/x-raw(memory:NVMM),width=640,height=480,framerate=30/1"))

    # nvvidconv converts the NVMM camera buffers to raw I420
    videoconvert = self._gst_helper._create_gst_elem("nvvidconv", f"videoconvert-{source_info.url}")
    capsfilter2 = self._gst_helper._create_gst_elem("capsfilter", f"capsfilter2-{source_info.url}")
    capsfilter2.set_property("caps", Gst.Caps.from_string("video/x-raw, format=I420"))

    queue = self._gst_helper._create_gst_elem("queue", f"queue-{source_info.url}")
    queue.set_property("silent", 1)

    capsfilter3 = self._gst_helper._create_gst_elem("capsfilter", f"capsfilter3-{source_info.url}")
    capsfilter3.set_property("caps", Gst.Caps.from_string("video/x-raw(memory:NVMM)"))

    # Add all elements created above to the GST bin
    Gst.Bin.add(source_bin, csisrc)
    Gst.Bin.add(source_bin, capsfilter)
    Gst.Bin.add(source_bin, videoconvert)
    Gst.Bin.add(source_bin, capsfilter2)
    Gst.Bin.add(source_bin, queue)
    Gst.Bin.add(source_bin, capsfilter3)

    # Link the elements in pipeline order
    csisrc.link(capsfilter)
    capsfilter.link(videoconvert)
    videoconvert.link(capsfilter2)
    capsfilter2.link(queue)
    queue.link(capsfilter3)

    # Take the src pad of the last capsfilter (the last element of the bin)
    # and expose it as a ghost pad on the bin.
    capsfilter3_src_pad = self._gst_helper._get_gst_static_pad(capsfilter3, "src")
    self._gst_helper._create_new_gst_ghost_pad(source_bin, "src", capsfilter3_src_pad)
    # log: f"Created CSI bin for {source_info.url} with name {source_bin.get_name()}"
    return source_bin

def create_csi_bin(self, source_info):
    """nvarguscamerasrc ! 'video/x-raw(memory:NVMM),width=640,height=480,framerate=30/1' ! nvvidconv ! 'video/x-raw, format=I420' ! queue ! autovideosink"""
    identity_sink_name = f"identity-sink-{source_info.url}"
    if self.params.is_jetson:
        gst_elements = [
            "nvarguscamerasrc sensor-id=0",
            "video/x-raw(memory:NVMM),width=640,height=480,framerate=30/1",
            "nvvidconv",
            "video/x-raw, format=I420",
            f"identity name={identity_sink_name} silent=1",
        ]

    gst_bin_str = " ! ".join(gst_elements)
    # parse_bin_from_description returns a GstBin that can be added to a pipeline
    # (Gst.parse_launch would return a top-level pipeline instead)
    source_bin = Gst.parse_bin_from_description(gst_bin_str, False)

    identity_element = self._gst_helper._get_gst_elem(source_bin, identity_sink_name)
    identity_src_pad = self._gst_helper._get_gst_static_pad(identity_element, "src")
    self._gst_helper._create_new_gst_ghost_pad(source_bin, "src", identity_src_pad)
    # log: f"Created CSI bin for {source_info.url} with name {source_bin.get_name()}"
    return source_bin
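The string-joining approach in the second function can be exercised on its own, without GStreamer installed. This sketch mirrors the pipeline from the docstring; the sensor id, caps, and identity name are illustrative, not a confirmed working configuration:

```python
# Sketch: assemble a gst-launch style description for a CSI source bin.
# The resulting string is what would be handed to the GStreamer parser.
def build_csi_bin_description(sensor_id: int, identity_name: str) -> str:
    elements = [
        f"nvarguscamerasrc sensor-id={sensor_id}",
        "video/x-raw(memory:NVMM),width=640,height=480,framerate=30/1",
        "nvvidconv",
        "video/x-raw,format=I420",
        f"identity name={identity_name} silent=true",
    ]
    # Elements in a launch description are separated by " ! "
    return " ! ".join(elements)

desc = build_csi_bin_description(0, "identity-sink-0")
print(desc)
```

Printing the description and testing it with gst-launch-1.0 on the host first is a cheap way to separate pipeline problems from the docker/EGL problem.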

I also tried nvoverlaysink, fakesink and autovideosink (they are linked in a different module); none of them worked.

Any ideas?

Where and how did you get the docker “”? Can you try our official docker “”?

I can’t connect to the link (I think it is broken), and I don’t know how to get the docker (cdcr). Everyone in the company (including me) can get it, but in this specific situation it is problematic.

There has been no update from you for a while, so we are assuming this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks

This is the introduction of the dockers: Docker Containers — DeepStream 6.2 Release documentation

This is the docker link: DeepStream | NVIDIA NGC

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.