Restream nveglglessink output to RTSP

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) : GPU
• DeepStream Version : 6.3
• TensorRT Version : 8.5.3-1+cuda11.8
• NVIDIA GPU Driver Version (valid for GPU only) : 4070 Ti
Hi,
Please help me restream the nveglglessink output to an RTSP link. Here’s the pipeline:

gst_bin_add_many(GST_BIN(pipeline), queue1, pgie, queue2, tracker, queue3, sgie1, queue4,
                 sgie2, queue5, nvdslogger, queue6, nvvidconv, queue7, nvosd, queue8, sink, NULL);

where sink is defined as:

sink = gst_element_factory_make("nveglglessink", "nvvideo-renderer"); 

How do I restream the video to RTSP? Any help is highly appreciated.
@yuweiw

You can refer to our source code deepstream\sources\apps\apps-common\src\deepstream_sink_bin.c.

static gboolean
create_udpsink_bin (NvDsSinkEncoderConfig * config, NvDsSinkBinSubBin * bin)

pipeline:

......nvosd->nvvideoconvert->encoder->codecparse->rtppay->udpsink
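
The RTSP serving itself is handled in the same file by start_rtsp_streaming(), which picks the RTP stream back up from the udpsink port and exposes it as an rtsp:// URL. A rough sketch of that step (the function name, port numbers, and mount point below are placeholders; please check the file for the exact implementation):

#include <gst/rtsp-server/rtsp-server.h>

/* Sketch only: expose the RTP stream sent by udpsink as an RTSP URL.
 * Modeled on start_rtsp_streaming() in deepstream_sink_bin.c; the port
 * numbers and mount point are placeholders. */
static gboolean
start_rtsp_server_sketch (guint rtsp_port, guint udp_port)
{
  GstRTSPServer *server = gst_rtsp_server_new ();
  GstRTSPMediaFactory *factory = gst_rtsp_media_factory_new ();
  GstRTSPMountPoints *mounts;
  gchar port_str[16];
  gchar launch[512];

  g_snprintf (port_str, sizeof (port_str), "%u", rtsp_port);
  gst_rtsp_server_set_service (server, port_str);

  /* Read the RTP packets from the udpsink port and hand them to the
   * RTSP server; "pay0" is the element name the server expects for stream 0. */
  g_snprintf (launch, sizeof (launch),
      "( udpsrc name=pay0 port=%u caps=\"application/x-rtp, media=video, "
      "clock-rate=90000, encoding-name=H264, payload=96\" )", udp_port);
  gst_rtsp_media_factory_set_launch (factory, launch);
  gst_rtsp_media_factory_set_shared (factory, TRUE);

  mounts = gst_rtsp_server_get_mount_points (server);
  gst_rtsp_mount_points_add_factory (mounts, "/ds-test", factory);
  g_object_unref (mounts);

  /* gst_rtsp_server_attach() returns 0 on failure. */
  return gst_rtsp_server_attach (server, NULL) != 0;
}

The stream is then available at rtsp://<host>:<rtsp_port>/ds-test.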

Hi @yuweiw. Thanks for your time. Can you please tell me which GStreamer elements should be used for the encoder and codecparse?
Also, for rtppay and udpsink, should I use the rtph264pay and udpsink elements?

Could you refer to the source code I attached deepstream\sources\apps\apps-common\src\deepstream_sink_bin.c? All the plugins are created in this source code.

Hi.
So the source code has a function:

/**
 * Function to create sink bin to generate encoded output.
 */
static gboolean
create_encode_file_bin (NvDsSinkEncoderConfig * config, NvDsSinkBinSubBin * bin)

What parameters should I pass when I call the function in my main? For example:

encoder = create_encode_file_bin(????)

What should I pass for NvDsSinkEncoderConfig * config and NvDsSinkBinSubBin * bin?

In theory, you don’t need to care about this. But if you want to learn more, the whole deepstream-app is open source; you would need to read through the code.

By referring to this code, you can learn how the following plugins are created.

bin->encoder = gst_element_factory_make ("nvv4l2h264enc", "any_name_you_defined");
bin->codecparse = gst_element_factory_make ("h264parse", "any_name_you_defined");
bin->rtppay = gst_element_factory_make ("rtph264pay", "any_name_you_defined");
bin->sink = gst_element_factory_make ("udpsink", "any_name_you_defined");
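
For reference, a rough sketch of the properties these elements typically need (the bitrate, host, and port values below are only placeholders and must match your RTSP server side):

/* Sketch only: typical settings for the encode/stream tail.
 * "bitrate" is a property of nvv4l2h264enc (bits per second); the host
 * and port values are placeholders. */
g_object_set (G_OBJECT (bin->encoder), "bitrate", 4000000, NULL);

g_object_set (G_OBJECT (bin->sink),
    "host", "127.0.0.1",
    "port", 5400,
    "async", FALSE,
    "sync", FALSE,
    NULL);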

Hi. Thank you for your time. However, my code is structured as below in the app.cpp file:

/* Use nvinfer or nvinferserver to run inferencing on decoder's output,
     * behaviour of inferencing is set through config file */
    CREATE_GIE_INSTANCE(pgie, pgie_type, "primary-nvinference-engine");
    CREATE_GIE_INSTANCE(sgie1, sgie1_type, "secondary1-nvinference-engine");
    CREATE_GIE_INSTANCE(sgie2, sgie2_type, "secondary2-nvinference-engine");

    /* We need to have a tracker to track the identified objects */
    tracker = gst_element_factory_make("nvtracker", "tracker");

    /* Use nvdslogger for perf measurement. */
    nvdslogger = gst_element_factory_make("nvdslogger", "nvdslogger");

    /* Use convertor to convert from NV12 to RGBA as required by nvosd */
    nvvidconv = gst_element_factory_make("nvvideoconvert", "nvvideo-converter");

    /* Create OSD to draw on the converted RGBA buffer */
    nvosd = gst_element_factory_make("nvdsosd", "nv-onscreendisplay");

How would I integrate the function create_udpsink_bin (NvDsSinkEncoderConfig * config, NvDsSinkBinSubBin * bin) from deepstream_sink_bin.c into the app.cpp file?

You don’t need to integrate this function.

After creating them, you can just link them like I attached before.
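
In outline it looks like this (a sketch only; the variable names are assumptions standing for the elements created above, and an extra nvvideoconvert is placed in front of the encoder to convert the OSD output):

/* Sketch: add the new tail to the pipeline and link it after nvosd.
 * encoder, codecparse, rtppay and udpsink are the elements created above;
 * vidconv_out is an additional nvvideoconvert (name chosen arbitrarily). */
GstElement *vidconv_out = gst_element_factory_make ("nvvideoconvert", "vidconv_out");

gst_bin_add_many (GST_BIN (pipeline), vidconv_out, encoder, codecparse,
    rtppay, udpsink, NULL);

if (!gst_element_link_many (nvosd, vidconv_out, encoder, codecparse,
        rtppay, udpsink, NULL)) {
  g_printerr ("Failed to link nvosd to the encode/udpsink tail. Exiting.\n");
  return -1;
}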

Hi. Thank you so much for your time. I’ve modified the code as follows:

     /* Create OSD to draw on the converted RGBA buffer */
     nvosd = gst_element_factory_make("nvdsosd", "nv-onscreendisplay");

    /* Create Sink*/
    // sink = gst_element_factory_make("fakesink", "nvvideo-renderer");
    // sink = gst_element_factory_make("nveglglessink", "nvvideo-renderer"); // for display
    // sink = gst_element_factory_make("fakesink", "nvvideo-renderer");

    nvvidconv1 = gst_element_factory_make("nvvideoconvert", "nvvidconv1");
    filter4 = gst_element_factory_make ("capsfilter", "filter4");
    caps4 = gst_caps_from_string ("video/x-raw, format=I420");
    g_object_set (G_OBJECT (filter4), "caps", caps4, NULL);
    gst_caps_unref (caps4);

    /*encoder*/
    x264enc = gst_element_factory_make ("x264enc", "h264 encoder");
    g_object_set (G_OBJECT (x264enc), "preset-level", 1, NULL);
    g_object_set (G_OBJECT (x264enc), "insert-sps-pps", 1, NULL);
    g_object_set (G_OBJECT (x264enc), "bufapi-version", 1, NULL);

    /*parser*/
    parse = gst_element_factory_make ("h264parse", "h264-parser2");
    rtppay = gst_element_factory_make ("rtph264pay", "rtp-payer");
    
    /*udp sink*/
    sink = gst_element_factory_make ("udpsink", "udp-sink");
    g_object_set (G_OBJECT (sink), "host", "127.0.0.1", "port",
                  udp_port, "async", FALSE, "sync", 1, NULL);

and I’m linking the elements as:

gst_bin_add_many(GST_BIN(pipeline), queue1, pgie, queue2, tracker, queue3, sgie1,
                    queue4, sgie2, queue5, nvvidconv, queue7, nvosd, queue8, nvvidconv1,
                    filter4, x264enc, parse, rtppay, sink, NULL);
    /* we link the elements together */
    if (!gst_element_link_many(streammux, queue1, pgie, queue2, tracker, queue3, sgie1,
                    queue4, sgie2, queue5, nvvidconv, queue7, nvosd, queue8, nvvidconv1,
                    filter4, x264enc, parse, rtppay, sink, NULL));
    {
        g_printerr("Elements could not be linked. Exiting.\n");
        return -1;
    }
/* Set the pipeline to "playing" state */
    g_print("Using file: %s\n", argv[1]);
    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    start_rtsp_streaming (8554/*rtsp_port*/, udp_port, 0);

However, I’m getting the following error when I compile and run it:

Elements could not be linked. Exiting.

How can I link the elements properly?

Could you check whether all your plugins have been created successfully first?

  if (!<your_plugin>) {
    g_printerr ("The plugin could not be created. Exiting.\n");
    return -1;
  }

Hi. I have a checker as follows:

if (!pgie || !tracker || !sgie1 || !sgie2 || !streammux || !nvvidconv || !nvosd ||
        !nvvidconv1 || !filter4 || !caps4 || !x264enc || !parse || !rtppay || !sink)
    {
        g_printerr("One element could not be created. Exiting.\n");
        return -1;
    }

What I mean is that you need to go through it one by one and figure out which plugin cannot be created.

  if (!<one_plugin>) {
    g_printerr ("The plugin could not be created. Exiting.\n");
    return -1;
  }
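
For example, a small helper macro makes it obvious which element failed (the macro name here is arbitrary):

/* Hypothetical convenience macro: reports the exact element that could
 * not be created instead of using one combined check. */
#define CHECK_ELEM(elem)                                              \
  do {                                                                \
    if (!(elem)) {                                                    \
      g_printerr ("Element '%s' could not be created. Exiting.\n",    \
          #elem);                                                     \
      return -1;                                                      \
    }                                                                 \
  } while (0)

CHECK_ELEM (nvvidconv1);
CHECK_ELEM (filter4);
CHECK_ELEM (x264enc);
CHECK_ELEM (parse);
CHECK_ELEM (rtppay);
CHECK_ELEM (sink);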

Hi. I did it as you said, like below, for each of the elements:

/* Create OSD to draw on the converted RGBA buffer */
    nvosd = gst_element_factory_make("nvdsosd", "nv-onscreendisplay");
    if (!nvosd)
    {
        g_printerr("nvosd could not be created. Exiting.\n");
        return -1;
    }

However, the only error is:

(ANPR_SPD:2195935): GLib-GObject-WARNING **: 10:58:41.340: g_object_set_is_valid_property: object class 'GstNvTracker' has no property named 'enable_batch_process'
Elements could not be linked. Exiting.

Is there an issue with how I’m linking the elements?

Could you refer to the pipeline I attached before and remove the filter4 plugin? You can try removing some plugins yourself to see which one is causing the problem.
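
One way to narrow it down (a sketch, using the variable names from your snippet) is to link the tail one pair at a time instead of in a single gst_element_link_many() call. Note also that the stray ';' right after if (!gst_element_link_many(...)) in your snippet terminates the if statement, so the error branch runs even when linking succeeds.

/* Sketch: link the tail pair by pair so the failing link is obvious.
 * Variable names are taken from the snippet above. */
if (!gst_element_link (nvosd, queue8))
  g_printerr ("nvosd -> queue8 could not be linked\n");
if (!gst_element_link (queue8, nvvidconv1))
  g_printerr ("queue8 -> nvvidconv1 could not be linked\n");
if (!gst_element_link (nvvidconv1, filter4))
  g_printerr ("nvvidconv1 -> filter4 could not be linked\n");
if (!gst_element_link (filter4, x264enc))
  g_printerr ("filter4 -> x264enc could not be linked\n");
if (!gst_element_link (x264enc, parse))
  g_printerr ("x264enc -> parse could not be linked\n");
if (!gst_element_link (parse, rtppay))
  g_printerr ("parse -> rtppay could not be linked\n");
if (!gst_element_link (rtppay, sink))
  g_printerr ("rtppay -> sink could not be linked\n");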

Hi. I will try and let you know!! Thank you
