Deepstream NvInfer error when trying to remove from pipeline (Error: -5)

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) Jetson Nano 2GB
• DeepStream Version 5.1
• JetPack Version (valid for Jetson only) 4.6
• TensorRT Version 8.0.1
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type( questions, new requirements, bugs) Question/Bug
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)

I have an application that runs two nvinfer elements back to back in a v4l2src-sourced pipeline. One requirement of this application is to attach and detach nvinfer elements while the pipeline is running, so that certain models can be enabled and disabled at runtime. The code I have works sometimes, but at other times it produces this error:

0:00:34.241139294  8833   0x559b1a5400 WARN                 nvinfer gstnvinfer.cpp:1984:gst_nvinfer_output_loop:<inference> error: Internal data stream error.
0:00:34.241669975  8833   0x559b1a5400 WARN                 nvinfer gstnvinfer.cpp:1984:gst_nvinfer_output_loop:<inference> error: streaming stopped, reason not-linked (-1)
RECEIVED EOS SIGNAL ON PROBE

It seems to be completely random: it will work for a while with no problem, but other times I'll power up and it will start throwing this error from the start and persist without ceasing.

I'm at a loss as to what to do. Below is the code I'm currently using to remove and add the second nvinfer in the pipeline:

static GstPadProbeReturn event_probe_eos(GstPad * pad, GstPadProbeInfo * info, AppData* data)
{
    GstEvent *event = GST_PAD_PROBE_INFO_EVENT(info);
    if (GST_EVENT_TYPE(event) == GST_EVENT_EOS) {
        g_print("RECEIVED EOS SIGNAL ON PROBE\n");
        gst_pad_remove_probe(pad, GST_PAD_PROBE_INFO_ID(info));
        return GST_PAD_PROBE_DROP;
    }
    return GST_PAD_PROBE_OK; /* pass all other events through */
}

static void add_nvinfer_to_pipeline(AppData* data){

    g_print("========================[ADDING AI]========================\n");

    GstPad* pad = gst_element_get_static_pad(data->nv_infer, "src");
    gulong pad_id = gst_pad_add_probe(pad, GST_PAD_PROBE_TYPE_BLOCK | GST_PAD_PROBE_TYPE_EVENT_DOWNSTREAM, event_probe, NULL, NULL);

    GstPad* tiler_pad = gst_element_get_static_pad(data->nv_tiler, "sink");
    gst_pad_unlink(pad, tiler_pad);

    gst_element_link_many(data->nv_infer, data->nv_secondary_infer, data->nv_tiler, NULL);

    /* bring the re-inserted element's state in line with the pipeline */
    gst_element_sync_state_with_parent(data->nv_secondary_infer);

    gst_pad_remove_probe(pad, pad_id);

    /* release the pad references taken above */
    gst_object_unref(tiler_pad);
    gst_object_unref(pad);

    gst_element_set_state (data->pipeline, GST_STATE_PLAYING);

}

static void remove_nvinfer_from_pipeline(AppData* data){

    g_print("========================[REMOVING AI]========================\n");

    gtk_widget_set_sensitive (data->start_stop_ai_button, FALSE);

    GstPad* pad = gst_element_get_static_pad(data->nv_infer, "src");
    if (!pad){
        g_print("failed to get src pad\n");
        return;
    }

    gulong pad_id = gst_pad_add_probe(pad, GST_PAD_PROBE_TYPE_BLOCK | GST_PAD_PROBE_TYPE_EVENT_DOWNSTREAM, event_probe_block, data, NULL);

    GstPad* infer_pad = gst_element_get_static_pad(data->nv_secondary_infer, "sink");
    gst_pad_unlink(pad, infer_pad);

    GstPad* infer_pad_src = gst_element_get_static_pad(data->nv_secondary_infer, "src");
    GstPad* tiler_pad = gst_element_get_static_pad(data->nv_tiler, "sink");

    /* drain the element; event_probe_eos removes itself when the EOS arrives */
    gst_pad_add_probe(infer_pad_src, GST_PAD_PROBE_TYPE_EVENT_DOWNSTREAM, event_probe_eos, NULL, NULL);
    gst_pad_send_event (infer_pad, gst_event_new_eos ());

    gst_pad_unlink(infer_pad_src, tiler_pad);

    gst_pad_link(pad, tiler_pad);

    gst_pad_remove_probe(pad, pad_id);

    gst_element_set_state (data->nv_secondary_infer, GST_STATE_NULL);

    /* release pad references only after their last use */
    gst_object_unref(infer_pad);
    gst_object_unref(infer_pad_src);
    gst_object_unref(tiler_pad);
    gst_object_unref(pad);

    gst_element_set_state (data->pipeline, GST_STATE_PLAYING);

    gtk_widget_set_sensitive (data->start_stop_ai_button, TRUE);
}

I should note that I've tried other approaches, including following similar logic to the above but placing the probe in front of the first nvinfer, sending an EOS through both nvinfer elements, and catching the EOS after the second nvinfer to make sure the data passes through. That gets rid of the error, but it also crashes the pipeline in another way.

Another note: when this error occurs, one of the Jetson's CPU cores goes to 100% usage and remains there until the application is closed. Which core pegs is random.

Any help would be much appreciated, as I've been struggling with this error for weeks now and can't seem to figure it out.
Thank you very much.

Adding and removing gst-nvinfer from the pipeline dynamically is not in our design scope. Why do you need to do this? What is the scenario?

Thanks for your response.

The purpose is to toggle the AI on the pipeline. In this case I'm removing the nvinfer element to disable that model, then re-adding it to the pipeline to bring the model back into the flow. This is triggered by the user clicking STOP and START buttons on a GTK interface, which lets them view the stream with the AI on and with it off. If there's a better way to dynamically toggle an nvinfer element, I'd love to hear it.

Thank you!

We do not currently support such a scenario.

Understood, thank you for the quick response.

On a side note, a probe with GST_PAD_PROBE_TYPE_BLOCK | GST_PAD_PROBE_TYPE_EVENT_DOWNSTREAM never gets hit. No matter which element in my pipeline I attach this probe to, the callback never fires. However, if I change BLOCK to IDLE it fires, and if I change DOWNSTREAM to UPSTREAM it fires. So the probe works, just not in the way I want it to. Am I missing something here?

Thanks

This has nothing to do with DeepStream. Please refer to the GStreamer documentation: Pipeline manipulation