Deepstream-test2 Internal data stream error

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) Jetson
• DeepStream Version 5.0
• JetPack Version (valid for Jetson only) 32.4.3

I am getting the following error when attempting to run deepstream-test2 with my own .engine files:

Frame Number = 0 Number of objects = 0 Vehicle Count = 0 Person Count = 0
0:00:05.386413157 23334   0x55729279e0 WARN                 nvinfer gstnvinfer.cpp:1975:gst_nvinfer_output_loop:<secondary1-nvinference-engine> error: Internal data stream error.
0:00:05.386464825 23334   0x55729279e0 WARN                 nvinfer gstnvinfer.cpp:1975:gst_nvinfer_output_loop:<secondary1-nvinference-engine> error: streaming stopped, reason error (-5)
ERROR from element secondary1-nvinference-engine: Internal data stream error.
Error details: /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvinfer/gstnvinfer.cpp(1975): gst_nvinfer_output_loop (): /GstPipeline:dstest2-pipeline/GstNvInfer:secondary1-nvinference-engine:
streaming stopped, reason error (-5)
Returned, stopping playback
Deleting pipeline

Any suggestions on how to debug this would be greatly appreciated.
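For pipeline failures like this, one generic first step is to raise the GStreamer log level so the failing element is easier to localize. A sketch (the binary name and input file are assumptions based on the stock deepstream-test2 sample):

```shell
# Increase GStreamer log verbosity (3 = FIXME/WARNING, 4 = INFO, 5 = DEBUG)
GST_DEBUG=3 ./deepstream-test2-app sample_720p.h264

# Or restrict verbose logging to the nvinfer elements only
GST_DEBUG=nvinfer:5 ./deepstream-test2-app sample_720p.h264
```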

Hi,
Can you run successfully with the original models? This would rule out any environment issue.

Yes I am able to run the original models fine.

Is there any other information I can provide that would help with troubleshooting?

It’s caught here, at gstnvinfer.cpp line 1975, when the sgie output loop runs to generate and process the tensor output. Can your sgie model run with trtexec?

      case GST_FLOW_NOT_NEGOTIATED:
        GST_ELEMENT_ERROR (nvinfer, STREAM, FAILED,
            ("Internal data stream error."),
            ("streaming stopped, reason %s (%d)", gst_flow_get_name (flow_ret),
                flow_ret));
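As a standalone sanity check of the engine file itself, you can load it with trtexec, which ships with TensorRT on JetPack (a sketch; the engine path is an assumption):

```shell
# Load the serialized engine and run dummy inference passes;
# if this fails, the engine itself is the problem, not DeepStream
/usr/src/tensorrt/bin/trtexec --loadEngine=your_model.engine
```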

Is this a bug in the deepstream library code or do I need to modify the application code to make this work?

Can you suggest how I would resolve this issue?

@Amycao The sgie model is the same as the pgie model. Running the pgie alone without sgie works fine.

It’s not a bug; it means the negotiation failed. The error is raised when the pad-push return code from the sgie’s submit-input-buffer processing does not match the return code from the sgie’s output-loop processing, which is what happened in your case.

So your sgie model is a detector model. You cannot use a detector model as a classifier model; they have different post-processing. You can refer to the back-to-back detectors sample for your case: deepstream_reference_apps/back_to_back_detectors.c at master · NVIDIA-AI-IOT/deepstream_reference_apps · GitHub
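For reference, a secondary detector in a back-to-back setup is configured along these lines (a hedged sketch, not the exact sample file; the key names are standard nvinfer properties, the values are illustrative):

```ini
[property]
# Run as a secondary GIE on objects produced by the primary detector
process-mode=2
gie-unique-id=2
operate-on-gie-id=1
# 0 = detector; do not set is-classifier=1 for a detector model
network-type=0
```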

So your sgie model is a detector model. You cannot use a detector model as a classifier model; they have different post-processing.

@Amycao Can you please elaborate on what this means?

@Amycao this is from the dstest2_sgie1_config.txt:

[property]
is-classifier=1
output-blob-names=predictions/Softmax
classifier-threshold=0.51

So the sgie is a classification model, no?
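For context, the nvinfer config keys that select the post-processing path look roughly like this (a hedged sketch; values are illustrative, and the model must actually match the selected post-processor):

```ini
[property]
# Classifier post-processing: expects e.g. a softmax output layer
network-type=1            # or the older equivalent key: is-classifier=1
output-blob-names=predictions/Softmax
classifier-threshold=0.51
```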

I think Amycao is saying the sgie cannot be an object detector or an MRCNN model. Though I am getting similar issues when using my own engine as an sgie.

Yes, in this case.

No, in this case the pgie is a detector model, but the sgie in the config is set as a classifier model. You cannot use a detector model as a classifier model; that’s what I mean.


@Amycao I still do not understand. You say the sgie cannot be a classification model, but in deepstream-test2 all the sgies are classification models. Can you elaborate on this or point me to documentation on the topic?

Oh, no, not like this. I mean in your case you changed the sgie to the pgie model, right? What I am saying is that the sgie config sets it as a classifier, but a detector cannot be used as a classifier.

I mean in your case you changed the sgie to the pgie model, right?

I am not sure what you are asking, can you please clarify? I am using a classification model for the pgie element and a classification model for sgie element.

What I am saying is that the sgie config sets it as a classifier, but a detector cannot be used as a classifier.

My apologies, I do not understand what you are saying. Could you point me to the documentation for this?

@amycao The sgie model is the same as the pgie model. Running the pgie alone without sgie works fine.

You said you use the pgie model as the sgie model, right?

You said you use the pgie model as the sgie model, right?

I think I am not understanding. I have a single classification model. I am attempting to use this model for both the pgie and sgie just to test the secondary inference functionality.

Are you saying I would not have the error if I used a different classification model for sgie?