Hello, I would like to know how to solve this issue. It happens every time I change the batch size of nvstreammux. Below you can find the error and the pipelines to reproduce it.
Please let me know how to solve it.
•JetPack 4.5
•DS 5.0
•Jetson AGX Xavier
•Num Sources 2
•Inference model Peoplenet
•GST_DEBUG level = 1
•ISSUE and scenario description
•STREAM FROM RTSP
Working scenario
Batch size 1
batched-push-timeout = -1
pipeline:
gst-launch-1.0 -v uridecodebin uri="rtsp://root:pass@192.168.3.109/axis-media/media.amp?resolution=1920x1080" ! queue ! nvstreammux0.sink_0 nvstreammux name=nvstreammux0 batch-size=1 batched-push-timeout=-1 width=1920 height=1080 live-source=1 ! queue ! nvinfer config-file-path=/srv/MLManager/media/uploads/models/model_2e5ee5f5-ef21-4c2c-b769-dff6ef7c2ac2/file.inference model-engine-file=/srv/MLManager/media/uploads/models/model_2e5ee5f5-ef21-4c2c-b769-dff6ef7c2ac2/file.model ! queue ! nvvideoconvert ! video/x-raw,format=RGBA ! videoconvert ! fakesink silent=false uridecodebin uri="rtsp://root:pass@192.168.3.109/axis-media/media.amp?resolution=1920x1080" ! queue ! nvstreammux0.sink_1
Not Working scenario
Batch size 2
batched-push-timeout = 40000
pipeline:
gst-launch-1.0 -v uridecodebin uri="rtsp://root:pass@192.168.3.109/axis-media/media.amp?resolution=1920x1080" ! queue ! nvstreammux0.sink_0 nvstreammux name=nvstreammux0 batch-size=2 batched-push-timeout=40000 width=1920 height=1080 live-source=1 ! queue ! nvinfer config-file-path=/srv/MLManager/media/uploads/models/model_2e5ee5f5-ef21-4c2c-b769-dff6ef7c2ac2/file.inference model-engine-file=/srv/MLManager/media/uploads/models/model_2e5ee5f5-ef21-4c2c-b769-dff6ef7c2ac2/file.model ! queue ! nvvideoconvert ! video/x-raw,format=RGBA ! videoconvert ! fakesink silent=false uridecodebin uri="rtsp://root:pass@192.168.3.109/axis-media/media.amp?resolution=1920x1080" ! queue ! nvstreammux0.sink_1
•STREAM FROM FILE
Working scenario
Batch size 1
batched-push-timeout = -1
pipeline:
gst-launch-1.0 -v uridecodebin uri=file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4 ! queue ! nvstreammux0.sink_0 nvstreammux name=nvstreammux0 batch-size=1 batched-push-timeout=-1 width=1920 height=1080 live-source=0 ! queue ! nvinfer config-file-path=/srv/MLManager/media/uploads/models/model_2e5ee5f5-ef21-4c2c-b769-dff6ef7c2ac2/file.inference model-engine-file=/srv/MLManager/media/uploads/models/model_2e5ee5f5-ef21-4c2c-b769-dff6ef7c2ac2/file.model ! queue ! nvvideoconvert ! video/x-raw,format=RGBA ! videoconvert ! fakesink silent=false uridecodebin uri=file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4 ! queue ! nvstreammux0.sink_1
Not Working scenario
Batch size 2
batched-push-timeout = 40000
pipeline:
gst-launch-1.0 -v uridecodebin uri=file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4 ! queue ! nvstreammux0.sink_0 nvstreammux name=nvstreammux0 batch-size=2 batched-push-timeout=40000 width=1920 height=1080 live-source=0 ! queue ! nvinfer config-file-path=/srv/MLManager/media/uploads/models/model_2e5ee5f5-ef21-4c2c-b769-dff6ef7c2ac2/file.inference model-engine-file=/srv/MLManager/media/uploads/models/model_2e5ee5f5-ef21-4c2c-b769-dff6ef7c2ac2/file.model ! queue ! nvvideoconvert ! video/x-raw,format=RGBA ! videoconvert ! fakesink silent=false uridecodebin uri=file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4 ! queue ! nvstreammux0.sink_1
•ISSUE DETECTED
0:00:05.662097222 8758 0x558ea54000 ERROR nvvideoconvert gstnvvideoconvert.c:3120:gst_nvvideoconvert_transform: buffer transform failed
0:00:05.685226044 8758 0x558ec6c0f0 WARN nvinfer gstnvinfer.cpp:1975:gst_nvinfer_output_loop: error: Internal data stream error.
0:00:05.685351934 8758 0x558ec6c0f0 WARN nvinfer gstnvinfer.cpp:1975:gst_nvinfer_output_loop: error: streaming stopped, reason error (-5)
ERROR: from element /GstPipeline:pipeline0/GstNvInfer:nvinfer0: Internal data stream error.
•JETPACK 4.6
•DS 6.0
•Jetson AGX Xavier
•Num Sources 2
•Legacy nvstreammux used (not the new beta mux)
•Inference model Peoplenet
•GST Debug DEBUG LEVEL = 1
•STREAM FROM RTSP
Working scenario
Batch size 1
batched-push-timeout = -1
pipeline:
gst-launch-1.0 -v uridecodebin uri="rtsp://root:pass@192.168.3.109/axis-media/media.amp?resolution=1920x1080" ! queue ! nvstreammux0.sink_0 nvstreammux name=nvstreammux0 batch-size=1 batched-push-timeout=-1 width=1920 height=1080 live-source=1 ! queue ! nvinfer config-file-path=/srv/MLManager/media/uploads/models/model_2e5ee5f5-ef21-4c2c-b769-dff6ef7c2ac2/file.inference model-engine-file=/srv/MLManager/media/uploads/models/model_2e5ee5f5-ef21-4c2c-b769-dff6ef7c2ac2/file.model ! queue ! nvvideoconvert ! video/x-raw,format=RGBA ! videoconvert ! fakesink silent=false uridecodebin uri="rtsp://root:pass@192.168.3.109/axis-media/media.amp?resolution=1920x1080" ! queue ! nvstreammux0.sink_1
Not Working scenario
Batch size 2
batched-push-timeout = 40000
pipeline:
gst-launch-1.0 -v uridecodebin uri="rtsp://root:pass@192.168.3.109/axis-media/media.amp?resolution=1920x1080" ! queue ! nvstreammux0.sink_0 nvstreammux name=nvstreammux0 batch-size=2 batched-push-timeout=40000 width=1920 height=1080 live-source=1 ! queue ! nvinfer config-file-path=/srv/MLManager/media/uploads/models/model_2e5ee5f5-ef21-4c2c-b769-dff6ef7c2ac2/file.inference model-engine-file=/srv/MLManager/media/uploads/models/model_2e5ee5f5-ef21-4c2c-b769-dff6ef7c2ac2/file.model ! queue ! nvvideoconvert ! video/x-raw,format=RGBA ! videoconvert ! fakesink silent=false uridecodebin uri="rtsp://root:pass@192.168.3.109/axis-media/media.amp?resolution=1920x1080" ! queue ! nvstreammux0.sink_1
•STREAM FROM FILE
Working scenario
Batch size 1
batched-push-timeout = -1
pipeline:
gst-launch-1.0 -v uridecodebin uri=file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4 ! queue ! nvstreammux0.sink_0 nvstreammux name=nvstreammux0 batch-size=1 batched-push-timeout=-1 width=1920 height=1080 live-source=0 ! queue ! nvinfer config-file-path=/srv/MLManager/media/uploads/models/model_2e5ee5f5-ef21-4c2c-b769-dff6ef7c2ac2/file.inference model-engine-file=/srv/MLManager/media/uploads/models/model_2e5ee5f5-ef21-4c2c-b769-dff6ef7c2ac2/file.model ! queue ! nvvideoconvert ! video/x-raw,format=RGBA ! videoconvert ! fakesink silent=false uridecodebin uri=file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4 ! queue ! nvstreammux0.sink_1
Not Working scenario
Batch size 2
batched-push-timeout = 40000
pipeline:
gst-launch-1.0 -v uridecodebin uri=file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4 ! queue ! nvstreammux0.sink_0 nvstreammux name=nvstreammux0 batch-size=2 batched-push-timeout=40000 width=1920 height=1080 live-source=0 ! queue ! nvinfer config-file-path=/srv/MLManager/media/uploads/models/model_2e5ee5f5-ef21-4c2c-b769-dff6ef7c2ac2/file.inference model-engine-file=/srv/MLManager/media/uploads/models/model_2e5ee5f5-ef21-4c2c-b769-dff6ef7c2ac2/file.model ! queue ! nvvideoconvert ! video/x-raw,format=RGBA ! videoconvert ! fakesink silent=false uridecodebin uri=file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4 ! queue ! nvstreammux0.sink_1
•ISSUE DETECTED
0:00:05.662097222 8758 0x558ea54000 ERROR nvvideoconvert gstnvvideoconvert.c:3120:gst_nvvideoconvert_transform: buffer transform failed
0:00:05.685226044 8758 0x558ec6c0f0 WARN nvinfer gstnvinfer.cpp:1975:gst_nvinfer_output_loop: error: Internal data stream error.
0:00:05.685351934 8758 0x558ec6c0f0 WARN nvinfer gstnvinfer.cpp:1975:gst_nvinfer_output_loop: error: streaming stopped, reason error (-5)
ERROR: from element /GstPipeline:pipeline0/GstNvInfer:nvinfer0: Internal data stream error.
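Since the failure only appears once batch-size is raised to 2, my working hypothesis (an assumption on my side, not something I have confirmed) is that nvvideoconvert fails on the batched surface that nvinfer pushes downstream. As a sanity check I also plan to try the same file-source pipeline with an nvmultistreamtiler composing the batch back into a single frame before the conversion stage:

```shell
# Hypothetical variant of the failing batch-size=2 file pipeline.
# nvmultistreamtiler composites the 2-frame batch into one 1920x1080
# frame, so nvvideoconvert and everything after it only ever see an
# unbatched buffer. Untested against this exact setup.
gst-launch-1.0 -v \
  uridecodebin uri=file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4 ! queue ! nvstreammux0.sink_0 \
  uridecodebin uri=file:///opt/nvidia/deepstream/deepstream/samples/streams/sample_1080p_h264.mp4 ! queue ! nvstreammux0.sink_1 \
  nvstreammux name=nvstreammux0 batch-size=2 batched-push-timeout=40000 width=1920 height=1080 live-source=0 ! queue ! \
  nvinfer config-file-path=/srv/MLManager/media/uploads/models/model_2e5ee5f5-ef21-4c2c-b769-dff6ef7c2ac2/file.inference \
          model-engine-file=/srv/MLManager/media/uploads/models/model_2e5ee5f5-ef21-4c2c-b769-dff6ef7c2ac2/file.model ! queue ! \
  nvmultistreamtiler rows=1 columns=2 width=1920 height=1080 ! \
  nvvideoconvert ! video/x-raw,format=RGBA ! videoconvert ! fakesink silent=false
```

If this variant runs, it would point at the batched-buffer handling in nvvideoconvert rather than at nvstreammux itself; if it still fails the same way, the problem is elsewhere.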