Error: Internal data stream error when using deepstream-testsr-app in DeepStream 6.1

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
P40
• DeepStream Version
DeepStream 6.1, using the 6.1-devel Docker image

05/20/2022 1:38 AM

• JetPack Version (valid for Jetson only)
• TensorRT Version
Whatever version ships inside the Docker image
• NVIDIA GPU Driver Version (valid for GPU only)
470. Note that the deepstream-app demo works fine with this driver.

• Issue Type( questions, new requirements, bugs)
Question

• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)

I ran “./deepstream-testsr-app rtsp://172.17.0.1:9554/001” to test smart record, but I do not get a correct result.
The stream rtsp://172.17.0.1:9554/001 is generated with “ffmpeg -re -stream_loop -1 -i sample_1080p_h264.mp4 -c copy -f rtsp rtsp://172.17.0.1:9554/001”, and it plays perfectly in VLC.
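Since VLC uses its own demuxer and decoder, a stream that plays in VLC can still fail caps negotiation in GStreamer. As a sketch (using the same RTSP URI from above; nvvideoconvert and uridecodebin are standard DeepStream/GStreamer elements, but this is only a diagnostic suggestion, not a confirmed fix), you could decode the stream through GStreamer alone, with no inference involved:

```shell
# Decode the same RTSP stream through uridecodebin, roughly the element
# chain the sample app builds, with a fakesink instead of inference.
# If this also stops with a not-negotiated error, the problem is in
# caps negotiation between the decoder and downstream elements rather
# than in the smart-record logic.
gst-launch-1.0 -v uridecodebin uri=rtsp://172.17.0.1:9554/001 ! \
    nvvideoconvert ! 'video/x-raw(memory:NVMM)' ! fakesink sync=false
```

Running with `-v` prints the negotiated caps on each pad, which helps pinpoint where negotiation breaks.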

• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)

This is the result:


root@78501008a01b:/opt/nvidia/deepstream/deepstream-6.1/sources/apps/sample_apps/deepstream-testsr# ./deepstream-testsr-app rtsp://172.17.0.1:9554/001
Now playing: rtsp://172.17.0.1:9554/001
0:00:01.640960730 43799 0x5618ba5e34c0 INFO                 nvinfer gstnvinfer.cpp:646:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1900> [UID = 1]: deserialized trt engine from :/opt/nvidia/deepstream/deepstream-6.1/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine
INFO: ../nvdsinfer/nvdsinfer_model_builder.cpp:610 [Implicit Engine Info]: layers num: 3
0   INPUT  kFLOAT input_1         3x368x640       
1   OUTPUT kFLOAT conv2d_bbox     16x23x40        
2   OUTPUT kFLOAT conv2d_cov/Sigmoid 4x23x40         

0:00:01.643241826 43799 0x5618ba5e34c0 INFO                 nvinfer gstnvinfer.cpp:646:gst_nvinfer_logger:<primary-nvinference-engine> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:2003> [UID = 1]: Use deserialized engine model: /opt/nvidia/deepstream/deepstream-6.1/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine
0:00:01.644242641 43799 0x5618ba5e34c0 INFO                 nvinfer gstnvinfer_impl.cpp:328:notifyLoadModelStatus:<primary-nvinference-engine> [UID 1]: Load new model:dstestsr_pgie_config.txt sucessfully
Running...
In cb_newpad
In cb_newpad
Recording started..
0:00:11.039747643 43799 0x5618bb28a120 WARN                 nvinfer gstnvinfer.cpp:2299:gst_nvinfer_output_loop:<primary-nvinference-engine> error: Internal data stream error.
0:00:11.039787328 43799 0x5618bb28a120 WARN                 nvinfer gstnvinfer.cpp:2299:gst_nvinfer_output_loop:<primary-nvinference-engine> error: streaming stopped, reason not-negotiated (-4)
ERROR from element primary-nvinference-engine: Internal data stream error.
Error details: gstnvinfer.cpp(2299): gst_nvinfer_output_loop (): /GstPipeline:dstest-sr-pipeline/GstNvInfer:primary-nvinference-engine:
streaming stopped, reason not-negotiated (-4)
Returned, stopping playback
Deleting pipeline

Two MP4 files are generated, but both are very small: one is only 616 bytes, the other is 62 KB and shows a correct still image but cannot be played.

How about adding --sink-type=1 to the command line and running again?
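For reference, the suggested rerun would look like this (the --sink-type flag comes from the reply above; check the app's help output for the exact values supported by your build):

```shell
# Rerun the sample with the suggested sink type; this swaps out the
# default display sink, which removes it from the negotiation path
# (assumption based on the reply above, not verified here).
./deepstream-testsr-app rtsp://172.17.0.1:9554/001 --sink-type=1
```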

@tms2003 Is this still an issue, or shall we close this topic?