Error running deepstream-testsr

I get this error when I try to run the DeepStream 6.0 deepstream-testsr C++ sample app:

ples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_int8.engine failed, try rebuild
0:00:04.553301165 24934 0x55a9815aa0 INFO nvinfer gstnvinfer.cpp:638:gst_nvinfer_logger: NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1914> [UID = 1]: Trying to create engine from model files
WARNING: INT8 not supported by platform. Trying FP16 mode.
WARNING: [TRT]: Detected invalid timing cache, setup a local cache instead
0:01:36.709911008 24934 0x55a9815aa0 INFO nvinfer gstnvinfer.cpp:638:gst_nvinfer_logger: NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::buildModel() <nvdsinfer_context_impl.cpp:1947> [UID = 1]: serialize cuda engine to file: /opt/nvidia/deepstream/deepstream-6.0/samples/models/Primary_Detector/resnet10.caffemodel_b1_gpu0_fp16.engine successfully
INFO: [Implicit Engine Info]: layers num: 3
0 INPUT kFLOAT input_1 3x368x640
1 OUTPUT kFLOAT conv2d_bbox 16x23x40
2 OUTPUT kFLOAT conv2d_cov/Sigmoid 4x23x40

0:01:37.109555400 24934 0x55a9815aa0 INFO nvinfer gstnvinfer_impl.cpp:313:notifyLoadModelStatus: [UID 1]: Load new model:dstestsr_pgie_config.txt sucessfully
Running…
Recording started…
In cb_newpad
NvMMLiteOpen : Block : BlockType = 261
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 261
reference in DPB was never decoded
0:01:41.193066648 24934 0x55a939b5e0 WARN nvinfer gstnvinfer.cpp:2288:gst_nvinfer_output_loop: error: Internal data stream error.
0:01:41.193125764 24934 0x55a939b5e0 WARN nvinfer gstnvinfer.cpp:2288:gst_nvinfer_output_loop: error: streaming stopped, reason error (-5)
ERROR from element primary-nvinference-engine: Internal data stream error.
Error details: /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvinfer/gstnvinfer.cpp(2288): gst_nvinfer_output_loop (): /GstPipeline:dstest-sr-pipeline/GstNvInfer:primary-nvinference-engine:
streaming stopped, reason error (-5)

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) – Jetson Nano
• DeepStream Version – 6.0
• JetPack Version (valid for Jetson only) – JetPack 4.6
• TensorRT Version – 8.0.1.6
• NVIDIA GPU Driver Version (valid for GPU only) – N/A (Jetson)
• Issue Type (questions, new requirements, bugs) – bug

It doesn't run: when I run the program, it outputs the error above.

• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and other details for reproducing.)

/opt/nvidia/deepstream/deepstream-6.0/sources/apps/sample_apps/deepstream-testsr$

sudo ./deepstream-testsr-app rtsp:// --enc-type=1 --sink-type=1 --bbox-enable=1 --sr-mode=0

• Requirement details (This is for new requirements. Include the module name – which plugin or which sample application – and the function description.)

I want to record a short video clip once someone is detected, send it to the AWS cloud (through Kafka or anything else that works), and then surface it in my mobile app as a notification.
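For the detection-to-cloud idea, here is a minimal sketch of the event message such a pipeline could publish once a clip is recorded. The field names, topic, and broker address are assumptions for illustration, not part of the DeepStream sample; the actual send (using the kafka-python library) is left commented out.

```python
import json
import time

# Hypothetical "person detected" event for a smart-record clip.
# Field names are illustrative assumptions, not a DeepStream schema.
def build_detection_event(camera_id, clip_path, label="person"):
    """Serialize the metadata a producer could publish alongside the clip."""
    return json.dumps({
        "camera_id": camera_id,
        "label": label,
        "clip_path": clip_path,        # e.g. an S3 key after uploading the clip
        "timestamp": int(time.time()),
    })

# Publishing would use kafka-python (assumed installed), for example:
# from kafka import KafkaProducer
# producer = KafkaProducer(bootstrap_servers="broker:9092")
# producer.send("detections",
#               build_detection_event("cam0", "clips/0001.mp4").encode())

print(build_detection_event("cam0", "clips/0001.mp4"))
```

A mobile notification service (e.g. a Lambda consumer) could then react to messages on the topic; that part is outside the sample app.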

You need to provide an RTSP URI with an H.264 video stream, not the literal rtsp://.
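To see why the bare rtsp:// argument fails, note that it names a scheme but no host or stream path. A quick sketch (the port and path in the second URI are assumptions, not a known-good stream):

```python
from urllib.parse import urlparse

# A usable RTSP source URI must at least carry a host, not just the scheme.
def has_stream_host(uri):
    parsed = urlparse(uri)
    return parsed.scheme == "rtsp" and bool(parsed.hostname)

print(has_stream_host("rtsp://"))                        # False: no host at all
print(has_stream_host("rtsp://10.42.0.68:8554/stream"))  # True: host, port, path
```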

Does this mean I can't stream from a Raspberry Pi Zero IP camera? Just need to be sure, because my IP camera was at 10.42.0.68 but it still didn't work.

A Raspberry Pi is not an RTSP camera out of the box. You may refer to this for turning it into an RTSP camera.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.