Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU)
• DeepStream Version
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions, new requirements, bugs)
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and other details needed to reproduce the issue.)
• Requirement details (This is for new requirements. Include the module name, i.e. which plugin or which sample application, and the function description.)
Hello Team,
I am trying to run the DeepStream SDK on top of the Docker image provided by NVIDIA.
When I try to run DeepStream with an RTSP stream as the input, I get the error below.
Could you please help us accomplish this?
Docker image: nvcr.io/nvidia/deepstream:5.0.1-20.09-triton
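For context, the RTSP input is configured through the source group of the deepstream-app config file, roughly as shown below; the URI is a placeholder, not our actual camera address:

[source0]
enable=1
# type=4 selects an RTSP source in deepstream-app
type=4
# placeholder URI, replace with the real stream URL
uri=rtsp://<camera-ip>:554/stream1
num-sources=1
gpu-id=0
# 0: memtype_device, 1: memtype_pinned, 2: memtype_unified
cudadec-memtype=0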
Error Logs:
*** DeepStream: Launched RTSP Streaming at rtsp://localhost:10001/ds-test ***
Unknown or legacy key specified 'tlt-encode-model' for group [property]
Warning: 'input-dims' parameter has been deprecated. Use 'infer-dims' instead.
gstnvtracker: Loading low-level lib at /opt/nvidia/deepstream/deepstream-5.0/lib/libnvds_mot_klt.so
gstnvtracker: Optional NvMOT_RemoveStreams not implemented
gstnvtracker: Batch processing is OFF
gstnvtracker: Past frame output is OFF
WARNING: …/nvdsinfer/nvdsinfer_func_utils.cpp:36 [TRT]: Current optimization profile is: 0. Please ensure there are no enqueued operations pending in this context prior to switching profiles
0:00:02.064866178 10244 0x562a806eec60 INFO nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::deserializeEngineAndBackend() <nvdsinfer_context_impl.cpp:1701> [UID = 1]: deserialized trt engine from :/opt/nvidia/deepstream/deepstream-5.0/sources/test/experiment_dir_final_600epochs/resnet18_detector.trt
INFO: …/nvdsinfer/nvdsinfer_model_builder.cpp:685 [Implicit Engine Info]: layers num: 3
0 INPUT kFLOAT input_1 3x544x960
1 OUTPUT kFLOAT output_bbox/BiasAdd 8x34x60
2 OUTPUT kFLOAT output_cov/Sigmoid 2x34x60
0:00:02.064960032 10244 0x562a806eec60 INFO nvinfer gstnvinfer.cpp:619:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Info from NvDsInferContextImpl::generateBackendContext() <nvdsinfer_context_impl.cpp:1805> [UID = 1]: Use deserialized engine model: /opt/nvidia/deepstream/deepstream-5.0/sources/test/experiment_dir_final_600epochs/resnet18_detector.trt
0:00:02.072704945 10244 0x562a806eec60 INFO nvinfer gstnvinfer_impl.cpp:313:notifyLoadModelStatus:<primary_gie> [UID 1]: Load new model:/opt/nvidia/deepstream/deepstream-5.0/sources/final_check/ds_configs/primary_600.txt sucessfully
Runtime commands:
h: Print this help
q: Quit
p: Pause
r: Resume
NOTE: To expand a source in the 2D tiled display and view object details, left-click on the source.
To go back to the tiled display, right-click anywhere on the window.
** INFO: <bus_callback:181>: Pipeline ready
**PERF: FPS 0 (Avg)
**PERF: 0.00 (0.00)
**PERF: 0.00 (0.00)
**PERF: 0.00 (0.00)
**PERF: 0.00 (0.00)
ERROR from src_elem0: Could not open resource for reading and writing.
Debug info: gstrtspsrc.c(7469): gst_rtspsrc_retrieve_sdp (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin0/GstRTSPSrc:src_elem0:
Failed to connect. (Generic error)
** INFO: <reset_source_pipeline:1154>: Resetting source 0
**PERF: 0.00 (0.00)
**PERF: 0.00 (0.00)
**PERF: 0.00 (0.00)
**PERF: 0.00 (0.00)
ERROR from src_elem0: Could not open resource for reading and writing.
Debug info: gstrtspsrc.c(7469): gst_rtspsrc_retrieve_sdp (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin0/GstRTSPSrc:src_elem0:
Failed to connect. (Generic error)
**PERF: 0.00 (0.00)
**PERF: 0.00 (0.00)
Screenshot:
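For reference, the same stream URL can be sanity-checked from inside the container with a minimal GStreamer pipeline before launching deepstream-app (again, the URI below is a placeholder for our actual camera):

gst-launch-1.0 rtspsrc location=rtsp://<camera-ip>:554/stream1 latency=100 ! fakesink

And this is a sketch of how the container can be started so the RTSP port is reachable from inside it, assuming the standard flags for this image:

docker run --gpus all -it --rm --network=host nvcr.io/nvidia/deepstream:5.0.1-20.09-triton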