Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU): GPU
• DeepStream Version: 6.3
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only): 550
• Issue Type (questions, new requirements, bugs): question
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and other details for reproducing.)
• Requirement details (This is for new requirements. Include the module name, i.e. which plugin or which sample application, and the function description.)
Following up on my issue in the NVIDIA-AI-IOT GitHub repo,
"How to use RTSP Server as input" · Issue #36 · NVIDIA-AI-IOT/deepstream_lpr_app (github.com),
I tried to use an RTSP stream as input in lpr_app_infer_us_config.yml:
source-list:
  use-nvmultiurisrcbin: 1
  list: rtsp://192.168.11.244:8554/media.smp
source-attr-all:
  enable: 1
  type: 3
  num-sources: 1
  gpu-id: 0
  cudadec-memtype: 0
  latency: 100
  rtsp-reconnect-interval-sec: 0
.........................
When I run the command ./deepstream-lpr-app lpr_app_infer_us_config.yml, it fails with this error:
ERROR from element file_src_0: Resource not found.
Error details: gstfilesrc.c(532): gst_file_src_start (): /GstPipeline:pipeline/GstFileSrc:file_src_0:
No such file "rtsp://192.168.11.244:8554/media.smp"
Returned, stopping playback
Average fps 0.000233
Totally 0 plates are inferred
Deleting pipeline
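For what it's worth, the log shows a GstFileSrc complaining "No such file", which suggests the app handed the RTSP URI to a file source instead of a network source. A purely illustrative stdlib check of the URI scheme (the URL below is just the one from my config):

```python
from urllib.parse import urlparse

# The app appears to treat the source string as a local file path.
# A URI whose scheme is "rtsp" should be routed to an RTSP-capable
# source element, not to filesrc.
src = "rtsp://192.168.11.244:8554/media.smp"
scheme = urlparse(src).scheme
print(scheme)  # -> rtsp
```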
Please suggest how to use an RTSP server as the input of deepstream_lpr_app. Thank you.
Note: I have tested this RTSP server separately, and it works properly.