How to use RTSP Server as input. (deepstream_lpr_app in NVIDIA-AI-IOT)

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) GPU
• DeepStream Version 6.3
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only) 550
• Issue Type( questions, new requirements, bugs) question
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)
• Requirement details( This is for new requirement. Including the module name-for which plugin or for which sample application, the function description)

As described in my issue in the NVIDIA-AI-IOT GitHub repo:
How to use RTSP Server as input · Issue #36 · NVIDIA-AI-IOT/deepstream_lpr_app

I tried to use RTSP as the input in lpr_app_infer_us_config.yml:

  use-nvmultiurisrcbin: 1
  list: rtsp://

  enable: 1
  type: 3
  num-sources: 1
  gpu-id: 0
  cudadec-memtype: 0
  latency: 100
  rtsp-reconnect-interval-sec: 0
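For reference, in deepstream-app's YAML config these keys normally live in two groups, roughly as sketched below. The group names (`source-list`, `source`) and the camera URL are assumptions based on the deepstream-app nvmultiurisrcbin documentation, not taken from deepstream_lpr_app, and whether the LPR app parses these groups at all is exactly the open question here:

```yaml
source-list:
  use-nvmultiurisrcbin: 1
  # placeholder URL; multiple URIs are separated with ';'
  list: rtsp://<camera-ip>:8554/<stream>

source:
  enable: 1
  type: 3              # 3 = URI source
  num-sources: 1
  gpu-id: 0
  cudadec-memtype: 0   # 0 = device memory
  latency: 100
  rtsp-reconnect-interval-sec: 0
```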

When I run the command ./deepstream-lpr-app lpr_app_infer_us_config.yml, it shows this error:

ERROR from element file_src_0: Resource not found.
Error details: gstfilesrc.c(532): gst_file_src_start (): /GstPipeline:pipeline/GstFileSrc:file_src_0:
No such file "rtsp://"
Returned, stopping playback
Average fps 0.000233
Totally 0 plates are inferred
Deleting pipeline

Please suggest how to use an RTSP server as the input of deepstream_lpr_app. Thank you.

Note: I have tested this RTSP server and it is working properly.

There is no update from you for a period, assuming this is not an issue any more. Hence we are closing this topic. If need further support, please open a new one. Thanks.
filesrc can’t support an RTSP source. This code is open source; you can use the uridecodebin or nvurisrcbin plugin instead. Please refer to the deepstream-test3 sample.


This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.