Read video streams from a queue and feed them into DeepStream

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU)
• DeepStream Version
• JetPack Version (valid for Jetson only)
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)
• Issue Type (questions, new requirements, bugs): questions
• How to reproduce the issue? (For bugs: include which sample app is used, the configuration file contents, the command line used, and any other details needed to reproduce it.)
• Requirement details (For new requirements: include the module name, i.e. which plugin or which sample application, and a description of the function.)

Hello everyone,

I am looking to create a custom system, and my initial step is to generate video segments—such as 30-second clips—using FFmpeg.

Instead of saving these segments to disk, I plan to send them directly to a queue.

After that, I need to develop a custom DeepStream pipeline. DeepStream will pull the segments from the queue and feed them into the pipeline. Processing each segment should be quick, and DeepStream will repeat this cycle until every segment in the queue has been processed.
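To make this concrete, here is a rough sketch of what I have in mind for the producer side. The source URL, segment length, chunk size, and the use of an in-process Python queue.Queue are only placeholders, and the segments are cut on a wall-clock timer rather than exact media time:

```python
import queue
import subprocess
import time

# Placeholders: the real source URL, segment length, and chunk size are still open.
SRC_URL = "rtsp://camera.example/stream"
SEGMENT_SECONDS = 30
CHUNK_SIZE = 188 * 1024          # multiple of the 188-byte MPEG-TS packet size

# In-process stand-in for "the queue" mentioned above.
segment_queue: "queue.Queue[bytes]" = queue.Queue(maxsize=8)

def produce_segments() -> None:
    """Run ffmpeg once, remux to MPEG-TS on stdout, cut ~30 s byte blobs."""
    proc = subprocess.Popen(
        [
            "ffmpeg", "-nostdin", "-loglevel", "error",
            "-i", SRC_URL,
            "-c", "copy",              # no re-encode, just remux
            "-f", "mpegts", "pipe:1",  # write the TS stream to stdout
        ],
        stdout=subprocess.PIPE,
    )
    buf = bytearray()
    t0 = time.monotonic()
    while True:
        chunk = proc.stdout.read(CHUNK_SIZE)
        if not chunk:                  # ffmpeg exited or the source ended
            break
        buf.extend(chunk)
        if time.monotonic() - t0 >= SEGMENT_SECONDS:
            segment_queue.put(bytes(buf))   # hand one "clip" to the consumer
            buf.clear()
            t0 = time.monotonic()
    if buf:
        segment_queue.put(bytes(buf))
    segment_queue.put(b"")                  # empty blob = end-of-stream marker
    proc.wait()
```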

I also came across a similar question that was never answered: Using VMS like Network Optix with DeepStream

Is it feasible to accomplish this task?

You can try either of the following two approaches:

  1. Use ffmpeg to output RTSP streams directly, so that you can use our samples without modification (see the sketch after this list).
  2. Refer to our deepstream-appsrc-test sample and use the appsrc plugin to implement your needs. You will have to control the reading of the video data yourself.
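For the first option, a rough sketch could look like the following. It assumes an RTSP server (for example mediamtx) is already listening at rtsp://localhost:8554, and input.mp4 stands in for your source; the republished stream can then be consumed by an RTSP source in our samples (for example a type=4 source group in deepstream-app):

```python
import subprocess

SRC = "input.mp4"                            # placeholder input file or live source
RTSP_OUT = "rtsp://localhost:8554/mystream"  # placeholder mount point on the RTSP server

# Republish the input over RTSP without re-encoding; DeepStream then reads it
# like any other RTSP camera.
subprocess.run([
    "ffmpeg", "-re",                 # read the input at its native frame rate
    "-stream_loop", "-1",            # loop a file input endlessly (drop for live sources)
    "-i", SRC,
    "-c", "copy",                    # no re-encode
    "-f", "rtsp", "-rtsp_transport", "tcp",
    RTSP_OUT,
], check=True)
```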

I want to generate short videos (up to 30 seconds) and send them to the queue.

Then you can only choose the second option, using the appsrc plugin to implement that feature. Please refer to the deepstream-appsrc-test sample that I referred to above; a minimal sketch of the idea follows.
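This is not the deepstream-appsrc-test sample itself, just a sketch under a few assumptions: the queue is a Python queue.Queue filled with MPEG-TS byte blobs, as in the producer sketch above; the element names and caps are assumptions; and fakesink only marks where the DeepStream elements (nvstreammux, nvinfer, ...) would be attached.

```python
import queue
import threading

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)

# The queue the producer fills with MPEG-TS byte blobs (shown here for completeness).
segment_queue: "queue.Queue[bytes]" = queue.Queue()

# appsrc feeds raw MPEG-TS bytes; tsdemux/decodebin handle demuxing and decoding.
# Replace fakesink with the DeepStream part of the pipeline.
pipeline = Gst.parse_launch(
    "appsrc name=src is-live=true format=time caps=video/mpegts "
    "! tsdemux ! decodebin ! fakesink sync=false"
)
appsrc = pipeline.get_by_name("src")

def feed_from_queue() -> None:
    """Pull segments from the queue and push them into appsrc until EOS."""
    while True:
        blob = segment_queue.get()
        if not blob:                              # empty blob = end-of-stream marker
            appsrc.emit("end-of-stream")
            return
        ret = appsrc.emit("push-buffer", Gst.Buffer.new_wrapped(blob))
        if ret != Gst.FlowReturn.OK:              # downstream stopped or errored
            return

pipeline.set_state(Gst.State.PLAYING)
threading.Thread(target=feed_from_queue, daemon=True).start()

loop = GLib.MainLoop()
try:
    loop.run()
finally:
    pipeline.set_state(Gst.State.NULL)
```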

Let me check it. I think I should create a new topic for that specific question.
