DeepStream-based application for a multi-camera setup: advice on designing the pipeline

Please provide complete information as applicable to your setup.

• Hardware Platform (GPU)
• DeepStream Version: 7.1
• TensorRT Version:
• NVIDIA GPU Driver Version (valid for GPU only): 12.6

I am working on a DeepStream-based application for a multi-camera setup and need advice on designing the pipeline. Below is the structure of the job configuration our system uses and the challenges we face.

Job Template (Overview):

Each job consists of the following (a Python sketch of one job record follows the list):

  1. Job Metadata:
  • Job ID (id): Unique identifier for the job.
  • Name (name): Descriptive name for the task, e.g., PPE Demo.
  • Associated Site (site_id): ID linking this job to a physical location or project.
  • Status (status): Tracks the lifecycle of the job (e.g., initializing, active).
  • Created and Updated Timestamps (created_at, updated_at).
  2. Task Modules:
  • A set of enabled or disabled modules defining the types of tasks to perform. Examples:
    • PPE: Personal protective equipment detection (enabled in this job).
    • Forklift: Forklift detection (disabled).
    • Work at Height: Detection of hazardous work conditions (disabled).
  3. Streams:
  • Multiple video streams can be associated with a job. Each stream has:
    • Stream ID (id) and Name (name): Unique identifiers and labels.
    • RTSP URL (rtsp_url): The video stream to be processed.
    • Status (status): Indicates the stream’s state (e.g., active).
    • Task ID (task_id): N/A
    • Task Status (task_status): N/A
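
For concreteness, here is a minimal Python sketch of one job record with the fields described above (all values, and the exact dict shape, are hypothetical):

```python
# Hypothetical job record mirroring the template above.
job = {
    "id": "job-001",
    "name": "PPE Demo",
    "site_id": "site-12",
    "status": "active",
    "created_at": "2025-01-01T08:00:00Z",
    "updated_at": "2025-01-01T09:30:00Z",
    "modules": {  # task modules: True = enabled
        "ppe": True,
        "forklift": False,
        "work_at_height": False,
    },
    "streams": [
        {
            "id": "stream-01",
            "name": "gate-cam",
            "rtsp_url": "rtsp://camera-host/stream1",
            "status": "active",
            "task_id": None,      # N/A in the current schema
            "task_status": None,  # N/A in the current schema
        },
    ],
}
```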

Challenges:

  1. Single Pipeline vs. Multiple Pipelines:
  • Should we implement a single pipeline to handle all streams and modules, or separate pipelines for each job/stream?
  • In a single pipeline, how can we dynamically enable or disable models (e.g., skip PPE detection if “PPE” is disabled)? (See the sketch after this list.)
  2. Routing and Scalability:
  • For jobs with multiple streams and modules, what’s the best way to route streams to specific models (e.g., one stream processes only PPE, another processes multiple tasks)?
  • Can a single pipeline scale to handle 10–20 streams with varied tasks?
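
One common single-pipeline layout is to batch all streams with nvstreammux and then chain one nvinfer element per *enabled* module, so disabled modules are simply never built into the pipeline. A minimal sketch of that idea, assuming a job dict like the one above (the PGIE config file names are hypothetical; nvstreammux, nvinfer, and uridecodebin are standard DeepStream/GStreamer elements):

```python
#!/usr/bin/env python3
# Sketch: one pipeline, all streams batched by nvstreammux, one nvinfer
# per enabled task module. Config paths and the job shape are hypothetical.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)

job = {
    "modules": {"ppe": True, "forklift": False, "work_at_height": False},
    "streams": [
        {"rtsp_url": "rtsp://camera-host/stream1"},
        {"rtsp_url": "rtsp://camera-host/stream2"},
    ],
}

# Hypothetical module-name -> nvinfer config-file mapping.
PGIE_CONFIGS = {
    "ppe": "ppe_pgie.txt",
    "forklift": "forklift_pgie.txt",
    "work_at_height": "wah_pgie.txt",
}

pipeline = Gst.Pipeline.new("multi-cam-job")
streammux = Gst.ElementFactory.make("nvstreammux", "mux")
streammux.set_property("batch-size", len(job["streams"]))
streammux.set_property("width", 1280)
streammux.set_property("height", 720)
streammux.set_property("batched-push-timeout", 40000)
pipeline.add(streammux)

# One decode bin per RTSP stream, linked to a requested mux sink pad.
for i, stream in enumerate(job["streams"]):
    src = Gst.ElementFactory.make("uridecodebin", f"src-{i}")
    src.set_property("uri", stream["rtsp_url"])
    pipeline.add(src)

    def on_pad_added(_bin, pad, idx=i):
        # Only link video pads; uridecodebin also exposes audio pads.
        caps = pad.get_current_caps() or pad.query_caps(None)
        if caps.get_structure(0).get_name().startswith("video"):
            pad.link(streammux.get_request_pad(f"sink_{idx}"))

    src.connect("pad-added", on_pad_added)

# Chain one nvinfer per *enabled* module; disabled modules are simply
# never added, so they cost nothing at runtime.
last = streammux
for module, enabled in job["modules"].items():
    if not enabled:
        continue
    pgie = Gst.ElementFactory.make("nvinfer", f"pgie-{module}")
    pgie.set_property("config-file-path", PGIE_CONFIGS[module])
    pipeline.add(pgie)
    last.link(pgie)
    last = pgie

sink = Gst.ElementFactory.make("fakesink", "sink")  # replace with tiler/OSD/sink
pipeline.add(sink)
last.link(sink)

pipeline.set_state(Gst.State.PLAYING)
GLib.MainLoop().run()
```

If individual streams must skip certain models (per-stream routing rather than per-job), one option is to place an nvstreamdemux after the mux and build a separate branch per stream containing only that stream's models, keeping everything in one process while avoiding wasted inference.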

What does “task” mean here? The model?

In this context, task_id and task_status are not used anywhere and are not related to this…

Can you explain the “tasks” in this description?

“Tasks” here means another stream with multiple models.

Also, once the pipeline has initially started with one or two streams, is it possible to add more streams while the pipeline is running?
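
Yes. Adding (and removing) sources at runtime is what the DeepStream runtime_source_add_delete reference app demonstrates. A minimal sketch of the add path, reusing the pipeline and streammux from the sketch above (pad-index bookkeeping is simplified, and this is a hypothetical helper, not the reference app's API):

```python
def add_stream(pipeline, streammux, rtsp_url, pad_index):
    """Attach a new RTSP source while the pipeline is PLAYING.
    pad_index must be unused and below nvstreammux's batch-size."""
    src = Gst.ElementFactory.make("uridecodebin", f"src-{pad_index}")
    src.set_property("uri", rtsp_url)
    pipeline.add(src)

    def on_pad_added(_bin, pad):
        caps = pad.get_current_caps() or pad.query_caps(None)
        if caps.get_structure(0).get_name().startswith("video"):
            pad.link(streammux.get_request_pad(f"sink_{pad_index}"))

    src.connect("pad-added", on_pad_added)
    # Bring the new source up to the running pipeline's state.
    src.sync_state_with_parent()
```

Removal is the mirror image: send a flushing EOS into the source's mux pad, set the source bin to NULL, remove it from the pipeline, and release the sink pad with release_request_pad(). With the default (legacy) nvstreammux, set batch-size up front to the maximum number of streams you plan to attach.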