Smart record once an object is detected

Hardware/Software Info:

  • Hardware Platform (Jetson / GPU): Jetson Orin Nano

  • DeepStream Version: 7.1.0

  • JetPack Version: 6.1

  • TensorRT Version: 10.3.0.30

Hello,

I am working on a DeepStream 7.1.0 pipeline on a Jetson Orin Nano. I would like to implement Smart Recording such that:

  1. A vehicle is detected in a specific region of interest (ROI) in the frame.

  2. Recording starts automatically when the vehicle enters this zone.

  3. Recording stops automatically when the vehicle leaves this zone.

I want to do this fully on the Jetson Orin Nano, with hardware-accelerated encoding.

So far, I have tried calling NvDsSRStart and NvDsSRStop from a pad probe.

My questions:

  • Is there a recommended way to trigger Smart Recording per object in a specific ROI?

  • Are there examples or best practices for handling START/STOP?

  • Any Jetson-specific tips for ensuring smooth Smart Recording with DeepStream?

*** Please note that this reply is generated by LLM automatically ***

To implement Smart Recording on a Jetson Orin Nano with DeepStream, you can combine the hardware-accelerated decode path with DeepStream’s Smart Record module. Here’s a guide to help you achieve your goal:

Recommended Approach:

  1. Use the Smart Record module: DeepStream provides NvDsSRCreate, NvDsSRStart, and NvDsSRStop (and, for Python pipelines, the start-sr/stop-sr signals on nvurisrcbin) specifically for event-triggered recording; the deepstream-testsr sample shows the complete flow. (The nvmsgbroker plugin, by contrast, publishes event messages to external brokers such as Kafka or MQTT; it does not control recording.)
  2. Implement a custom pad probe: attach a probe downstream of the detector to inspect per-frame metadata, decide when a vehicle enters or leaves the zone, and start or stop the recording session accordingly.
  3. Use the nvdsanalytics plugin: it performs ROI filtering for you and publishes per-frame in-ROI object counts in NvDsAnalyticsFrameMeta, so the probe only needs to watch a counter.
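The ROI itself is typically defined in the configuration file of DeepStream’s nvdsanalytics plugin. A minimal fragment in the style of the deepstream-nvdsanalytics-test sample config might look like this (the resolution, stream index, and polygon coordinates are placeholders for your setup):

```ini
[property]
enable=1
# Resolution that the ROI coordinates below refer to
config-width=1920
config-height=1080

[roi-filtering-stream-0]
enable=1
# ROI polygon as x1;y1;x2;y2;... (placeholder coordinates)
roi-RF=400;200;1500;200;1500;900;400;900
# -1 counts objects of all classes inside the ROI
class-id=-1
inverse-roi=0
```

Objects falling inside this polygon are then counted per frame in NvDsAnalyticsFrameMeta.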

Handling START/STOP:

  1. Track ROI occupancy transitions: call NvDsSRStart when the in-ROI object count goes from 0 to greater than 0, and NvDsSRStop when it returns to 0; while the count stays above 0 and a session is active, do nothing.
  2. Guard the calls with state: keep a recording flag so you never issue a second start while a session is running, and add a short hysteresis so a vehicle flickering at the ROI boundary does not cause rapid start/stop cycles.

Jetson-specific tips for smooth Smart Recording:

  1. Avoid unnecessary re-encoding: Smart Record caches and writes the already-encoded bitstream, so recording an RTSP H.264 stream requires no encoder at all. This matters on the Orin Nano, which (unlike Orin NX/AGX) has no NVENC hardware encoder.
  2. Optimize the DeepStream pipeline: run inference at an interval (the nvinfer interval property), set sync=0 on the sink, and size the Smart Record video cache to cover the pre-event seconds you want in each clip.
  3. Watch power and thermals: select an appropriate nvpmodel power mode and run jetson_clocks if you see frame drops while recording.

Examples and Best Practices:

  1. NVIDIA’s DeepStream SDK: the SDK ships the deepstream-testsr sample (with a Python counterpart in deepstream_python_apps) that demonstrates Smart Record end to end.
  2. NVIDIA’s Jetson Orin Nano documentation: The Jetson Orin Nano documentation provides information on how to optimize the DeepStream pipeline and use hardware-accelerated encoding.
  3. DeepStream community forum: The DeepStream community forum is a great resource to ask questions and get help from other developers who have implemented Smart Recording on the Jetson Orin Nano.

Here’s a sketch of how you could trigger Smart Record from a custom pad probe. It uses the start-sr/stop-sr action signals of nvurisrcbin, the approach taken by the deepstream-testsr Python sample; count_objects_in_roi is a hypothetical helper standing in for real metadata parsing, and the signal arguments should be verified against your DeepStream version:

import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst, GLib

Gst.init(None)

# Create a DeepStream pipeline
pipeline = Gst.Pipeline.new('smart-record-pipeline')

# nvurisrcbin wraps the source; with smart-record enabled it exposes
# the 'start-sr' / 'stop-sr' action signals
source_bin = Gst.ElementFactory.make('nvurisrcbin', 'source-bin')
source_bin.set_property('uri', 'rtsp://<camera-ip>/stream')  # placeholder
source_bin.set_property('smart-record', 2)      # enable signal-based control
source_bin.set_property('smart-rec-dir-path', '/tmp')

recording = False  # current Smart Record state

def pad_probe(pad, info):
    """Toggle Smart Record based on ROI occupancy.

    Attach this to the src pad of nvdsanalytics so that
    NvDsAnalyticsFrameMeta is available on the buffers.
    """
    global recording
    # Hypothetical helper: walk the batch metadata and return how many
    # objects are currently inside the ROI (objInROIcnt)
    in_roi = count_objects_in_roi(info)
    if in_roi > 0 and not recording:
        # args: session id, seconds of pre-event video to include,
        # duration in seconds, user data; 'stop-sr' can end it early
        source_bin.emit('start-sr', 0, 5, 60, None)
        recording = True
    elif in_roi == 0 and recording:
        source_bin.emit('stop-sr', 0)
        recording = False
    return Gst.PadProbeReturn.OK  # always pass the buffer through

# Build and link the rest of the pipeline (nvstreammux, nvinfer,
# nvdsanalytics, sink), then attach the probe, e.g.:
#   analytics.get_static_pad('src').add_probe(
#       Gst.PadProbeType.BUFFER, pad_probe)

pipeline.add(source_bin)
pipeline.set_state(Gst.State.PLAYING)
GLib.MainLoop().run()

This sketch shows the control flow only: Smart Record caches the already-encoded stream and writes clips to smart-rec-dir-path, so an RTSP H.264 source is recorded without any re-encoding on the Jetson.

*** Please give the thumbs up if you get answers you like. Or provide feedback to help us improve the answer. ***

  1. You can use the NvDsSRStart / NvDsSRStop APIs to start and stop smart recording. Please refer to the native sample deepstream-testsr. Could you share your complete media pipeline?
  2. Regarding automatic start/stop: is there only one vehicle entering the ROI at a time?

Hi,

Thank you for your suggestion. I’m quite new to DeepStream and NVIDIA Smart Recording. I understand that NvDsSRStart and NvDsSRStop are used to start and stop the recording.

Regarding your question about the media pipeline, I’m currently using a single RTSP H.264 stream in a basic pipeline similar to the deepstream-testsr sample.

About the ROI detection, I’m still learning, so I’m not sure about handling multiple vehicles yet — for now, I just want to understand how to automatically start/stop recording when an object enters a predefined ROI.

Could you help me understand the steps for that as a beginner?

Thank you very much!

  1. Regarding ROI checking, please refer to the native sample deepstream-nvdsanalytics-test. After the ‘nvdsanalytics’ plugin has analyzed a frame, the results are available in NvDsAnalyticsFrameMeta. Please refer to parse_nvdsanalytics_meta_data for how to access NvDsAnalyticsFrameMeta, and to /opt/nvidia/deepstream/deepstream/sources/includes/nvds_analytics_meta.h for the structure definition.
  2. Regarding automatically starting/stopping the recording, here are some ideas:
    a> if meta->objInROIcnt changes from 0 to >0, some objects have entered the ROI: start recording.
    b> if meta->objInROIcnt stays >0 and recording is already on, do nothing.
    c> if meta->objInROIcnt changes from >0 to 0, all objects have left the ROI: stop recording.
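The a>/b>/c> transitions above form a small state machine. Below is an illustrative, GStreamer-free sketch of that logic; the class name and the hysteresis parameter are my own additions (not a DeepStream API), and the caller is expected to map the returned 'start'/'stop' events to NvDsSRStart/NvDsSRStop (or the start-sr/stop-sr signals):

```python
class ROIRecordingController:
    """Turns per-frame in-ROI object counts into start/stop events.

    Hysteresis: the ROI must stay empty for `empty_frames_to_stop`
    consecutive frames before 'stop' is emitted, so a brief detector
    dropout does not cut a recording short.
    """

    def __init__(self, empty_frames_to_stop=30):
        self.empty_frames_to_stop = empty_frames_to_stop
        self.recording = False
        self._empty_streak = 0

    def update(self, obj_in_roi_cnt):
        """Feed one frame's count; returns 'start', 'stop', or None."""
        if obj_in_roi_cnt > 0:
            self._empty_streak = 0
            if not self.recording:
                self.recording = True      # a> 0 -> >0: start recording
                return 'start'
            return None                    # b> still occupied: do nothing
        if self.recording:
            self._empty_streak += 1
            if self._empty_streak >= self.empty_frames_to_stop:
                self.recording = False     # c> >0 -> 0 (debounced): stop
                self._empty_streak = 0
                return 'stop'
        return None
```

In a pad probe you would call controller.update() with the current in-ROI count once per frame and issue the corresponding Smart Record call whenever it returns 'start' or 'stop'.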

There is no update from you for a period, assuming this is not an issue anymore. Hence we are closing this topic. If need further support, please open a new one. Thanks.