Which is the better way to store video on Jetson?

Hi,

I am using a Jetson Nano for inference with 4 RTSP CCTV cameras, running DeepStream PeopleNet on it and achieving around 25 fps. In another thread I am running a motion-detection-based video storage script. I have no issues with inference, but I have an issue with video storage: every minute or so, about 10 seconds of video is skipped and plays back too fast. Kindly let me know how I can store the videos to local storage without the skipping and fast-playback issues.

Hi,
Please follow the README to set up the environment:

/opt/nvidia/deepstream/deepstream-6.0/samples/configs/tao_pretrained_models/README.md

The sample config file is deepstream_app_source1_peoplenet.txt. By default it is single-source. Please try enabling sink1 to save to an MP4 and check whether the file achieves the target frame rate. Then try 2 sources, 3 sources, and 4 sources, to see how many sources trigger the frame-rate drop.
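For reference, a file sink in a deepstream-app config is a [sinkN] group with type=3. The values below are an illustrative sketch (not copied from the shipped sample config), so adjust the output path and bitrate to your setup:

```
[sink1]
enable=1
# type=3 writes the output to a file
type=3
# container: 1=mp4, 2=mkv
container=1
# codec: 1=h264, 2=h265
codec=1
sync=0
bitrate=4000000
output-file=out.mp4
source-id=0
```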

Hi,

Thanks for your response. I do not want to store the entire video stream; I want to store video only when motion is detected. This is for a CCTV camera storage application. Is there any way to do this?

Hi,
For this use case, you may try enabling smart recording. Please take a look at
Smart Video Record — DeepStream 6.1 Release documentation

Hi,

The example you gave is in C++… Can I get anything in Python?

Hi,
Currently smart recording is implemented in C. For Python, since the Python bindings are open source, you may check how the function works in C and port it to the Python bindings.

Ok

@DaneLLL I have referred to that C++ API and tried implementing it in Python; my script is attached below. I am able to store the video, and after storage the file size shows correctly, but I cannot open the video file.

four-channel-multistream_deepstream_record copy.py (15.1 KB)

Hi,
You may try matroskamux to mux into MKV. Note that an EoS must be sent to the muxer for the output to be finalized into a valid file.

Hi,

I tried the following changes but the same issue is still there. Kindly let me know if I did anything wrong.

Try 1:

muxer = Gst.ElementFactory.make("matroskamux", "mux")
sink = Gst.ElementFactory.make("filesink", "filesink")
sink.set_property('location', "output.mp4")

Try 2:

muxer = Gst.ElementFactory.make("matroskamux", "muxer")
sink = Gst.ElementFactory.make("filesink", "filesink")
sink.set_property('location', "output.mp4")

Try 3:

muxer = Gst.ElementFactory.make("qtmux", "mux")
sink = Gst.ElementFactory.make("filesink", "filesink")
sink.set_property('location', "output.mp4")

Hi,
You may check whether the constructed pipeline runs successfully as a gst-launch-1.0 command. You may refer to:
Nvmultistreamtiler/nvstreamdemux with omxh264enc won't work - #5 by DaneLLL

And try like:

… ! nvv4l2h264enc ! h264parse ! matroskamux ! filesink location=a.mkv

… ! nvv4l2h264enc ! h264parse ! mpegtsmux ! filesink location=a.ts

Hi,

I tried your suggestion. Now the video is stored and also plays properly. But now the problem is that with 4 camera streams, only a single MP4 file is created, with all 4 cameras tiled in a collage layout. I tried to store the video separately for each camera but couldn't achieve it. Can you suggest a solution for this?

four-channel-multistream_deepstream_record_copy.py (14.6 KB)

Hi,
For this use case, you would need to replace nvmultistreamtiler with the nvstreamdemux plugin. This is demonstrated in deepstream-app, and the source code is in

/opt/nvidia/deepstream/deepstream-6.0/sources/apps

If you run deepstream-app, you can modify the config file to run this use case. Here is an example:
How to save output videos deepstream app to individual files? And what is need to change in congif f... - #4 by DaneLLL

There is no existing Python sample, so you would need to refer to the C sample and apply the same approach in Python.

Hi,

Now I am able to save each camera's video separately, but I want to add motion detection to the storage. I am already running person-detection inference in the existing code, and I want to store video only when motion is detected. Can you tell me where I need to add the motion-detection logic, or is there another option for this?