Is it possible to store the stream simultaneously before and after inference while running DeepStream?

Please provide complete information as applicable to your setup.

• Tesla T4 (GPU)
• DeepStream 5.1
• TensorRT 7.2.2
• NVIDIA GPU Driver 460.32.03
• Questions
• I can store the stream with inference ON. I can store the stream with inference OFF. Can I do both with a stream from the same source?

How did you store the stream?

My message might have been misleading; what I really meant is that the video stream is being stored as video (mp4, mkv) using the standard DeepStream sink functions, not in the original format of the stream.
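For reference, a "File" sink group in the deepstream-app configuration is what I mean by the standard sink; the values below are only illustrative:

```
[sink0]
enable=1
# type=3 writes the stream to a file
type=3
# container: 1=mp4, 2=mkv
container=1
# codec: 1=h264, 2=h265
codec=1
bitrate=4000000
sync=0
source-id=0
output-file=output.mp4
```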

Sure, you can. Add a tee right after the decoder: one branch gets an encoder, a muxer, and a filesink before nvstreammux, while the other branch feeds nvstreammux and the inference pipeline as usual. A rough sketch follows the link below.

https://gstreamer.freedesktop.org/documentation/tutorials/index.html?gi-language=c
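As an untested sketch of that idea in Python with GStreamer (the RTSP URI, resolution, output file names, and the nvinfer config path are placeholders you would replace with your own):

```python
#!/usr/bin/env python3
# Rough sketch only: the decoded frames are split with a tee, one branch is
# recorded before nvstreammux/nvinfer and the other after inference.
# The RTSP URI, resolution, file names and nvinfer config path are placeholders.
import signal

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)

PIPELINE = (
    # Inference branch: batch, infer, draw the OSD, encode and write an MP4.
    "nvstreammux name=mux batch-size=1 width=1280 height=720 live-source=1 ! "
    "nvinfer config-file-path=config_infer_primary.txt ! "
    "nvvideoconvert ! nvdsosd ! nvvideoconvert ! "
    "nvv4l2h264enc ! h264parse ! qtmux ! filesink location=after_inference.mp4 "
    # Source, split with a tee before it ever reaches nvstreammux.
    "uridecodebin uri=rtsp://camera.example/stream ! tee name=t "
    # Branch 1: record the decoded stream as-is, before inference.
    "t. ! queue ! nvvideoconvert ! nvv4l2h264enc ! h264parse ! "
    "qtmux ! filesink location=before_inference.mp4 "
    # Branch 2: feed the batcher that drives inference.
    "t. ! queue ! nvvideoconvert ! video/x-raw(memory:NVMM) ! mux.sink_0"
)

pipeline = Gst.parse_launch(PIPELINE)
loop = GLib.MainLoop()


def on_message(bus, msg):
    # Stop on EOS or error so qtmux can finalize both MP4 files.
    if msg.type in (Gst.MessageType.EOS, Gst.MessageType.ERROR):
        loop.quit()


def on_sigint(*args):
    # Push EOS downstream on Ctrl-C; the loop quits once EOS reaches the bus.
    pipeline.send_event(Gst.Event.new_eos())
    return GLib.SOURCE_CONTINUE


bus = pipeline.get_bus()
bus.add_signal_watch()
bus.connect("message", on_message)
GLib.unix_signal_add(GLib.PRIORITY_DEFAULT, signal.SIGINT, on_sigint)

pipeline.set_state(Gst.State.PLAYING)
loop.run()
pipeline.set_state(Gst.State.NULL)
```

The tee splits the decoded frames: one branch is encoded and muxed straight to before_inference.mp4, the other is batched by nvstreammux, runs through nvinfer and nvdsosd, and is written to after_inference.mp4. Sending EOS before stopping lets qtmux write the MP4 index so both files are playable.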
