My question here is twofold:
a. Is there a way to avoid doing decoding/encoding in the first pipeline?
b. How can the second pipeline reconnect to camera after EOS?
That depends on your source: if your source already delivers an H264 stream, you don't need the decoding/encoding part.
You need to implement the reconnection code yourself. You can refer to our source code: watch_source_status in sources\apps\apps-common\src\deepstream_source_bin.c.
Thanks for your answer.
My source is indeed H264. How can I fine-tune the DeepStream bin/elements, via element properties or Python, so that decoding/encoding is avoided?
You can just use the pipeline 2 you attached. What exactly do you mean by fine-tuning the DeepStream bin/elements to avoid decoding/encoding via element properties or Python?
I know I can use pipeline 2, but it doesn't support rtsp-reconnect-interval the way nvurisrcbin does, and nvurisrcbin is not interchangeable with rtspsrc.
In short: I need the functionality of pipeline2 with the ability to reconnect with cameras.
You mentioned the watch_source_status source code, but I need it in Python. :-/
Thanks a lot
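A pure-Python sketch of the reconnection state machine, modeled loosely on watch_source_status in deepstream_source_bin.c (which runs a periodic timer, counts consecutive failed status checks, and rebuilds the source once a threshold is exceeded). The class names and callbacks here are hypothetical; in a real app you would drive tick() from GLib.timeout_add, implement check_alive by querying the rtspsrc/source bin state, and implement reconnect by setting the pipeline to NULL and relinking:

```python
class SourceWatchdog:
    """Polls a health check and triggers a reconnect after repeated failures.

    Hypothetical sketch: check_alive and reconnect are callbacks you supply,
    e.g. a GStreamer element-state query and a pipeline restart routine.
    """

    def __init__(self, check_alive, reconnect, max_failures=3):
        self.check_alive = check_alive    # returns True while the source is healthy
        self.reconnect = reconnect        # tears down and relinks the source
        self.max_failures = max_failures
        self.failures = 0

    def tick(self):
        """Call periodically (e.g. from GLib.timeout_add in a real app)."""
        if self.check_alive():
            self.failures = 0
            return
        self.failures += 1
        if self.failures >= self.max_failures:
            self.reconnect()
            self.failures = 0
```

This keeps the retry policy separate from GStreamer, so the same logic works whether the source is rtspsrc or nvurisrcbin.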
OK, I know about smart-record, but I'd like to know whether it can be triggered without any external message. I want to keep smart record running 24/7 but interrupt it whenever free disk space drops below a certain limit.
Thanks
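The disk-space trigger itself is independent of DeepStream. A minimal sketch using only the Python standard library (the wiring to actually stop/start the smart-record session, e.g. NvDsSRStop in the C samples, is left out and would be your own code):

```python
import shutil


def free_space_gb(path="/"):
    """Free disk space in GiB at the given mount point."""
    return shutil.disk_usage(path).free / (1024 ** 3)


def should_stop_recording(path="/", min_free_gb=10.0):
    """Return True when free space drops below the threshold,
    i.e. when the recording session should be interrupted."""
    return free_space_gb(path) < min_free_gb
```

You could poll should_stop_recording() from the same periodic timer that drives the rest of the app and stop the sink/recording branch when it returns True.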
You can achieve this by adding and removing the source.
Please refer to our README: opt\nvidia\deepstream\deepstream\sources\apps\sample_apps\deepstream-server\README
Thanks for your response.
What you're proposing is something of a workaround for a "recording-only" pipeline.
In my case the pipeline is nvmultiurisrcbin → nvinferserver → nvmsgbrokersinkbin, so there's no way to add/remove a source.
At present, the nvmultiurisrcbin plugin does not implement such a function. If you want to implement it on your end, please refer to opt\nvidia\deepstream\deepstream\sources\apps\sample_apps\deepstream-testsr.