Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU): GPU
• DeepStream Version: 6.4
• JetPack Version (valid for Jetson only):
• TensorRT Version:
• NVIDIA GPU Driver Version (valid for GPU only): 535
• Issue Type (questions, new requirements, bugs): Question
• How to reproduce the issue? (This is for bugs. Include which sample app is used, the configuration file contents, the command line used, and other details for reproducing.)
• Requirement details (This is for new requirements. Include the module name — which plugin or sample application — and the function description.)
I have a pipeline that both records videos to disk and runs analysis whose results are saved to a database, and I would like to record which video file and frame each metadata entry corresponds to.
The current pipeline pseudo code:
source bin → tee → queue → recorder pipeline → splitmuxsink
tee → queue → nvstreammux → analysis pipeline → save to DB
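For illustration, the split might look roughly like the following gst-launch sketch. This is only a guess at the setup (element names, the H.264 assumption, the clip duration, and the `<camera>`/`<infer-config>` placeholders are all illustrative, and the database branch is stubbed with a fakesink); the actual recorder and analysis branches are not shown in the post:

```
gst-launch-1.0 rtspsrc location=rtsp://<camera> ! rtph264depay ! h264parse ! tee name=t \
  t. ! queue ! splitmuxsink muxer=mp4mux location=clip_%05d.mp4 max-size-time=60000000000 \
  t. ! queue ! nvv4l2decoder ! m.sink_0 \
  nvstreammux name=m batch-size=1 width=1920 height=1080 ! \
  nvinfer config-file-path=<infer-config> ! fakesink
```

The key point is that the recording branch takes the still-encoded stream before nvstreammux, so the clips keep the original resolution and carry no inference overlay.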
I don’t want to save the bounding boxes etc. with the video, so I chose to split the stream at the beginning, before it enters the nvstreammux, to record at the original video resolution.
I don’t know how to synchronize the analysis results with the recording.
I can get messages from the splitmuxsink with the current file location, but there are queues in between, so I can’t be sure whether the current frame in the analysis actually corresponds to that filename or whether the file has just changed.
The frame number that is added as metadata comes from the nvstreammux, right? And that frame number is not reset to 0 by the splitmuxsink when a new file is created. Or is it somehow possible to reset it?
The filename contains the system timestamp from when the file was created, and the nvstreammux adds another system timestamp as metadata on the frames sent for analysis. But because of the queues I still can’t be sure that the two timestamps correspond.
Do you have any creative ideas for how to save the metadata to the database and point out the filename and frame number for each frame?
I’m not sure if that answers the question, but it’s still a DeepStream question.
The pipeline records and analyses an RTSP stream, which has no start or end. That’s why the splitmuxsink is in place: to cut the recording into nicely sized video files. Each video file starts at frame number 0, while the nvstreammux keeps counting frames across files. The splitmuxsink splits video files by time or file size, but not by frame count.
I stream the analysis results to a database and would like to include the exact file and frame number that each result corresponds to. The recording part of the pipeline and the analysis part are not in sync because of threads and queues. That’s why I’m looking for a way to synchronize them, so I know which frame and file a result belongs to — or another solution entirely. Do you have a suggestion?
If you need to set up a relation, the file clips and the analysis results must both save some common piece of data that can serve as the key. Here are some ideas.
Save the timestamp. The frame meta’s buf_pts is the timestamp of the original GStreamer buffer. You can save the buf_pts with the clips and with the analysis results respectively; then you can relate the timestamp to a clip name and frame number.
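As a minimal sketch of the timestamp idea in plain Python: assume you have recorded the buf_pts of the first frame of each clip (e.g. when splitmuxsink opens a new file) and the buf_pts of each analysis result. The function names, the nanosecond PTS unit, and the constant-frame-rate assumption are all mine, not from the thread:

```python
import bisect

def build_clip_index(clips):
    """clips: list of (filename, first_buf_pts_ns) pairs.
    Returns the clips sorted by PTS plus a parallel list of start PTS values
    suitable for binary search."""
    clips = sorted(clips, key=lambda c: c[1])
    starts = [pts for _, pts in clips]
    return clips, starts

def locate(buf_pts, clips, starts, fps=30):
    """Map an analysis result's buf_pts (ns) to (clip filename, frame index
    within that clip). Assumes a constant frame rate; returns None if the
    PTS precedes the first clip."""
    i = bisect.bisect_right(starts, buf_pts) - 1
    if i < 0:
        return None
    name, start_pts = clips[i]
    frame_in_clip = round((buf_pts - start_pts) * fps / 1_000_000_000)
    return name, frame_in_clip
```

With a variable frame rate you would instead store the buf_pts of every frame per clip and search for an exact match, but the lookup structure stays the same.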
Save the encoded frame. You can store the encoded frame alongside the analysis result; then you can find that frame in the clips.
Thanks! I will try using the buf_pts; that seems like a nice way.
I implemented another more complicated method while hoping for an answer.
I start the pipeline together with a thread that watches for the resulting video clips and populates the database with file information. It also accumulates the frame numbers across the video clips.
e.g. file1 {265 frames, start_frame: 0, end_frame: 264}, file2 {260 frames, start_frame: 265, end_frame: 524}
I can then update the file information for each object whose frame number is between 265 and 524 to point to file2.
Only video clips that contain objects are uploaded to the server by the thread.
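The accumulation described above can be sketched like this (a simplified, hedged version of the thread's bookkeeping; function names and the in-memory index are my own, not the actual implementation, which writes to a database):

```python
def index_clips(frame_counts):
    """frame_counts: list of (filename, n_frames) in recording order.
    Returns a list of (filename, start_frame, end_frame) where the frame
    numbers are cumulative across the whole stream, matching nvstreammux's
    ever-increasing frame counter."""
    index, start = [], 0
    for name, n in frame_counts:
        index.append((name, start, start + n - 1))
        start += n
    return index

def clip_for_frame(index, frame_number):
    """Map a global nvstreammux frame number to (clip filename, frame index
    within that clip); returns None if the frame falls outside all clips."""
    for name, start, end in index:
        if start <= frame_number <= end:
            return name, frame_number - start
    return None
```

One caveat of this approach: if any frame is ever dropped between the recording branch and the analysis branch, the cumulative counts drift apart, whereas the buf_pts method above is immune to drops.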