• Hardware Platform (Jetson / GPU): Jetson
• DeepStream Version: 7
• Issue Type (questions, new requirements, bugs): Question
I have multiple RTSP sources that I tee right after h264parse. One branch of the tee is decoded and fed into nvstreammux and nvinfer; the other branch goes into hlssink2 for video persistence. My problem is that I need to correlate the frames coming out of nvinfer on one branch with the frames going into hlssink2 on the other. The pipeline looks like this (a single-source C sketch follows the branch list):
rtspsrc >> rtph264depay >> h264parse >> tee
(branch 1) tee >> queue >> hlssink2
(branch 2) tee >> nvv4l2decoder >> queue >> nvstreammux >> nvinfer
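For reference, here is a minimal single-source sketch of that topology in C. The RTSP URI, HLS file paths, and nvinfer config path are placeholders, and the exact header/library providing configure_source_for_ntp_sync() depends on the DeepStream SDK version:

```c
#include <gst/gst.h>

/* Prototype of the DeepStream helper referred to in this post; where it is
 * declared and which library exports it depends on the SDK version. */
extern void configure_source_for_ntp_sync (GstElement *src_elem);

int
main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  /* Single-source sketch of the topology above; URI, paths, and the
   * nvinfer config are placeholders. */
  GError *err = NULL;
  GstElement *pipeline = gst_parse_launch (
      "nvstreammux name=m batch-size=1 width=1920 height=1080 attach-sys-ts=0 ! "
      "nvinfer config-file-path=infer_config.txt ! fakesink "
      "hlssink2 name=hls playlist-location=/tmp/playlist.m3u8 "
      "location=/tmp/segment%05d.ts "
      "rtspsrc name=src location=rtsp://camera/stream ! rtph264depay ! "
      "h264parse ! tee name=t "
      "t. ! queue ! hls.video "
      "t. ! queue ! nvv4l2decoder ! m.sink_0",
      &err);
  if (!pipeline) {
    g_printerr ("Failed to build pipeline: %s\n", err ? err->message : "?");
    return 1;
  }

  /* Make rtspsrc attach RTCP-derived NTP times (paired with
   * attach-sys-ts=0 on nvstreammux). */
  GstElement *src = gst_bin_get_by_name (GST_BIN (pipeline), "src");
  configure_source_for_ntp_sync (src);
  gst_object_unref (src);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);
  GstBus *bus = gst_element_get_bus (pipeline);
  GstMessage *msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
  if (msg)
    gst_message_unref (msg);
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (bus);
  gst_object_unref (pipeline);
  return 0;
}
```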
I have NTP times successfully attached to the frames going down the nvstreammux/nvinfer branch (attach-sys-ts=0 on nvstreammux plus a call to configure_source_for_ntp_sync() on each rtspsrc), and I can pull them off a pad of nvinfer.
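For reference, a minimal probe sketch that reads those values, assuming the standard NvDsFrameMeta fields (source_id, frame_num, buf_pts, ntp_timestamp):

```c
#include <gst/gst.h>
#include "gstnvdsmeta.h"

/* Probe on nvinfer's src pad: read the NTP timestamp that nvstreammux
 * attached to each frame's NvDsFrameMeta. */
static GstPadProbeReturn
infer_src_probe (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  GstBuffer *buf = GST_PAD_PROBE_INFO_BUFFER (info);
  NvDsBatchMeta *batch = gst_buffer_get_nvds_batch_meta (buf);
  if (!batch)
    return GST_PAD_PROBE_OK;

  for (NvDsMetaList *l = batch->frame_meta_list; l != NULL; l = l->next) {
    NvDsFrameMeta *fm = (NvDsFrameMeta *) l->data;
    g_print ("source %u frame %d buf_pts=%" GST_TIME_FORMAT
        " ntp=%" G_GUINT64_FORMAT "\n",
        fm->source_id, fm->frame_num,
        GST_TIME_ARGS ((GstClockTime) fm->buf_pts), fm->ntp_timestamp);
  }
  return GST_PAD_PROBE_OK;
}

/* Attach with:
 * gst_pad_add_probe (gst_element_get_static_pad (nvinfer, "src"),
 *     GST_PAD_PROBE_TYPE_BUFFER, infer_src_probe, NULL, NULL);
 */
```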
The problem is that the NTP time is attached to the buffer at the output of nvstreammux, so I don't have an NTP time on the other branch to do the correlation. In another thread I saw that nvstreammux is open source, so I thought about duplicating its code in a component placed before hlssink2, but that looks like a heavy lift. Do I need to go through that, or is there another way to correlate?
This seems like it would be a common problem when using GStreamer with inference, and I looked for any kind of built-in GStreamer frame number or ID that is carried through the whole pipeline, but I couldn't find anything. I tried adding my own correlation ID as qdata (via g_object_set_qdata); it makes it down branch 1, but it is dropped at nvstreammux on branch 2.
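A rough sketch of that tagging attempt (note that GstBuffer is a GstMiniObject rather than a GObject, so gst_mini_object_set_qdata() is the variant that applies to buffers; the quark name here is arbitrary):

```c
#include <gst/gst.h>

/* Sketch: tag each buffer with a correlation id at the tee's sink pad.
 * nvstreammux produces new batched buffers downstream, so this qdata does
 * not survive it on branch 2 (matching what I observed). */
static GstPadProbeReturn
tag_correlation_id (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  static gsize next_id = 0;  /* per-source probe, so unsynchronized is fine here */
  GstBuffer *buf = GST_PAD_PROBE_INFO_BUFFER (info);

  /* The buffer must be writable before metadata can be attached. */
  buf = gst_buffer_make_writable (buf);
  GST_PAD_PROBE_INFO_DATA (info) = buf;

  gst_mini_object_set_qdata (GST_MINI_OBJECT (buf),
      g_quark_from_static_string ("correlation-id"),
      GSIZE_TO_POINTER (next_id++), NULL);
  return GST_PAD_PROBE_OK;
}

/* Readers on either branch would use gst_mini_object_get_qdata() with the
 * same quark. */
```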
I think I have a solution to this now, but I want to double-check my understanding of the various timestamps. Once I get ntp_timestamp on the NvDsFrameMeta, I noticed that pts and ntp_timestamp have the same delta from frame to frame, so they both seem to be sourced from the NTP time coming through rtspsrc. pts appears to be just the running time of the pipeline (but derived from NTP), whereas ntp_timestamp is the full wall-clock time. Furthermore, the PTS appears to be the same throughout the entire pipeline, even on branches that don't go through nvstreammux. So my strategy is, for each rtspsrc (and its corresponding stream id), to compute the delta between pts and ntp_timestamp from a pad downstream of nvstreammux. Then, anywhere in the pipeline, I can recover a sample's NTP time by adding that delta to its PTS. PTS alone would solve the correlation, but I'm ultimately after the source (NTP) time when persisting video out, so I know the source time of each frame. So far I haven't seen any drift between the two. Does this sound like a good approach?
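A minimal sketch of that strategy, assuming (as observed above) that PTS matches across the two branches; MAX_SOURCES and the probe attachment points are placeholders:

```c
#include <gst/gst.h>
#include "gstnvdsmeta.h"

/* Per-source offset (NTP - PTS), learned on the inference branch.
 * MAX_SOURCES is a placeholder; a real version would guard with a lock. */
#define MAX_SOURCES 16
static guint64 ntp_minus_pts[MAX_SOURCES];

/* Probe downstream of nvstreammux: record delta = ntp_timestamp - buf_pts
 * for each source. */
static GstPadProbeReturn
learn_delta (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  NvDsBatchMeta *batch =
      gst_buffer_get_nvds_batch_meta (GST_PAD_PROBE_INFO_BUFFER (info));
  if (!batch)
    return GST_PAD_PROBE_OK;
  for (NvDsMetaList *l = batch->frame_meta_list; l != NULL; l = l->next) {
    NvDsFrameMeta *fm = (NvDsFrameMeta *) l->data;
    if (fm->ntp_timestamp != 0 && fm->source_id < MAX_SOURCES)
      ntp_minus_pts[fm->source_id] = fm->ntp_timestamp - fm->buf_pts;
  }
  return GST_PAD_PROBE_OK;
}

/* Probe on the HLS branch of one source (source id passed as user_data):
 * reconstruct the frame's NTP time as PTS + delta. */
static GstPadProbeReturn
recover_ntp (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  guint source_id = GPOINTER_TO_UINT (user_data);
  GstBuffer *buf = GST_PAD_PROBE_INFO_BUFFER (info);

  if (source_id < MAX_SOURCES && ntp_minus_pts[source_id] != 0 &&
      GST_BUFFER_PTS_IS_VALID (buf)) {
    guint64 ntp = GST_BUFFER_PTS (buf) + ntp_minus_pts[source_id];
    g_print ("source %u pts=%" GST_TIME_FORMAT " ntp=%" G_GUINT64_FORMAT "\n",
        source_id, GST_TIME_ARGS (GST_BUFFER_PTS (buf)), ntp);
  }
  return GST_PAD_PROBE_OK;
}
```

One caveat: the delta is only known once the first batch has passed nvstreammux, so the HLS-branch probe has to tolerate a brief startup window before it can stamp frames.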
The two branches after the tee are not related. You generate two live streams, and each stream has its own timing.

The frames in the two branches have the same timestamps (PTS) before they are muxed and sent out. There is no method to correlate the two live streams with the standard protocols.
There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.