With multi-source input, how should I get information about the detection targets across all of the sources?

  • Jetson Orin

NvDsBatchMeta can contain frames from several points in time, so if I iterate its NvDsFrameMetaList directly, the source_id values I get may be duplicated. How should I use this metadata to get the data of all channels exactly once?
Example: with three input channels, a batch may carry two frames per source, so when I traverse the NvDsFrameMetaList of NvDsBatchMeta the source_id sequence may be 0, 1, 2, 0, 1, 2. But I need just 0, 1, 2.
Purpose: to ensure that the detection results output for all channels correspond to a single moment in time, one frame per channel.
I am not sure whether I understand NvDsBatchMeta and NvDsFrameMetaList correctly. If this is the right way to handle it, how should I implement my requirement?
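For reference, this is roughly how I am traversing the batch meta at the moment; the per-source deduplication is only my own attempt, and NUM_SOURCES and the probe callback name are placeholders:

```c
#include <gst/gst.h>
#include "gstnvdsmeta.h"

#define NUM_SOURCES 3   /* placeholder: my three input channels */

static GstPadProbeReturn
batch_probe_cb (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
  GstBuffer *buf = (GstBuffer *) info->data;
  NvDsBatchMeta *batch_meta = gst_buffer_get_nvds_batch_meta (buf);
  if (!batch_meta)
    return GST_PAD_PROBE_OK;

  gboolean seen[NUM_SOURCES] = { FALSE };

  for (NvDsMetaList *l_frame = batch_meta->frame_meta_list; l_frame != NULL;
       l_frame = l_frame->next) {
    NvDsFrameMeta *frame_meta = (NvDsFrameMeta *) l_frame->data;
    guint src = frame_meta->source_id;

    /* Keep only the first frame of each source in this batch, so one pass
     * yields at most one frame per channel (0, 1, 2). */
    if (src < NUM_SOURCES && !seen[src]) {
      seen[src] = TRUE;
      /* read frame_meta->obj_meta_list here for the detections */
    }
  }
  return GST_PAD_PROBE_OK;
}
```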

Do you mean two batched buffers, or two frames in the same buffer? The output is a set of batched frames in one GstBuffer.

I mean multiple frames of the same source in the same buffer.
Known: NvDsBatchMeta -> NvDsFrameMetaList.
I need to get the data of every source exactly once from NvDsBatchMeta. How should I get it?
NvDsFrameMetaList may contain many frames, and they may not be arranged in source_id order, nor contain exactly one frame per channel. For example, it may contain source0, source1, source2, source0, while what I need is just source0, source1, source2.

This is determined by the plugin that batches the frames, i.e. Gst-nvstreammux. You can try the Gst-nvstreammux New plugin and set the max-same-source-frames=1 parameter in its mux config file.

  1. The Gst-nvstreammux New plugin does not seem to expose a max-same-source-frames property.
  2. The Gst-nvmultistreamtiler plugin documentation says the tiles are "obtained from NvDsBatchMeta and NvDsFrameMeta in row-major order (starting from source 0, left to right across the top row, then across the next row). Each source frame is scaled to the corresponding location in the tiled output buffer." How does it determine the channel order?

1. There is a config file for Gst-nvstreammux New; you can refer to samples\configs\deepstream-app\config_mux_source4.txt and the mux-config-properties section of its documentation. A sketch of how the file is attached is shown after this list.
2. It gets the channel order from the source_id. When a frame from source x arrives, it is sent to a fixed position in the tiled display.
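As a minimal sketch of item 1 (the config-file-path property name and the keys in the comment should be verified against config_mux_source4.txt shipped with the SDK; the new mux is selected by exporting USE_NEW_NVSTREAMMUX=yes before running the application):

```c
#include <gst/gst.h>

static GstElement *
make_streammux (void)
{
  /* The new Gst-nvstreammux is picked up when USE_NEW_NVSTREAMMUX=yes is
   * set in the environment. */
  GstElement *streammux = gst_element_factory_make ("nvstreammux", "mux");

  /* "config-file-path" points the new nvstreammux at a mux config file;
   * the path here is illustrative.  Inside that file, the [property]
   * group carries keys such as (verify against config_mux_source4.txt):
   *
   *   [property]
   *   algorithm-type=1
   *   batch-size=3
   *   max-same-source-frames=1
   */
  g_object_set (G_OBJECT (streammux),
      "config-file-path", "config_mux_source4.txt", NULL);

  return streammux;
}
```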

  1. The default value of max-same-source-frames is 1. Can I take this to mean that NvDsBatchMeta -> NvDsFrameMetaList then contains only one frame per channel in each batch, packed together?
    For example, with three channels: if max-same-source-frames=1 is set, the batch passed downstream is source0, source1, source2, one frame each combined into a batch; if I set max-same-source-frames=2, the batch passed downstream is source0, source1, source2, source0, source1, source2. Is that understanding correct?
  2. If that is correct, does setting max-same-source-frames too large lead to high GPU utilization while keeping real-time performance good? What are the disadvantages of setting the value too large?

No. This is the maximum number of frames per source in a batch. The actual number may be less than this value.
For your requirement, you can create a separate buffer per source and save the frames from each source into it. This ensures that the frames in each buffer are from the same source and in order.
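A rough sketch of that idea, assuming three channels; the FrameRecord struct, the queue names and NUM_SOURCES are invented for this example. One GQueue per source is filled from the batch probe and drained only when every channel has an entry, so each drain yields exactly one frame per source:

```c
#include <glib.h>
#include "nvdsmeta.h"

#define NUM_SOURCES 3                       /* assumed channel count */

typedef struct {                            /* made-up record type   */
  guint   source_id;
  gint    frame_num;
  guint64 buf_pts;
} FrameRecord;

/* One queue per channel; create each with g_queue_new() at startup. */
static GQueue *per_source_q[NUM_SOURCES];

static void
enqueue_frame (NvDsFrameMeta *frame_meta)
{
  if (frame_meta->source_id >= NUM_SOURCES)
    return;
  FrameRecord *rec = g_new0 (FrameRecord, 1);
  rec->source_id = frame_meta->source_id;
  rec->frame_num = frame_meta->frame_num;
  rec->buf_pts   = frame_meta->buf_pts;
  g_queue_push_tail (per_source_q[rec->source_id], rec);
}

static gboolean
all_sources_ready (void)
{
  for (guint i = 0; i < NUM_SOURCES; i++)
    if (g_queue_is_empty (per_source_q[i]))
      return FALSE;
  return TRUE;
}

static void
drain_one_set (void)
{
  /* Pops one record per channel: source 0, 1, 2 for a single "moment". */
  for (guint i = 0; i < NUM_SOURCES; i++) {
    FrameRecord *rec = g_queue_pop_head (per_source_q[i]);
    /* process the synchronized set here */
    g_free (rec);
  }
}
```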

Q1: "The actual number may be less than this value." Is the number of frames per source random, as long as it does not exceed the max-same-source-frames value that was set?

Q2: The Gst-nvmultistreamtiler documentation says the tiles are "obtained from NvDsBatchMeta and NvDsFrameMeta in row-major order (starting from source 0, left to right across the top row, then across the next row). Each source frame is scaled to the corresponding location in the tiled output buffer."
For example, if there are 3 channels, max-same-source-frames=6, and the actual size of NvDsBatchMeta -> NvDsFrameMetaList is 5 (source0, source1, source2, source0, source1), the tiler still takes the channel order from the source_id and sends each frame to a fixed display position. In that case the channels may get out of sync, since one frame of source2 is missing.
How does the Gst-nvmultistreamtiler plugin avoid this out-of-sync, possibly flickering problem?

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

Q1: Yes. This is only a limit on the maximum value.
Q2: Each source's frames can be handled individually through its own buffer, so there is no need to synchronize between different sources.
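For reference, the row-major mapping described in the tiler documentation can be pictured roughly like this (the helper type, grid size and output resolution are illustrative): every source_id maps to one fixed tile, so a source that contributes no frame in a given batch simply leaves its tile unchanged instead of shifting the other channels around:

```c
#include <glib.h>

typedef struct { guint x, y, w, h; } TileRect;   /* made-up helper type */

/* Row-major placement: left to right across the top row, then the next
 * row, starting from source 0. */
static TileRect
tile_for_source (guint source_id, guint columns, guint rows,
                 guint out_width, guint out_height)
{
  TileRect r;
  guint tile_w = out_width / columns;
  guint tile_h = out_height / rows;
  r.x = (source_id % columns) * tile_w;
  r.y = (source_id / columns) * tile_h;
  r.w = tile_w;
  r.h = tile_h;
  return r;
}
```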
