How do I get optical flow metadata from a pad buffer probe callback?

• Issue Type (questions, new requirements, bugs): Question
I am implementing a pad buffer probe callback similar to the one in deepstream_user_metadata_app, which can be seen below.

/* As in deepstream_user_metadata_app; USER_ARRAY_SIZE and frame_number
 * are defined at file scope in that sample. */
#include <gst/gst.h>
#include "gstnvdsmeta.h"

static GstPadProbeReturn
osd_sink_pad_buffer_probe (GstPad * pad, GstPadProbeInfo * info,
    gpointer u_data)
{
  GstBuffer *buf = (GstBuffer *) info->data;
  NvDsMetaList * l_frame = NULL;
  NvDsMetaList * l_user_meta = NULL;
  NvDsUserMeta *user_meta = NULL;
  gchar *user_meta_data = NULL;
  int i = 0;

  NvDsBatchMeta *batch_meta = gst_buffer_get_nvds_batch_meta (buf);

  for (l_frame = batch_meta->frame_meta_list; l_frame != NULL;
      l_frame = l_frame->next) {
    NvDsFrameMeta *frame_meta = (NvDsFrameMeta *) (l_frame->data);
    /* Validate user meta */
    for (l_user_meta = frame_meta->frame_user_meta_list; l_user_meta != NULL;
        l_user_meta = l_user_meta->next) {
      user_meta = (NvDsUserMeta *) (l_user_meta->data);
      user_meta_data = (gchar *)user_meta->user_meta_data;

      if(user_meta->base_meta.meta_type == NVDS_USER_FRAME_META_EXAMPLE)
      {
        g_print("\n************ Retrieving user_meta_data array of 16 on osd sink pad\n");
        for(i = 0; i < USER_ARRAY_SIZE; i++) {
          g_print("user_meta_data [%d] = %d\n", i, user_meta_data[i]);
        }
        g_print("\n");
      }
    }
    frame_number++;
  }
  return GST_PAD_PROBE_OK;
}

In my program this buffer probe is attached to the src pad of the nvof plugin, so the NvDsOpticalFlowMeta should be attached. How do I get this metadata and print it? For example, how would I get the optical flow vector at each 4x4 block for a specific frame?
To start, I changed this line:

if(user_meta->base_meta.meta_type == NVDS_USER_FRAME_META_EXAMPLE)

to this:

if(user_meta->base_meta.meta_type == NVDS_OPTICAL_FLOW_META)

I can confirm this if statement returns true, but I don’t know how to gather the optical flow metadata from the user_meta. From what I can tell, nvdsmeta.h doesn’t declare NvDsOpticalFlowMeta, so I don’t know how it relates to the other metadata structures. This also leads me to another question: would I need to add an include statement at the beginning of my app, like this, to get the NvDsOpticalFlowMeta structure?

#include "nvds_opticalflow_meta.h"

Additional Information: the app I’m working on is a modification of deepstream-nvof-test.c. My modified pipeline looks something like this:

uridecodebin -> nvstreammux -> nvof *-> tee -> queue2 -> nvofvisual -> tiler -> transform -> sink
                                         \_> queue1 -> nvmsgconv -> nvmsgbroker    

Once again, the probe is attached to the src pad of the nvof plugin, denoted by * above.
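
For reference, a minimal sketch of attaching the probe to the nvof element's src pad (the nvof variable name is illustrative; it is whatever handle the pipeline-building code holds for that element):

GstPad *of_src_pad = gst_element_get_static_pad (nvof, "src");
if (of_src_pad) {
  /* Call the probe above on every buffer leaving nvof. */
  gst_pad_add_probe (of_src_pad, GST_PAD_PROBE_TYPE_BUFFER,
      osd_sink_pad_buffer_probe, NULL, NULL);
  gst_object_unref (of_src_pad);
}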

Any help or input would be much appreciated.

Some follow-up questions:

Do you have any examples in C?
Also, would I have to add this include statement at the beginning of the test app?

#include "nvds_opticalflow_meta.h"

Also for clarification, in this code you shared,

m.def("get_optical_flow_vectors",
      [](void *data) {

what parameter is passed in as (void *data) when that method is called? Is it from the user meta? From the frame meta? Or from frame_user_meta_list? I am still confused about where exactly the *data comes from.

There is no C version; it is a sample that parses that data. Please refer to deepstream_python_apps/deepstream-opticalflow.py at master · NVIDIA-AI-IOT/deepstream_python_apps · GitHub
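
In that Python sample, the pointer handed to get_optical_flow_vectors appears to be the NvDsOpticalFlowMeta taken from user_meta->user_meta_data. A rough C sketch of equivalent parsing, under the same field-name assumptions as above (helper name is illustrative):

static void
dump_flow_vectors (NvDsOpticalFlowMeta *of_meta)
{
  /* of_meta->data is a contiguous buffer of rows * cols NvOFFlowVector
   * entries, one per 4x4 block. */
  NvOFFlowVector *vectors = (NvOFFlowVector *) of_meta->data;
  guint i;

  for (i = 0; i < of_meta->rows * of_meta->cols; i++) {
    g_print ("block %u: flowx = %d, flowy = %d\n",
        i, vectors[i].flowx, vectors[i].flowy);
  }
}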


The get_optical_flow_vectors method call leaves me with one more question. I now understand how to get the data and how to add it to an array, but I am still wondering how the values are laid out in that array. For a 1280x720 video I would have 320 columns and 180 rows, which means 57,600 blocks with motion vectors. Do the stored motion vectors traverse a row first or a column first?

For a simpler example, let’s imagine I had a video that was 16x12 pixels, which would then have 4 columns and 3 rows of blocks. Is the data stored row first, with blocks laid out as in Example A below?

[Image: Example A — blocks ordered row by row]

Or would the data be stored column first, as in Example B below?

[Image: Example B — blocks ordered column by column]
This would be helpful to know so that I will understand exactly where in the video, the flow vectors are coming from. Without knowing this the data would be useless.

It is Example A; the data is saved row by row (row-major order).
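
Given row-major storage, a small sketch of looking up the vector for a specific block; for a pixel (x, y) the covering 4x4 block is (block_row = y / 4, block_col = x / 4). The helper name is illustrative:

/* Row-major layout: the block at (block_row, block_col) in a
 * rows x cols grid sits at index block_row * cols + block_col.
 * For 1280x720 that is a 180 x 320 grid of 4x4 blocks. */
static NvOFFlowVector
flow_at_block (NvDsOpticalFlowMeta *of_meta, guint block_row, guint block_col)
{
  NvOFFlowVector *vectors = (NvOFFlowVector *) of_meta->data;
  return vectors[block_row * of_meta->cols + block_col];
}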


Ok, thank you for the responses. This has been helpful in gaining a better understanding of probing optical flow metadata.
