transform_ip buffer of custom GStreamer plugin is not what I expected

I implemented a custom GStreamer plugin with GstBaseTransform as its parent class, modeled on the dsexample plugin.

I want to access the buffer sent by nvarguscamerasrc. From what I understand, I can do this in place with the transform_ip method.
In this method I create a GstMapInfo from the GstBuffer object and print the buffer contents with the following code:

static GstFlowReturn gst_myplugin_transform_ip(GstBaseTransform *trans, GstBuffer *buf) {
  GstMYPlugin *myplugin = GST_MYPLUGIN(trans);
  GstMapInfo in_map_info;
  std::memset(&in_map_info, 0, sizeof(in_map_info));
  if (!gst_buffer_map(buf, &in_map_info, GST_MAP_READ)) {
    g_printerr("Could not map input buffer to info object\n");
    return GST_FLOW_ERROR;
  }
  g_print("n_buffer: %u\n", gst_buffer_n_memory(buf));
  /* dump the mapped data as rows of 32 bytes */
  for (gsize i = 0; i < in_map_info.size / 32; i++) {
    for (gsize j = 0; j < 32; j++) {
      g_print("%x ", in_map_info.data[i * 32 + j]);
    }
    g_print("\n\n");
  }
  g_print("\n\n\n\n");
  gst_buffer_unmap(buf, &in_map_info);
  GST_DEBUG_OBJECT(myplugin, "transform_ip");
  return GST_FLOW_OK;
}

But all I get from this is the following, repeated (with every frame from the camera, I assume):

n_buffer: 1
f0 f0 f0 f0 f0 f0 f0 f0 f0 f0 f0 f0 f0 f0 f0 f0 f0 f0 f0 f0 f0 f0 f0 f0 f0 f0 f0 f0 f0 f0 f0 f0 

f0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

f0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

f0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

f0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

f0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

f0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

f0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

f0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

f0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

f0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

f0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

f0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

f0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

f0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

f0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

f0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

f0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

f0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

f0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

f0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

f0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

f0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

f0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 20 0 

f0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 

f0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 80 0 0 0 

f0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 e 0 

f0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

f0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 80 0 0 3 0 0 0 

f0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 3 0 0 0 0 0 0 0 

f0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 20 0 0 e 0 0 0 0 0 

I test it with gst-launch-1.0:
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! myplugin ! fakesink

The caps that were negotiated are the following:

GST_ARGUS: Running with following settings:
   Camera index = 0                                                                                                   
   Camera mode  = 2 
   Output Stream W = 1920 H = 1080                                                                                    
   seconds to Run    = 0 
   Frame Rate = 29,999999                                                                                             
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.                                                                          
CONSUMER: Producer has connected; continuing.

So I would expect the buffer's size and content to be quite different.

What is my problem here?

Hi,
If you don’t need the DeepStream SDK in your use case, you can run

$ gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! fakesink

and implement your logic in the nvvidconv plugin. The plugin is open source and you can download it from:
https://developer.nvidia.com/embedded/linux-tegra
L4T Driver Package (BSP) Sources

If you use DeepStream SDK, please set bufapi-version=1 to nvarguscamerasrc.

Hi,
Thanks for the response.

I set bufapi-version=1 on nvarguscamerasrc and now get the following error when executing

gst-launch-1.0 nvarguscamerasrc bufapi-version=1 ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! myplugin ! omxh264enc ! qtmux ! filesink location=test.mp4 -e

The error is:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Framerate set to : 30 at NvxVideoEncoderSetParameterNvMMLiteOpen : Block : BlockType = 4 
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4 
H264: Profile = 66, Level = 40 
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 3264 x 2464 FR = 21,000000 fps Duration = 47619048 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 3264 x 1848 FR = 28,000001 fps Duration = 35714284 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1920 x 1080 FR = 29,999999 fps Duration = 33333334 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1640 x 1232 FR = 29,999999 fps Duration = 33333334 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 59,999999 fps Duration = 16666667 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 120,000005 fps Duration = 8333333 ; Analog Gain range min 1,000000, max 10,625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: Running with following settings:
   Camera index = 0 
   Camera mode  = 2 
   Output Stream W = 1920 H = 1080 
   seconds to Run    = 0 
   Frame Rate = 29,999999 
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
n_buffer: 1
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 90 43 4 94 7f 0 0 0 





NvMMLiteVideoEncDoWork: Surface resolution (0 x 0) smaller than encode resolution (1920 x 1080)
VENC: NvMMLiteVideoEncDoWork: 4319: BlockSide error 0x4
Event_BlockError from 0BlockAvcEnc : Error code - 4
Sending error event from 0BlockAvcEncERROR: from element /GstPipeline:pipeline0/GstOMXH264Enc-omxh264enc:omxh264enc-omxh264enc0: GStreamer encountered a general supporting library error.
Additional debug info:
/dvs/git/dirty/git-master_linux/3rdparty/gst/gst-omx/omx/gstomxvideoenc.c(1331): gst_omx_video_enc_loop (): /GstPipeline:pipeline0/GstOMXH264Enc-omxh264enc:omxh264enc-omxh264enc0:
OpenMAX component in error state Bad parameter (0x80001005)
EOS on shutdown enabled -- waiting for EOS after Error
Waiting for EOS...
n_buffer: 1
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 10 4a 4 94 7f 0 0 0 





ERROR: from element /GstPipeline:pipeline0/GstOMXH264Enc-omxh264enc:omxh264enc-omxh264enc0: GStreamer encountered a general supporting library error.
Additional debug info:
/dvs/git/dirty/git-master_linux/3rdparty/gst/gst-omx/omx/gstomxvideoenc.c(2346): gst_omx_video_enc_handle_frame (): /GstPipeline:pipeline0/GstOMXH264Enc-omxh264enc:omxh264enc-omxh264enc0:
OpenMAX component in error state Bad parameter (0x80001005)
ERROR: from element /GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstNvArgusCameraSrc:nvarguscamerasrc0:
streaming stopped, reason error (-5)

Do you know why this happens now?

Hi,
We have deprecated omx plugins. Please try nvv4l2h264enc.

If the issue is still present, please try

$ gst-launch-1.0 nvarguscamerasrc bufapi-version=1 ! 'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1' ! mx.sink_0 nvstreammux width=1920 height=1080 batch-size=1 name=mx ! dsexample ! fakesink

And then try

$ gst-launch-1.0 nvarguscamerasrc bufapi-version=1 ! 'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1' ! mx.sink_0 nvstreammux width=1920 height=1080 batch-size=1 name=mx ! myplugin ! fakesink

When executing your proposed gst-launch-1.0 pipeline

$ gst-launch-1.0 nvarguscamerasrc bufapi-version=1 ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! dsexample ! fakesink

nvarguscamerasrc reports that it failed to create the capture session:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Error generated. /dvs/git/dirty/git-master_linux/multimedia/nvgstreamer/gst-nvarguscamera/gstnvarguscamerasrc.cpp, execute:656 Failed to create CaptureSession

(gst-launch-1.0:12120): GStreamer-CRITICAL **: 09:33:03.737: gst_mini_object_set_qdata: assertion 'object != NULL' failed
Got EOS from element "pipeline0".
Execution ended after 0:00:00.002415931
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

Hi,
It has to be linked with nvstreammux. I have corrected the pipelines above; please give them a try.

Thanks for the quick reply.

I tried both:

$ gst-launch-1.0 nvarguscamerasrc bufapi-version=1 ! 'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1' ! mx.sink_0 nvstreammux width=1920 height=1080 batch-size=1 name=mx ! dsexample ! fakesink

and

$ gst-launch-1.0 nvarguscamerasrc bufapi-version=1 ! 'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1' ! mx.sink_0 nvstreammux width=1920 height=1080 batch-size=1 name=mx ! myplugin ! fakesink

Both are running now.

BUT, when I use my plugin and try to print the buffer in transform_ip (with the code from the original post), I only get:

n_buffer: 1
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 60 aa 1 70 7f 0 0 0 





n_buffer: 1
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 10 ac 1 70 7f 0 0 0 





n_buffer: 1
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 c0 ad 1 70 7f 0 0 0 





n_buffer: 1
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 b0 a8 1 70 7f 0 0 0 





n_buffer: 1
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 60 aa 1 70 7f 0 0 0 





n_buffer: 1
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 10 ac 1 70 7f 0 0 0 





n_buffer: 1
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 c0 ad 1 70 7f 0 0 0 





n_buffer: 1
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 b0 a8 1 70 7f 0 0 0 





n_buffer: 1
0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 

0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 4 0 0 0 0 0 0 0 60 aa 1 70 7f 0 0 0 


This still doesn’t look like the raw NV12 data that I am expecting.

How can I access the raw NV12 data?

Hi,
You can access the buffer through the NvBufSurface APIs. Please refer to

/opt/nvidia/deepstream/deepstream-5.1/sources/gst-plugins/gst-dsexample/gstdsexample.cpp
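The pattern in gstdsexample.cpp looks roughly like the following sketch. This is a from-memory outline based on the DeepStream 5.x nvbufsurface.h API, not a drop-in implementation — check the header and gstdsexample.cpp on your system for the exact signatures and error handling:

```cpp
#include <gst/gst.h>
#include "nvbufsurface.h"  // from the DeepStream SDK

// With bufapi-version=1, the mapped data is not raw pixels but an
// NvBufSurface descriptor; map and sync it before touching the NV12 planes.
static GstFlowReturn process_nvmm_buffer(GstBuffer *buf) {
  GstMapInfo in_map_info;
  if (!gst_buffer_map(buf, &in_map_info, GST_MAP_READ))
    return GST_FLOW_ERROR;

  NvBufSurface *surface = (NvBufSurface *) in_map_info.data;

  // Map frame 0, plane 0 for CPU access, then make CPU caches coherent.
  if (NvBufSurfaceMap(surface, 0, 0, NVBUF_MAP_READ_WRITE) != 0) {
    gst_buffer_unmap(buf, &in_map_info);
    return GST_FLOW_ERROR;
  }
  NvBufSurfaceSyncForCpu(surface, 0, 0);

  // surface->surfaceList[0].mappedAddr.addr[0] now points at the Y plane;
  // surface->surfaceList[0].planeParams.pitch[0] is the row stride.
  // ... modify the pixels in place here ...

  // Hand the buffer back to the hardware engines (GPU, NVENC, ...).
  NvBufSurfaceSyncForDevice(surface, 0, 0);
  NvBufSurfaceUnMap(surface, 0, 0);
  gst_buffer_unmap(buf, &in_map_info);
  return GST_FLOW_OK;
}
```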

So if I read this correctly, I need to interpret the mapped data as an NvBufSurface, sync it to CPU-accessible memory with NvBufSurfaceSyncForCpu, work on it, and then sync it back to the device with NvBufSurfaceSyncForDevice. I need to apply a domain-specific algorithm, which is not computationally expensive, to the NV12 data.
Am I correct with this assumption?

@DaneLLL I figured out that I need to map the buffer and sync it to the CPU first, before I can access it.

If I sync it to CPU-addressable memory and then back to the device again, is this still zero-copy, or is the buffer copied, with the associated time penalty?

Hi,
There is no memcpy in calling NvBufSurfaceSyncForCpu() and NvBufSurfaceSyncForDevice(). It is the same buffer, synchronized between the CPU and the other hardware engines (GPU, NVENC, …).