How to get a picture from NV12 (memory:NVMM) in an nvbuf?

Platform: Jetson NX
JetPack: 4.6
CUDA: 10.2
OpenCV: 3.4.1

I'm trying to get an RGB picture from the nvbuf, but I don't know how to deal with the dmabuf_fd.
The nvbuf comes from a pad probe on the "nvvidconv" element, format NV12, video/x-raw(memory:NVMM).

static GstFlowReturn new_buffer(GstAppSink *appsink, gpointer user_data)
{
    GstSample *sample = NULL;

    /* "pull-sample" returns the sample through the trailing out argument */
    g_signal_emit_by_name (appsink, "pull-sample", &sample);

    if (sample)
    {
        GstBuffer *buffer = NULL;
        GstCaps   *caps   = NULL;
        GstMapInfo map    = {0};
        int dmabuf_fd = 0;

        caps = gst_sample_get_caps (sample);
        if (!caps)
        {
            printf ("could not get snapshot format\n");
            gst_sample_unref (sample);
            return GST_FLOW_ERROR;
        }
        buffer = gst_sample_get_buffer (sample);
        gst_buffer_map (buffer, &map, GST_MAP_READ);

        /* get the dmabuf fd backing the NVMM buffer */
        ExtractFdFromNvBuffer ((void *)map.data, &dmabuf_fd);

        gst_buffer_unmap (buffer, &map);
        gst_sample_unref (sample);
    }
    else
    {
        g_print ("could not make snapshot\n");
    }

    return GST_FLOW_OK;
}

So, how can I get the picture out of the nvbuf and then save it?
Moreover, I need to convert NV12 to RGB, so how can I do the format conversion on the GPU and then download the result to the CPU?

Hi,
Please check the suggestion in
Gstreamer get the buf content from (memory:NVMM) form jetson - #7 by DaneLLL

So, how can I get the nvbuf to the CPU and convert NV12 to RGBA?
I need to download the nvbuf to the CPU.

Hi,
In GStreamer, you can use the nvvidconv plugin to convert to RGBA, and then call dump_dmabuf() to dump the data. Please refer to
Dump YUV buffer with libargus - #9 by DaneLLL

There is also sample code in

/usr/src/jetson_multimedia_api

You can also call NvBufferTransform() for NV12-to-RGBA conversion.

Hi, DaneLLL
I don't know how to convert the format with NvBufferTransform().
It just flips the picture.

Hi,
There are sample code in

/usr/src/jetson_multimedia_api

You may grep for NvBufferTransform() and check the code.

For GStreamer, you can use the nvvidconv plugin:

gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! 'video/x-raw(memory:NVMM),format=RGBA' ! nvvidconv ! 'video/x-raw(memory:NVMM),format=NV12' ! ... 

In this example, the camera frames are converted to RGBA and then converted back to NV12.

Hi,
I have used this pipeline:
nvarguscamerasrc → nvvidconv → nvtee name=t → queue → nvvidconv → videoconvert → appsink t. → queue → omxh264enc → h264parse → mpegtsmux → udpsink

The pipeline serves two functions: the appsink outputs the RGB picture and the udpsink outputs the video. But something is wrong in the pipeline; it gets blocked.

Hi,
Please use tee and nvv4l2h264enc plugins. Here is a sample for reference:
Problems using gstreamer pipeline in openCV for recording video - #5 by DaneLLL

You can try it and check whether it runs successfully, then refer to that pipeline to modify yours.


Hi, DaneLLL
I ran into an unexpected new problem.
My pipeline:
nvarguscamerasrc → nvvidconv → capsfilter → tee name=t → queue → nvvidconv → capsfilter caps="video/x-raw, width=1280, height=960, format=(string)NV12, framerate=(fraction)30/1" → videoconvert → capsfilter caps="video/x-raw, width=1280, height=960, format=(string)BGR, framerate=(fraction)30/1" → appsink t. → queue → nvv4l2h264enc → h264parse → mpegtsmux → udpsink

The pipeline became unstable, I think because of the heavy workload.
Sometimes it reports:
CONSUMER: ERROR OCCURRED
error-------------------
error from element nvsrc: TIMEOUT
Debugging information: Argus Error Status
GST_ARGUS: Cleaning up

or it reports:
error-------------------
error from element nvsrc: DISCONNECTED
Debugging information: Argus Error Status
CONSUMER: ERROR OCCURRED
(Argus) Error EndOfFile: Unexpected error in reading socket (in src/rpc/socket/client/ClientSocketManager.cpp, function recvThreadCore(), line 266)
(Argus) Error EndOfFile: Receiving thread terminated with error (in src/rpc/socket/client/ClientSocketManager.cpp, function recvThreadWrapper(), line 368)
GST_ARGUS: Cleaning up
(Argus) Error InvalidState: Receive thread is not running cannot send. (in src/rpc/socket/client/ClientSocketManager.cpp, function send(), line 96)
(Argus) Error InvalidState: (propagating from src/rpc/socket/client/SocketClientDispatch.cpp, function dispatch(), line 91)
(Argus) Error InvalidState: Receive thread is not running cannot send. (in src/rpc/socket/client/ClientSocketManager.cpp, function send(), line 96)
(Argus) Error InvalidState: (propagating from src/rpc/socket/client/SocketClientDispatch.cpp, function dispatch(), line 91)
(Argus) Error InvalidState: Receive thread is not running cannot send. (in src/rpc/socket/client/ClientSocketManager.cpp, function send(), line 96)
(Argus) Error InvalidState: (propagating from src/rpc/socket/client/SocketClientDispatch.cpp, function dispatch(), line 91)
Segmentation fault (core dumped)

I don't know why the pipeline became so fragile.

Hi,
The error shows the camera input is not stable:

error from element nvsrc: TIMEOUT
Debugging information: Argus Error Status

Please share more information about the camera. Do you use a camera board from one of our camera partners?

Hi,
I'm using my own driver board, not NVIDIA's.
But it works well when the pipeline does not fetch RGB pictures.
The pipeline blocks for some reason after running for a while, and then gets killed.

OK, I found the reason for the blocked pipeline.
The CPU memory usage grows until it hits the maximum while the pipeline runs.
I don't understand why the pipeline can't free the memory by itself.
The conversion and tee seem to have a bug that causes the memory leak.
You can try the pipeline below and watch the CPU memory in jtop.

gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=(int)1280, height=(int)960, format=(string)NV12, framerate=(fraction)30/1' ! queue ! tee name=t ! queue ! nvvidconv ! 'video/x-raw, width=(int)1280, height=(int)960, format=(string)RGBA, framerate=(fraction)30/1' ! appsink t. ! queue ! nvv4l2h264enc ! h264parse ! mpegtsmux ! udpsink

Hi,
It sounds like you don’t unref the buffers in appsink. If you don’t need to access the buffers in appsink, please use fakesink

GstFlowReturn
pic_sink (GstElement *appsink, userdata *ptr)
{
    GstBuffer *buffer, *app_buffer;
    GstSample *sample = NULL;
    GstMapInfo info;

    /* "pull-sample" returns the sample through the trailing out argument */
    g_signal_emit_by_name (appsink, "pull-sample", &sample);
    if (sample) {
        printf ("sample is getting now!\n");
        buffer = gst_sample_get_buffer (sample);
        app_buffer = gst_buffer_copy_deep (buffer);
    } else {
        g_print ("sample is NULL\n");
        return GST_FLOW_ERROR;
    }

    if (!gst_buffer_map (app_buffer, &info, GST_MAP_READ)) {
        cout << "buffer mapping error" << endl;
        gst_buffer_unref (app_buffer);
        gst_sample_unref (sample);
        return GST_FLOW_ERROR;
    }

    /* ... copy the info.data to opencv Mat ... */

    gst_buffer_unmap (app_buffer, &info);
    gst_buffer_unref (app_buffer);
    gst_sample_unref (sample);
    return GST_FLOW_OK;
}

The code above is my appsink function.
I did unref in the sink, but nothing changed.

Hi,
It looks like you are missing this:

gst_buffer_unref(buffer);

Also, you can remove the appsink branch and keep only udpsink, to see whether there is still a leak in that configuration.

No, I tried that before, but it doesn't work.
I am trying a pad probe now.
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=(int)1280, height=(int)960, format=(string)NV12, framerate=(fraction)30/1' ! queue ! tee name=t ! queue ! nvvidconv ! 'video/x-raw, width=(int)1280, height=(int)960, format=(string)RGBA, framerate=(fraction)30/1' ! fakesink t. ! queue ! nvv4l2h264enc ! h264parse ! mpegtsmux ! udpsink

I inserted a probe on the sink pad of the fakesink.
It still gives error messages, but the memory no longer overflows.

static GstPadProbeReturn
pic_fetch (GstPad *pad, GstPadProbeInfo *info, VideoStream *ptr)
{
    GstMapInfo map;
    GstBuffer *buffer = GST_PAD_PROBE_INFO_BUFFER (info);
    if (buffer == NULL)
        return GST_PAD_PROBE_OK;

    buffer = gst_buffer_make_writable (buffer);
    GstBuffer *app_buffer = gst_buffer_copy (buffer);

    ptr->freq++;
    if (ptr->freq == 3) {
        ptr->freq = 0;
    } else {
        cout << "skip the frame" << endl;
        return GST_PAD_PROBE_OK;
    }

    // if (gst_buffer_map (buffer, &map, GST_MAP_WRITE)) {
    //     // cout << "map info size : " << gst_buffer_get_size (app_buffer) << endl;
    // }

    GstMapInfo map_pic;
    if (gst_buffer_map (app_buffer, &map_pic, GST_MAP_WRITE)) {
        cout << "map info size : " << gst_buffer_get_size (app_buffer) << endl;
    }

    cv::Mat img (960, 1280, CV_8UC4, (char *) map_pic.data);
    if (!img.data) {
        cout << "error while load image" << endl;
        return GST_PAD_PROBE_DROP;
    } else {
        cout << "load image successfully" << endl;
    }

    cout << "before unmap" << endl;
    gst_buffer_unmap (app_buffer, &map_pic);
    gst_buffer_unref (app_buffer);
    // gst_buffer_unref (buffer);
    // GST_PAD_PROBE_INFO_DATA (info) = buffer;

    return GST_PAD_PROBE_OK;
}

Above is my probe function.
And I get the error messages below:

(jo_fpv_node:11162): GStreamer-CRITICAL **: 18:15:28.021: gst_buffer_get_sizes_range: assertion ‘GST_IS_BUFFER (buffer)’ failed

(jo_fpv_node:11162): GStreamer-CRITICAL **: 18:15:28.021: gst_mini_object_unref: assertion ‘GST_MINI_OBJECT_REFCOUNT_VALUE (mini_object) > 0’ failed
skip the frame

(jo_fpv_node:11162): GStreamer-CRITICAL **: 18:15:28.055: gst_buffer_get_sizes_range: assertion ‘GST_IS_BUFFER (buffer)’ failed

(jo_fpv_node:11162): GStreamer-CRITICAL **: 18:15:28.056: gst_mini_object_unref: assertion ‘GST_MINI_OBJECT_REFCOUNT_VALUE (mini_object) > 0’ failed
skip the frame

(jo_fpv_node:11162): GStreamer-CRITICAL **: 18:15:28.088: gst_buffer_get_sizes_range: assertion ‘GST_IS_BUFFER (buffer)’ failed

(jo_fpv_node:11162): GStreamer-CRITICAL **: 18:15:28.088: gst_mini_object_unref: assertion ‘GST_MINI_OBJECT_REFCOUNT_VALUE (mini_object) > 0’ failed
map info size : 4915200
load image succesfully
before unmap

Hi,
Please refer to the code in
How to run RTP Camera in deepstream on Nano - #29 by DaneLLL
for accessing NvBuffer in an appsink or probe function.