Get frame from surfaceList to GStreamer RTSP server

Hi,
I would like to convert a frame from surfaceList (in RGBA) to YUV420, then use the YUV420 data with gst_buffer_fill for a GStreamer RTSP server (a rough sketch of the appsrc push is included after the steps below).

Here is what I have so far:

  1. I created intermediate_buffer (with YUV420):
    NvBufSurface *intermediate_buffer = NULL;
    NvBufSurfaceCreateParams create_params = {0};
    create_params.gpuId = 0;
    create_params.width = 1280;
    create_params.height = 720;
    create_params.size = 0;
    create_params.colorFormat = NVBUF_COLOR_FORMAT_YUV420;
    create_params.layout = NVBUF_LAYOUT_PITCH;
    create_params.memType = NVBUF_MEM_DEFAULT;
    NvBufSurfaceCreate(&intermediate_buffer, 1, &create_params);

  2. Set up the transform parameters:
    transform_params.src_rect = &src_rect;
    transform_params.dst_rect = &dst_rect;
    transform_params.transform_flag = NVBUFSURF_TRANSFORM_FILTER |
        NVBUFSURF_TRANSFORM_CROP_SRC |
        NVBUFSURF_TRANSFORM_CROP_DST;
    transform_params.transform_filter = NvBufSurfTransformInter_Default;

  3. I use NvBufSurfTransform to transform ip_surf (RGBA) to intermediate_buffer:
    NvBufSurfTransform (&ip_surf, intermediate_buffer, &transform_params);
    I am not sure if step 3 actually converts RGBA to YUV420.

  4. I use NvBufferGetParams and try to get the NvBuffer, but I get 0 for n_planes. I would like to write the frame to a .yuv file and then JPEG-encode it so I can verify that I am getting the frame.

    NvBufferGetParams (intermediate_buffer->surfaceList[0].bufferDesc, &dest_param);

    NvBuffer *myNvBuffer = (NvBuffer *)dest_param.nv_buffer;
    g_print("planes %d\n", myNvBuffer->n_planes);
    std::ofstream *out = new std::ofstream("test.yuv");
    write_video_frame(out, *myNvBuffer);
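
For reference, once the frame looks correct, my plan for getting the YUV420 data into the RTSP server's appsrc with gst_buffer_fill is roughly the sketch below. The push_yuv420_frame helper, the plane-by-plane copy, and the appsrc handle are my own assumptions, not tested code:

    // Hypothetical helper: pack the planes of a YUV420 NvBufSurface into a
    // contiguous GstBuffer and push it to the appsrc of the RTSP pipeline.
    #include <gst/gst.h>
    #include <gst/app/gstappsrc.h>
    #include "nvbufsurface.h"

    static GstFlowReturn
    push_yuv420_frame (NvBufSurface *surf, GstElement *appsrc)
    {
      // Map the surface for CPU read access and sync the caches.
      if (NvBufSurfaceMap (surf, 0, -1, NVBUF_MAP_READ) != 0)
        return GST_FLOW_ERROR;
      NvBufSurfaceSyncForCpu (surf, 0, -1);

      NvBufSurfaceParams *params = &surf->surfaceList[0];
      NvBufSurfacePlaneParams *pp = &params->planeParams;

      // Size of a tightly packed I420 frame (no pitch padding).
      gsize frame_size = 0;
      for (guint p = 0; p < pp->num_planes; p++)
        frame_size += (gsize) pp->width[p] * pp->height[p] * pp->bytesPerPix[p];

      GstBuffer *buf = gst_buffer_new_allocate (NULL, frame_size, NULL);

      // Copy row by row because the hardware planes are pitch-aligned.
      gsize offset = 0;
      for (guint p = 0; p < pp->num_planes; p++) {
        guint8 *src = (guint8 *) params->mappedAddr.addr[p];
        gsize row_bytes = (gsize) pp->width[p] * pp->bytesPerPix[p];
        for (guint row = 0; row < pp->height[p]; row++) {
          gst_buffer_fill (buf, offset, src + row * pp->pitch[p], row_bytes);
          offset += row_bytes;
        }
      }

      NvBufSurfaceUnMap (surf, 0, -1);

      // gst_app_src_push_buffer takes ownership of buf.
      return gst_app_src_push_buffer (GST_APP_SRC (appsrc), buf);
    }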

Am I on the right track, or is there another way to achieve my goal?

Thanks.

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) Jetson AGX Xavier
• DeepStream Version 5.0
• JetPack Version (valid for Jetson only) 4.4 DP
• TensorRT Version
• NVIDIA GPU Driver Version (valid for GPU only)

Can you show the whole pipeline and where you want to convert the surface data?

My pipeline is
src → streammux → tiler → pgie → videoconv → ds-example → my-rtsp-server-plugin → sink

In my-rtsp-server-plugin, I construct the pipeline as below:
"( appsrc name=src ! omxh264enc ! video/x-h264, profile=high ! rtph264pay name=pay0 pt=96 )"
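
For context, my-rtsp-server-plugin hands that launch string to a GstRTSPMediaFactory roughly as in the sketch below; the function name, the mount point "/ds-test", and the shared flag are placeholders rather than my exact code:

    #include <gst/gst.h>
    #include <gst/rtsp-server/rtsp-server.h>

    static void
    setup_rtsp_server (void)
    {
      GstRTSPServer *server = gst_rtsp_server_new ();
      GstRTSPMountPoints *mounts = gst_rtsp_server_get_mount_points (server);
      GstRTSPMediaFactory *factory = gst_rtsp_media_factory_new ();

      gst_rtsp_media_factory_set_launch (factory,
          "( appsrc name=src ! omxh264enc ! video/x-h264, profile=high "
          "! rtph264pay name=pay0 pt=96 )");
      gst_rtsp_media_factory_set_shared (factory, TRUE);

      gst_rtsp_mount_points_add_factory (mounts, "/ds-test", factory);
      g_object_unref (mounts);

      // The appsrc named "src" can later be retrieved from the media element,
      // e.g. in a "media-configure" signal handler on the factory.
      gst_rtsp_server_attach (server, NULL);
    }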

Where will you use NvBufSurfTransform? Inside some plugin such as ds-example, or with a pad probe on some plugin?
As for your code, steps 1~3 are OK.
Step 4 needs the "NVBUF_MEM_SURFACE_ARRAY" or "NVBUF_MEM_HANDLE" memory type, but the memType of the intermediate_buffer you created is "NVBUF_MEM_DEFAULT", so it may not work.

You can refer to the gstdsexample source code to check how to deal with NvBufSurface.
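
The relevant pattern in gstdsexample looks roughly like this (untested sketch, error handling trimmed):

    GstMapInfo in_map_info;
    NvBufSurface *surface = NULL;

    memset (&in_map_info, 0, sizeof (in_map_info));
    if (!gst_buffer_map (inbuf, &in_map_info, GST_MAP_READ)) {
      g_print ("Error: Failed to map gst buffer\n");
      return GST_FLOW_ERROR;
    }
    // For NVMM memory the mapped data is the NvBufSurface itself.
    surface = (NvBufSurface *) in_map_info.data;

    // ... work with surface here ...

    gst_buffer_unmap (inbuf, &in_map_info);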

I use it in gst_dsexample_transform_ip function.

  1. First I get NvBufSurface from in_map_info.data;
    surface = (NvBufSurface *) in_map_info.data;
  2. Then set
    ip_surf = *surface;
    ip_surf.numFilled = ip_surf.batchSize = 1;
    ip_surf.surfaceList = &(surface->surfaceList[0]);

And call
NvBufSurfTransform (&ip_surf, intermediate_buffer, &transform_params);
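
For completeness, I also plan to set the transform session parameters once before the call and to check its return value, following what I see in gstdsexample (treat this as an unverified sketch):

    NvBufSurfTransformConfigParams config_params = {};
    config_params.compute_mode = NvBufSurfTransformCompute_Default;
    config_params.gpu_id = surface->gpuId;
    NvBufSurfTransformSetSessionParams (&config_params);

    if (NvBufSurfTransform (&ip_surf, intermediate_buffer, &transform_params) !=
        NvBufSurfTransformError_Success) {
      g_print ("NvBufSurfTransform failed\n");
    }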

According to the documentation:
https://docs.nvidia.com/metropolis/deepstream/dev-guide/DeepStream_Development_Guide/baggage/nvbufsurface_8h.html#a2832a9d266a0002a0d1bd8c0df37625b

NVBUF_MEM_DEFAULT is NVBUF_MEM_SURFACE_ARRAY on Jetson.

Yes, you can get the hardware buffer memory.

How do I get it?
When I do this
NvBufferGetParams (intermediate_buffer->surfaceList[0].bufferDesc, &dest_param);
NvBuffer *myNvBuffer = (NvBuffer *)dest_param.nv_buffer;
g_print("planes %d\n", myNvBuffer->n_planes);

The n_planes is 0.

I am also confused about the memory concept.
On Jetson Xavier:
What is hardware memory? Is it the same as CUDA memory?
What about NVMM memory?

The memory type is NVBUF_MEM_SURFACE_ARRAY, so it is not CUDA memory.
Hardware memory has several types; which one is used depends on the hardware that will handle the memory.
NVMM memory is specifically for DeepStream video memory.
If you want to check the planar information of the memory, you may try "intermediate_buffer->surfaceList[0].planeParams.num_planes".
https://docs.nvidia.com/metropolis/deepstream/dev-guide/DeepStream_Development_Guide/baggage/structNvBufSurfacePlaneParams.html
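
For example, a quick printout like this (untested sketch) should report three planes for YUV420:

    NvBufSurfacePlaneParams *pp = &intermediate_buffer->surfaceList[0].planeParams;
    g_print ("num_planes = %u\n", pp->num_planes);
    for (guint p = 0; p < pp->num_planes; p++) {
      g_print ("plane %u: %ux%u pitch=%u offset=%u psize=%u\n",
          p, pp->width[p], pp->height[p], pp->pitch[p], pp->offset[p], pp->psize[p]);
    }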

Thanks, Fiona, for helping.