How to create a gstreamer plugin to extract the ROI

How to use ExtractFdFromNvBuffer and get_converted_mat to get frames?
How to leverage ${ds root}/sources/gst-plugins/gst-dsexample/gstdsexample.cpp as a plugin?

Env. Jetpack4.1.1, DeepStream3.0 and Xavier


You can use APIs in nvbuf_utils.h. FYR, some samples are at
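As a rough sketch of the calling pattern (this is not taken from any sample verbatim; the NvBufferParams field names follow the nvbuf_utils.h header shipped with JetPack 4.x and may differ between releases, and this only builds on a Jetson with the Multimedia API installed):

```c
#include <stdio.h>
#include "nvbuf_utils.h"

/* Sketch: given the mapped data of a GstBuffer backed by NVMM memory,
 * recover the underlying dmabuf fd and query the frame layout. */
static void inspect_nvmm_buffer (void *mapped_data)
{
  int dmabuf_fd = -1;
  NvBufferParams params;

  /* The dmabuf fd is embedded in the NVMM buffer; extract it first. */
  if (ExtractFdFromNvBuffer (mapped_data, &dmabuf_fd) != 0) {
    printf ("ExtractFdFromNvBuffer failed\n");
    return;
  }

  /* Query per-plane width/height/pitch and the colour format. */
  if (NvBufferGetParams (dmabuf_fd, &params) != 0) {
    printf ("NvBufferGetParams failed\n");
    return;
  }

  printf ("plane 0: %ux%u, pitch %u\n",
          params.width[0], params.height[0], params.pitch[0]);
}
```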

Hi DaneLLL,

Can this sample code be used in deepstream-app.c or deepstream-test1.c?
How do I get the fd to pass to NvBufferGetParams?
Is each frame retrieved through the same fd?

Thanks for your help.

You can enable dsexample in deepstream-test by following


For deepstream-test1, you need to modify the code to link the dsexample plugin:

... ! nvinfer ! 'video/x-raw(memory:NVMM),format=NV12' ! nvvidconv ! 'video/x-raw(memory:NVMM),format=RGBA' ! dsexample ! nvosd ! ...


I got the following error messages:

~/deepstream_sdk_on_jetson/sources/apps/sample_apps/deepstream-test1$ make
cc -o deepstream-test1-app deepstream_test1_app.o `pkg-config --libs gstreamer-1.0`
deepstream_test1_app.o: In function `main':
deepstream_test1_app.c:(.text+0x9d8): undefined reference to `ExtractFdFromNvBuffer'
deepstream_test1_app.c:(.text+0x9e4): undefined reference to `NvBufferGetParams'
collect2: error: ld returned 1 exit status
Makefile:34: recipe for target 'deepstream-test1-app' failed
make: *** [deepstream-test1-app] Error 1
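The undefined references mean the app is not being linked against the library that provides those symbols (libnvbuf_utils.so). Assuming the stock deepstream-test1 Makefile, adding something along these lines should resolve it (the paths are from a typical JetPack install; adjust them to your setup):

```make
# Add the DeepStream include directory so nvbuf_utils.h is found
CFLAGS += -I../../../includes

# Link against the Tegra library that provides
# ExtractFdFromNvBuffer() and NvBufferGetParams()
LDFLAGS += -L/usr/lib/aarch64-linux-gnu/tegra -lnvbuf_utils
```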

What steps do I need to take?

I have already modified the following files:

  1. According to gst-dsexample/README: append a [ds-example] section to dstest1_pgie_config.txt
  2. Add #include "/home/nvidia/deepstream_sdk_on_jetson/sources/includes/nvbuf_utils.h" to deepstream_test1_app.c
  3. Insert the following code into main() of deepstream_test1_app.c (the dsexample-related lines are my additions):
  /* Use convertor to convert from NV12 to RGBA as required by nvosd */
  nvvidconv = gst_element_factory_make ("nvvidconv", "nvvideo-converter");

  /* Create dsexample */
  dsexample = gst_element_factory_make ("dsexample", "gstdsexample");

  /* Set up the pipeline */
  /* we add all elements into the pipeline */
  gst_bin_add_many (GST_BIN (pipeline),
      source, h264parser, decoder, pgie,
      filter1, nvvidconv, filter2, dsexample, nvosd, sink, NULL);

  /* file-source -> h264-parser -> nvh264-decoder ->
   * nvinfer -> filter1 -> nvvidconv -> filter2 -> dsexample -> nvosd -> video-renderer */
  if (!gst_element_link_many (source, h264parser, decoder, pgie,
      filter1, nvvidconv, filter2, dsexample, nvosd, sink, NULL)) {
    g_printerr ("Elements could not be linked. Exiting.\n");
    return -1;
  }

  /* Wait till pipeline encounters an error or EOS */
  g_print ("Running...\n");

        [u]g_signal_emit_by_name (sink, "pull-sample", &sample,NULL);
        caps = gst_sample_get_caps (sample);
        if (!caps)
            printf("could not get snapshot format\n");
        gst_caps_get_structure (caps, 0);
        buffer = gst_sample_get_buffer (sample);
        gst_buffer_map (buffer, &map, GST_MAP_READ);

        ExtractFdFromNvBuffer((void *), &dmabuf_fd);

       ret = NvBufferGetParams(dmabuf_fd, &parm);
       if (ret != -0) {
           printf ("**** error NvBufferGetParams()\n");

I'm not sure, but you should put the ExtractFdFromNvBuffer() code in gst_dsexample_transform_ip() in gstdsexample.cpp and rebuild the plugin, not in deepstream_test1_app.c.

The modification in deepstream_test1_app.c is only to link dsexample into the pipeline.
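As a minimal sketch of where that call would sit inside gst_dsexample_transform_ip() (the nvbuf_utils calls are real APIs; the surrounding variable names are illustrative and this only compiles against the DeepStream/Jetson headers):

```c
/* Inside gst_dsexample_transform_ip (GstBaseTransform *btrans, GstBuffer *inbuf) */
GstMapInfo in_map_info;
int dmabuf_fd = -1;
NvBufferParams params;

if (!gst_buffer_map (inbuf, &in_map_info, GST_MAP_READ)) {
  GST_ERROR ("failed to map input buffer");
  return GST_FLOW_ERROR;
}

/* NVMM buffers carry a dmabuf fd; extract it from the mapped data,
 * then query the frame layout. */
if (ExtractFdFromNvBuffer ((void *) in_map_info.data, &dmabuf_fd) == 0 &&
    NvBufferGetParams (dmabuf_fd, &params) == 0) {
  /* params now describes the frame: per-plane width/height/pitch */
}

gst_buffer_unmap (inbuf, &in_map_info);
```

This runs once per buffer, so each frame goes through the same extraction path even though the fd value itself may vary.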


If our pipeline is file-source -> h264-parser -> nvh264-decoder ->
nvinfer -> filter1 -> nvvidconv -> filter2 -> dsexample -> nvosd -> video-renderer

Do you think this pipeline can be divided into two parts? (The first part would be processed by the Xavier, which then forwards the remaining stream data and metadata to another server for the last part of the processing.)

If it can, where is the best break point in the pipeline?

Hi thhsiao,
For uploading data to another server, you may refer to deepstream-test4 sample. Please follow below steps to run it:
a. Set up a server on a Ubuntu PC
1 On the Ubuntu PC, do

$ git clone

2 Go to analytics_server_docker

$ cd deepstream_360_d_smart_parking_application/analytics_server_docker

3 Install Docker and Docker Compose (dependencies are described in

### Dependencies

The application requires recent versions of Docker and Docker Compose to be installed on the machine.

4 Edit it to fill in IP_ADDRESS with the IP address of the Ubuntu PC, and GOOGLE_MAP_API_KEY. By default you may not have a Google Maps key, so simply edit it to


5 Run the script. This step takes a while and it starts the server at the end.

$ ./

b. On your Xavier, build and run deepstream-test4
1 Follow NVIDIA_DeepStream_SDK_on_Jetson_References to install software prerequisites and DeepStream SDK
2 Follow README at


3 Please note that you have to modify CONNECTION_STRING in deepstream_test4_app.cpp before building it. Fill in the IP address of the Ubuntu PC as the host.

#define CONNECTION_STRING ";9092;metromind-start"