Advice on generating a custom payload when using a custom model

• Hardware Platform (Jetson / GPU)
Jetson Orin Nano 8GB
• DeepStream Version
7.1
• JetPack Version (valid for Jetson only)
6.1
• TensorRT Version
10.3

Hello, and thanks in advance for taking the time to read this post.

I started working on a copy of the sample app deepstream-test-4. I have an .engine model that detects a single class, anomaly. My next step is to use my custom model with that single class and send the bounding boxes, class ID, tracking ID, and the images (luckily the README file mentions this!).

The docs mention there are two ways to generate the payload:

  1. NVDS_EVENT_MSG_META (NvDsEventMsgMeta) type metadata attached to the buffer as user metadata of frame meta. For the batched buffer, metadata of all objects of a frame must be under the corresponding frame meta. This is the default option.
  2. By parsing the NVDS_FRAME_META (NvDsFrameMeta) type and NVDS_OBJECT_META (NvDsObjectMeta) type in Gst buffer and available fields in these metadata types are used to create message payload based on schema type. To use this option, set gst property msg2p-newapi = true.

What option do you suggest for my specific use-case?

If you want the image to be sent along with the metadata, it is recommended to use the --msg2p-meta 1 parameter, which has been implemented in deepstream-test4.

You can refer to the following code snippets; they are all in deepstream-test4.

if (msg2p_meta == 0) {          // generate payload using eventMsgMeta
  gst_pad_add_probe (osd_sink_pad, GST_PAD_PROBE_TYPE_BUFFER,
      osd_sink_pad_buffer_metadata_probe, NULL, NULL);
} else {                        // generate payload using NVDS_CUSTOM_MSG_BLOB
  gst_pad_add_probe (osd_sink_pad, GST_PAD_PROBE_TYPE_BUFFER,
      osd_sink_pad_buffer_image_probe, (gpointer) obj_ctx_handle, NULL);
}
message_data = g_strconcat("image;jpg;", width, "x", height, ";",
    ts, ";", encoded_data, ";", NULL);
STOP_PROFILE("Base64 Encode Time ");
msg_custom_meta->size = strlen(message_data);
msg_custom_meta->message = g_strdup(message_data);
if (user_event_meta_custom) {
  user_event_meta_custom->user_meta_data = (void *)msg_custom_meta;
  user_event_meta_custom->base_meta.meta_type = NVDS_CUSTOM_MSG_BLOB;

Thank you, I enabled that option in the custom yml file:

source:
  location: /home/concept/example.h264

streammux:
  batch-size: 1
  batched-push-timeout: 40000
  width: 640
  height: 480

msgconv:
  #If you want to send images, please set the "payload-type: 1" and "msg2p-newapi: 1"
  payload-type: 1
  msg2p-newapi: 1
  frame-interval: 30

msgbroker:
  proto-lib: /opt/nvidia/deepstream/deepstream/lib/libnvds_kafka_proto.so
  conn-str: localhost;9092
  topic: jackal-detection
  sync: 0

sink:
  sync: 0

# Inference using nvinfer:
primary-gie:
  plugin-type: 0
  config-file-path: dstest4_pgie_config.txt

This is an example of a message I receive via Kafka. The model correctly detects five objects:

{
  "version" : "4.0",
  "id" : "690",
  "@timestamp" : "2025-01-22T13:46:49.502Z",
  "sensorId" : "CAMERA_ID",
  "objects" : [
    "18446744073709551615|81.6614|351.425|153.51|453.641|broken fence\r",
    "18446744073709551615|447.621|67.9785|545.45|162.954|broken fence\r",
    "18446744073709551615|43.8999|76.664|164.244|207.881|broken fence\r",
    "18446744073709551615|222.509|64.2772|350.248|209.518|broken fence\r",
    "18446744073709551615|225.992|305.394|352.659|470.612|broken fence\r"
  ]
}

As far as I know, the object IDs are identical because there's no tracker yet; I will add one later. In case you're wondering, "broken fence" is just a fixed string I set at runtime on the type field of my own custom object:

typedef struct NvDsAnomalyObject {
  gchar *type;
} NvDsAnomalyObject;

(I temporarily added the struct to the deepstream-test4 source code; was I supposed to put it in sources/libs/nvmsgconv/deepstream_schema/deepstream_schema.h?)

Having said that, I was expecting to see the field containing the image encoded in base64. Any help on this?

You only need to modify deepstream_schema.h if you modify dstest4_msgconv_config.[yml|txt], which is usually not necessary.

Is broken fence the label of your model?

    ss << obj_meta->object_id << "|" << left << "|" << top
       << "|" << left + width << "|" << top + height
       << "|" << obj_meta->obj_label;

The image of the first object of every 30th frame will be encoded as base64 and sent to Kafka.

if (is_first_object && !(frame_number % frame_interval)) {
        /* The frequency of images to be sent depends on the use case.
         * Here an image is sent for the first object of every
         * frame_interval-th frame (default = 30).
         */

In addition, you can recompile test4 (`make`) and check whether a file is dumped, to determine whether the code is executed correctly.

diff --git a/sources/apps/sample_apps/deepstream-test4/Makefile b/sources/apps/sample_apps/deepstream-test4/Makefile
index 32a79e7..37f74c2 100755
--- a/sources/apps/sample_apps/deepstream-test4/Makefile
+++ b/sources/apps/sample_apps/deepstream-test4/Makefile
@@ -28,6 +28,8 @@ ifeq ($(TARGET_DEVICE),aarch64)
   CFLAGS:= -DPLATFORM_TEGRA
 endif
 
+CFLAGS += -DENABLE_DUMP_FILE
+
 C_SRCS:= $(wildcard *.c)
 CPP_SRCS:= $(wildcard *.cpp)
#ifdef ENABLE_DUMP_FILE
            gsize size = 0;
            snprintf(fileObjNameString, 1024, "%s_%d_%d_%s.jpg", ts,
                     frame_number, frame_meta->batch_id, obj_meta->obj_label);
            guchar *decoded_data = g_base64_decode(encoded_data, &size);
            fp = fopen(fileObjNameString, "wb");
            if (fp) {
              fwrite(decoded_data, size, 1, fp);
              fclose(fp);
            } else {
              g_printerr("Could not open file!\n");
            }
            g_free(decoded_data);
#endif

broken fence is just the value of the type field of the struct I defined in deepstream-test4.c:

typedef struct NvDsAnomalyObject {
  gchar *type;
} NvDsAnomalyObject;

then I handled the object in these functions:

meta_copy_func (by checking srcMeta->objType == NVDS_OBJECT_TYPE_CUSTOM)
meta_free_func
generate_event_msg_meta (by checking class_id == PGIE_CLASS_ID_ANOMALY, which I defined)
pgie_src_pad_buffer_probe

I’m aware this is not the correct way to handle custom objects and payloads, but I can’t figure out the right way from the docs. Can you point out how to do that, or provide a reference?

I enabled the dump file as suggested.

Modify the code and configuration file as follows, then recompile.

diff --git a/sources/apps/sample_apps/deepstream-test4/deepstream_test4_app.c b/sources/apps/sample_apps/deepstream-test4/deepstream_test4_app.c
index b292853..cbaae99 100755
--- a/sources/apps/sample_apps/deepstream-test4/deepstream_test4_app.c
+++ b/sources/apps/sample_apps/deepstream-test4/deepstream_test4_app.c
@@ -535,13 +535,14 @@ osd_sink_pad_buffer_image_probe (GstPad * pad, GstPadProbeInfo * info,
             START_PROFILE;
             encoded_data = g_base64_encode(enc_jpeg_image->outBuffer, enc_jpeg_image->outLen);
             generate_ts_rfc3339 (ts, MAX_TIME_STAMP_LEN);
+            gchar *anomaly_object = g_strdup_printf("%s", "your struct to string like broken fence... ");
             width = g_strdup_printf("%f", obj_meta->detector_bbox_info.org_bbox_coords.width);
             height = g_strdup_printf("%f", obj_meta->detector_bbox_info.org_bbox_coords.height);
             /* Image message fields are separated by ";".
              * Specific Format:  "image;image_format;image_widthximage_height;time;encoded data;"
              * For Example: "image;jpg;640x480;2023-07-31T10:20:13;xxxxxxxxxxx"
              */
-            message_data = g_strconcat("image;jpg;", width, "x", height, ";", ts, ";", encoded_data, ";", NULL);
+            message_data = g_strconcat(anomaly_object, ";", "image;jpg;", width, "x", height, ";", ts, ";", encoded_data, ";", NULL);
             STOP_PROFILE("Base64 Encode Time ");
             msg_custom_meta->size = strlen(message_data);
             msg_custom_meta->message = g_strdup(message_data);
@@ -571,6 +572,7 @@ osd_sink_pad_buffer_image_probe (GstPad * pad, GstPadProbeInfo * info,
             }
             g_free(decoded_data);
 #endif
+            g_free(anomaly_object);
             g_free(encoded_data);
             g_free(message_data);
             g_free(width);
diff --git a/sources/apps/sample_apps/deepstream-test4/dstest4_config.yml b/sources/apps/sample_apps/deepstream-test4/dstest4_config.yml
index e5ce843..abdf349 100755
--- a/sources/apps/sample_apps/deepstream-test4/dstest4_config.yml
+++ b/sources/apps/sample_apps/deepstream-test4/dstest4_config.yml
@@ -21,14 +21,15 @@ streammux:
 
 msgconv:
   #If you want to send images, please set the "payload-type: 1" and "msg2p-newapi: 1"
-  payload-type: 0
-  msg2p-newapi: 0
-  frame-interval: 30
+  payload-type: 1
+  msg2p-newapi: 1
+  frame-interval: 3
 
 msgbroker:
+  enable: 1
   proto-lib: /opt/nvidia/deepstream/deepstream/lib/libnvds_kafka_proto.so
-  conn-str: <host>;<port>
-  topic: <topic>
+  conn-str: localhost;9092;quick-test
+  topic: quick-test
   sync: 0
 
 sink:

Start the Kafka server, create the topic, and monitor it:

docker run --rm --name kafka-test  -p 9092:9092 apache/kafka:3.7.0
docker exec -it kafka-test /opt/kafka/bin/kafka-topics.sh --create --topic quick-test --bootstrap-server localhost:9092
docker exec -it kafka-test /opt/kafka/bin/kafka-console-consumer.sh --topic quick-test --from-beginning --bootstrap-server localhost:9092

deepstream-test4 sent the expected data, including the base64-encoded image and the custom structure:

    "18446744073709551615|843.888|476.3|1067.2|606.403|car",
    "18446744073709551615|555.11|438.81|643.408|525.774|car",
    "18446744073709551615|1142.21|489.083|1424.63|612.898|car",
    "18446744073709551615|617.661|479.333|717.363|562.078|car"
  ],
  "customMessage" : [
    "your struct to string like broken fence... ;image;jpg;19.804108x32.191082;2025-01-24T10:06:36.234Z;/9j/2wCEAAYEBQYFBAYGBQYHBwYIChAKCgkJChQODwwQFxQYGBcUFhYaHSUfGhsjHBYWICwgIyYnKSopGR8tMC0oMCUoKSgBBwcHCggKEwoKEygaFhooKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKCgoKP/EAaIAAAEFAQEBAQEBAAAAAAAAAAABAgMEBQYHCAkKCxAAAgEDAwIEAwUFBAQAAAF9AQIDAAQRBRIhMUEGE1FhByJxFDKBkaEII0KxwRVS0fAkM2JyggkKFhcYGRolJicoKSo0NTY3ODk6Q0RFRkdISUpTVFVWV1hZWmNkZWZnaGlqc3R1dnd4eXqDhIWGh4iJipKTlJWWl5iZmqKjpKWmp6ipqrKztLW2t7i5usLDxMXGx8jJytLT1NXW19jZ2uHi4+Tl5ufo6erx8vP09fb3+Pn6AQADAQEBAQEBAQEBAAAAAAAAAQIDBAUGBwgJCgsRAAIBAgQEAwQHBQQEAAECdwABAgMRBAUhMQYSQVEHYXETIjKBCBRCkaGxwQkjM1LwFWJy0QoWJDThJfEXGBkaJicoKSo1Njc4OTpDREVGR0hJSlNUVVZXWFlaY2RlZmdoaWpzdHV2d3h5eoKDhIWGh4iJipKTlJWWl5iZmqKjpKWmp6ipqrKztLW2t7i5usLDxMXGx8jJytLT1NXW19jZ2uLj5OXm5+jp6vLz9PX29/j5+v/AABEIACAAFAMAIgABEQECEQH/2gAMAwAAARECEQA/AO2aE7svOxx/DT2C7gydfellUg9MVGNx6Y/E0rFtWHMZCckZNNw/92omW7z8k8Kr6Gk23v8Az8wflT5RXBbeK+t45p2kDFedrYqhPotruJ+1TID28w/41XtdS0vXZ5Gu3NkYv3ZiJx0Bo/svSJ5NscryqRnIPFXym1Wzegp0WwB+a8lJ/wCup/xpP7F0/wD5+5f+/p/xp3/CGac/zFp+fQ0f8IVpv964/OixiP/Z;"
  ]
}

Here is an example approach; you can serialize any structure you want and put it into NVDS_CUSTOM_MSG_BLOB.

Thank you for the example, it clarified a lot!

I’m interested in sending the whole image rather than a cropped part of the image, is that possible?

Yes, no problem. Please refer to the examples of saving images in /opt/nvidia/deepstream/deepstream-7.1/sources/apps/sample_apps/deepstream-image-meta-test/deepstream_image_meta_test.c; they are similar.

Thank you again. I believe that example shows how to save images to disk, but that’s not what I’m looking for. I’m looking for a way to acquire the whole image and then send it in base64, by modifying this part of the code:

[...]

while (usrMetaList != NULL) {
  NvDsUserMeta *user_event_meta_custom =
      nvds_acquire_user_meta_from_pool (batch_meta);
  NvDsCustomMsgInfo *msg_custom_meta =
      (NvDsCustomMsgInfo *) g_malloc0 (sizeof (NvDsCustomMsgInfo));

  NvDsUserMeta *usrMetaData = (NvDsUserMeta *) usrMetaList->data;
  if (usrMetaData->base_meta.meta_type == NVDS_CROP_IMAGE_META) {
    NvDsObjEncOutParams *enc_jpeg_image =
        (NvDsObjEncOutParams *) usrMetaData->user_meta_data;
    START_PROFILE;
    encoded_data = g_base64_encode(enc_jpeg_image->outBuffer, enc_jpeg_image->outLen);
    generate_ts_rfc3339 (ts, MAX_TIME_STAMP_LEN);
    width = g_strdup_printf("%f", obj_meta->detector_bbox_info.org_bbox_coords.width);
    height = g_strdup_printf("%f", obj_meta->detector_bbox_info.org_bbox_coords.height);
    /* Image message fields are separated by ";".
     * Specific Format:  "image;image_format;image_widthximage_height;time;encoded data;"
     * For Example: "image;jpg;640x480;2023-07-31T10:20:13;xxxxxxxxxxx"
     */
    message_data = g_strconcat("image;jpg;", width, "x", height, ";", ts, ";",
        encoded_data, ";", NULL);
    STOP_PROFILE("Base64 Encode Time ");
    msg_custom_meta->size = strlen(message_data);
    msg_custom_meta->message = g_strdup(message_data);
    if (user_event_meta_custom) {
      user_event_meta_custom->user_meta_data = (void *) msg_custom_meta;
      user_event_meta_custom->base_meta.meta_type = NVDS_CUSTOM_MSG_BLOB;
      user_event_meta_custom->base_meta.copy_func =
          (NvDsMetaCopyFunc) meta_copy_func_custom;
      user_event_meta_custom->base_meta.release_func =
          (NvDsMetaReleaseFunc) meta_free_func_custom;
      nvds_add_user_meta_to_frame (frame_meta, user_event_meta_custom);
    } else {
      g_print ("Error in attaching event meta custom to buffer\n");
    }

[...]

So the question is: which object/variable contains the entire image?

It makes no difference. That example shows how to save the entire image; in both cases, frames or objects, the encoded image is attached to user meta of type NVDS_CROP_IMAGE_META.

Could you please elaborate more on that?

For nvds_obj_enc_process, when the parameter isFrame is true, each frame of the current batch will be encoded as an image and attached to user meta of type NVDS_CROP_IMAGE_META.

If you also set saveImg to FALSE, the image will not be saved to a file, but kept in the outBuffer member of NvDsObjEncOutParams.

NvDsObjEncUsrArgs frameData = { 0 };
/* Preset */
frameData.isFrame = 1;
frameData.saveImg = FALSE;

NvDsObjEncOutParams *enc_jpeg_image =
    (NvDsObjEncOutParams *) usrMetaData->user_meta_data;

snprintf (fileObjNameString, FILE_NAME_SIZE, "%s_%d_%d_%d_%s.jpg",
    osd_string, frame_number, frame_meta->batch_id, num_rects,
    obj_meta->obj_label);
/* Write to File */
file = fopen (fileObjNameString, "wb");
fwrite (enc_jpeg_image->outBuffer, sizeof (uint8_t),
    enc_jpeg_image->outLen, file);

For test4, just encode the buffer to base64; there is nothing more to explain.


Thank you very much, it works now! :)

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.