DeepStream nvmsgbroker error with Kafka

Please provide complete information as applicable to your setup.

• Hardware Platform: GPU
• DeepStream Version: 6
• TensorRT Version: 8
• NVIDIA GPU Driver Version: 470
• Issue Type: Question

Hi there,
I’m building an app and trying to use Kafka to send messages out. I’m using Python, following the deepstream-test4 example.
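
For context, the broker end of my pipeline is wired up like the sample; the connection string, topic, and library path below are placeholders for my setup:

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# nvmsgconv (schema conversion) feeds nvmsgbroker (Kafka transport).
msgbroker = Gst.ElementFactory.make("nvmsgbroker", "nvmsg-broker")
msgbroker.set_property(
    "proto-lib", "/opt/nvidia/deepstream/deepstream-6.0/lib/libnvds_kafka_proto.so")
msgbroker.set_property("conn-str", "localhost;9092")  # placeholder host;port
msgbroker.set_property("topic", "test-topic")         # placeholder topic
msgbroker.set_property("sync", False)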

However, this error occurs.

Error: gst-library-error-quark: GStreamer encountered a general supporting library error. (1): gstnvmsgbroker.cpp(523): legacy_gst_nvmsgbroker_render (): /GstPipeline:pipeline0/GstNvMsgBroker:nvmsg-broker:
failed to send the message. err(1)
Exiting app

I guess it is because I pass a large string to the struct. These are my steps:

  1. Add a field in nvdsmeta_schema.h to hold the cropped image array:
typedef struct NvDsPersonObject {
  gchar *gender;    /**< Holds a pointer to the person’s gender. */
  gchar *hair;      /**< Holds a pointer to the person’s hair color. */
  gchar *cap;       /**< Holds a pointer to the type of cap the person is
                     wearing, if any. */
  gchar *apparel;   /**< Holds a pointer to a description of the person’s
                     apparel. */
  guint age;        /**< Holds the person’s age. */
  gchar *image;     /**< Added field: holds a pointer to the cropped image,
                     serialized as a JSON string. */
} NvDsPersonObject;

  2. Modify the code in bindschema.cpp and rebuild the bindings. This works fine.
  3. Get the cropped image array by following deepstream-imagedata-multistream-redaction, then convert the array into JSON format (a fuller sketch of this step follows the list):
                lists = frame_copy.tolist()
                json_str = json.dumps(lists)
  4. Pass the array to the field I added, like this:
def generate_person_meta(data, array):
    obj = pyds.NvDsPersonObject.cast(data)
    obj.age = 24
    obj.cap = "cap"
    obj.hair = "hair"
    obj.gender = "male"
    obj.apparel = "formal"
    obj.image = array  # the new field; `array` is the JSON string from step 3
    return obj
  5. Run the app. The Kafka listener receives the messages correctly at first; the problem occurs after the app has been running for a few seconds.
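
For reference, here is a minimal sketch of the serialization step, along with a much more compact base64-of-JPEG variant I have been considering; frame_copy comes from the imagedata sample, and the helper names are mine:

import base64
import json

import cv2
import numpy as np

def array_to_json(frame_copy: np.ndarray) -> str:
    # What I do now: nested lists -> JSON. Even a small crop
    # expands to a very large string this way.
    return json.dumps(frame_copy.tolist())

def array_to_b64_jpeg(frame_copy: np.ndarray) -> str:
    # Alternative idea: JPEG-compress first, then base64, so the
    # result is still a plain string that fits a gchar* field.
    if frame_copy.ndim == 3 and frame_copy.shape[2] == 4:
        frame_copy = cv2.cvtColor(frame_copy, cv2.COLOR_RGBA2BGR)
    ok, buf = cv2.imencode(".jpg", frame_copy)
    if not ok:
        raise RuntimeError("JPEG encoding failed")
    return base64.b64encode(buf.tobytes()).decode("ascii")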

This confuses me. Why does it run normally for a few seconds and then fail? If something were wrong, shouldn’t it fail at the very beginning?
The error disappears if I give the field a short string. I also checked the type of the gchar *param: it is a char pointer, and my array is a JSON string. I also looked through the other files in bindings/include/bind and couldn’t figure out why.

Any ideas how to solve this? Or is there a better way to pass an array through Kafka?

Please help me, thanks a lot

Sorry for the late response. Is this still an issue that needs support? Thanks.

Yes

Q1: Let me ask the question another way: how can I send a NumPy array out via nvmsgbroker?
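
To make this concrete, here is roughly how my custom field plugs into the deepstream-test4 message flow (following that sample’s generate_event_msg_meta pattern; img_str stands for whatever string encoding of the array I end up with):

import sys
import pyds

def generate_event_msg_meta(data, img_str):
    # Follows the deepstream-test4 pattern; only the person branch is shown.
    meta = pyds.NvDsEventMsgMeta.cast(data)
    meta.type = pyds.NvDsEventType.NVDS_EVENT_ENTRY
    meta.objType = pyds.NvDsObjectType.NVDS_OBJECT_TYPE_PERSON
    obj = pyds.alloc_nvds_person_object()
    obj = generate_person_meta(obj, img_str)  # my function from step 4
    meta.extMsg = obj
    meta.extMsgSize = sys.getsizeof(pyds.NvDsPersonObject)
    return meta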

Q2: I checked deepstream-image-meta-test and learned that we can add the detected object’s crop via NVDS_CROP_IMAGE_META. If one can do this, why does the Python sample deepstream-imagedata-multistream use get_nvds_buf_surface and cv2 instead? What’s the difference?
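
As far as I understand it, the get_nvds_buf_surface route from deepstream-imagedata-multistream looks roughly like this (the crop_object helper name is mine):

import numpy as np
import pyds

def crop_object(gst_buffer, frame_meta, obj_meta):
    # Map the batched frame into a NumPy array (an RGBA view of the surface).
    frame = pyds.get_nvds_buf_surface(hash(gst_buffer), frame_meta.batch_id)
    r = obj_meta.rect_params
    top, left = int(r.top), int(r.left)
    crop = frame[top:top + int(r.height), left:left + int(r.width)]
    # Copy the crop out before the buffer continues downstream.
    return np.array(crop, copy=True, order="C")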

Sorry for the late reply.
Does it run well if you try the C++ app?

No, I didn’t try C++. However, using the latest SDK, 6.0.1, does fix the error. I changed nothing in my code; I just pulled the DeepStream 6.0.1 Docker image, and it now runs well.

I learned about pybind during this period. Back to my first question: how can I add a NumPy array to the message? For now, I convert the array to a list and then to a string. Is there a way to add a NumPy array or bytearray to the message meta directly? (A sketch of my current workaround is below.)
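
For illustration, here is a lossless string packing I sketched as an alternative to list-then-string (the helper names are mine): pack the array into a string on the DeepStream side and restore it on the consumer side:

import base64
import json

import numpy as np

def ndarray_to_msg_field(arr: np.ndarray) -> str:
    # Pack dtype, shape, and raw bytes so the consumer can rebuild the array.
    payload = {
        "dtype": str(arr.dtype),
        "shape": list(arr.shape),
        "data": base64.b64encode(np.ascontiguousarray(arr).tobytes()).decode("ascii"),
    }
    return json.dumps(payload)

def msg_field_to_ndarray(s: str) -> np.ndarray:
    # Inverse of ndarray_to_msg_field, run on the Kafka consumer side.
    d = json.loads(s)
    raw = base64.b64decode(d["data"])
    return np.frombuffer(raw, dtype=d["dtype"]).reshape(d["shape"])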

You cannot add a NumPy array into the message meta directly.

I’m aware of this now.

While looking for a resolution, I read about pybind’s array support. I know this is not a DeepStream issue, but since the output of my inference is an ndarray, I’d like a better way to add it to the message than converting it to a string. I hope there will be a way to achieve this in the future.
