Bus error (core dumped) when using cv::imencode in a custom deepstream-test5 app

Hi all
These are my steps to customize the deepstream-test5 app to generate a base64 string from an OpenCV Mat frame and a GstBuffer:

  1. Create deepstream_nvdsanalytics_meta.cpp (attachment, 6.6 KB).
    In this file I create a function enc_object2base64(GstBuffer * buf){…} that reads the GstBuffer and returns a base64 string.

  2. Edit the Makefile (attachment, 3.1 KB) to use OpenCV and deepstream_nvdsanalytics_meta.cpp in deepstream_test5_app_main.c.

  3. Customize deepstream_test5_app_main.c (attachment, 52.0 KB):

extern char* enc_object2base64(GstBuffer * buf);

static void
generate_event_msg_meta (gpointer data, gint class_id, gboolean useTs,
    GstClockTime ts, gchar * src_uri, gint stream_id, guint sensor_id,
    NvDsObjectMeta * obj_params, float scaleW, float scaleH,
    NvDsFrameMeta * frame_meta, GstBuffer * buf)
{
  ...
  meta->videoPath = g_strdup(enc_object2base64(buf));
  ...
}
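For reference, the base64 step can be isolated from the GStreamer/OpenCV part. Below is a minimal stdlib-only sketch; the helper name b64_encode is illustrative and is not the code from the attached file. It encodes the byte vector that cv::imencode would fill:

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Encode raw bytes (e.g. the JPEG buffer filled by cv::imencode)
// into a base64 string. Illustrative helper, not the attached code.
static std::string b64_encode(const std::vector<uint8_t>& data) {
  static const char tbl[] =
      "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
  std::string out;
  out.reserve(((data.size() + 2) / 3) * 4);
  size_t i = 0;
  // Process complete 3-byte groups: 24 bits -> four 6-bit symbols.
  for (; i + 3 <= data.size(); i += 3) {
    uint32_t n = (data[i] << 16) | (data[i + 1] << 8) | data[i + 2];
    out += tbl[(n >> 18) & 63];
    out += tbl[(n >> 12) & 63];
    out += tbl[(n >> 6) & 63];
    out += tbl[n & 63];
  }
  // Pad the final 1- or 2-byte remainder with '='.
  size_t rem = data.size() - i;
  if (rem == 1) {
    uint32_t n = data[i] << 16;
    out += tbl[(n >> 18) & 63];
    out += tbl[(n >> 12) & 63];
    out += "==";
  } else if (rem == 2) {
    uint32_t n = (data[i] << 16) | (data[i + 1] << 8);
    out += tbl[(n >> 18) & 63];
    out += tbl[(n >> 12) & 63];
    out += tbl[(n >> 6) & 63];
    out += '=';
  }
  return out;
}
```

Keeping the base64 step separate like this makes it easy to verify independently of the pipeline, so a crash in enc_object2base64 can be narrowed down to the buffer-access or imencode part.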

Problem: when the source in the config file is uri=file:/…/…/test/video-test.mp4 or uri=rtsp://xxx.yyy.zzz:abcd/axis-media/media.amp?videocodec=h264, the project works well. But with the source uri=rtsp://xxx.xxx.xxx:yyyy/stream/jpeg I get the error

Bus error (core dumped)

in enc_object2base64(GstBuffer * buf){…}, at the line:

cv::imencode(".jpg", bgr_frame, encode_buffer);

[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI
type=3
#uri=file:/…/…/test/video-test.mp4
#uri=rtsp://xxx.yyy.zzz:abcd/axis-media/media.amp?videocodec=h264
uri=rtsp://xxx.xxx.xxx:yyyy/stream/jpeg
num-sources=1
gpu-id=0
nvbuf-memory-type=0

So I think the problem is a conflict between the memory used by cv::imencode and the memory of the decoded MJPEG video stream. Please help me.
This is my old post, How to convert object image to base64, but it is not clear.

• Hardware Platform: Jetson
• DeepStream Version: 5.0
• JetPack Version: 4.4
• Issue Type: Bug

The key is to add MJPEG RTSP support to deepstream-app. With uridecodebin, we cannot guarantee that the HW decoder is used.


Yes, I know. But why does the project work well when I comment out this line?

cv::imencode(".jpg", bgr_frame, encode_buffer);

According to the pipeline graph you dumped in How to convert object image to base64 - Intelligent Video Analytics / DeepStream SDK - NVIDIA Developer Forums, there is no source in your pipeline, which means there is no input data to the pipeline. So there is no data in bgr_frame either.

Sorry, please check again with the new graph; the graph in the old post is wrong.

uridecodebin is not connected either. You can switch to the h264 RTSP source and dump the graph; there should be a uridecodebin connected in the pipeline.


Graph when cv::imencode is commented out:

Graph when using the h264 RTSP stream with cv::imencode uncommented:

You are right. In your opinion, what should I do? I also don't understand why uridecodebin appears only when I comment out cv::imencode.

When I use the h264 URI, the terminal prints:

**PERF: FPS 0 (Avg)
**PERF: 0.00 (0.00)
** INFO: <bus_callback:181>: Pipeline ready

** INFO: <bus_callback:167>: Pipeline running

Opening in BLOCKING MODE
NvMMLiteOpen : Block : BlockType = 261
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 261
** INFO: <bus_callback:167>: Pipeline running

KLT Tracker Init
**PERF: 37.64 (35.59)

And with the MJPEG URI, the terminal prints:

**PERF: FPS 0 (Avg)
**PERF: 0.00 (0.00)
** INFO: <bus_callback:181>: Pipeline ready

** INFO: <bus_callback:167>: Pipeline running

Bus error (core dumped)

Bus error with gstreamer and opencv: do you think that is the problem?


Following this recommendation, I tried to change nvjpegdec to jpegdec in deepstream_source_bin.c, in this function:

static void
decodebin_child_added (GstChildProxy * child_proxy, GObject * object,
    gchar * name, gpointer user_data)
{
  NvDsSrcBin *bin = (NvDsSrcBin *) user_data;
  NvDsSourceConfig *config = bin->config;
  if (g_strrstr (name, "decodebin") == name) {
    g_signal_connect (G_OBJECT (object), "child-added",
        G_CALLBACK (decodebin_child_added), user_data);
  }
  if ((g_strrstr (name, "h264parse") == name) ||
      (g_strrstr (name, "h265parse") == name)) {
      g_object_set (object, "config-interval", -1, NULL);
  }
  if (g_strrstr (name, "fakesink") == name) {
      g_object_set (object, "enable-last-sample", FALSE, NULL);
  }
  if (g_strrstr (name, "nvcuvid") == name) {
    g_object_set (object, "gpu-id", config->gpu_id, NULL);

    g_object_set (G_OBJECT (object), "cuda-memory-type",
        config->cuda_memory_type, NULL);

    g_object_set (object, "source-id", config->camera_id, NULL);
    g_object_set (object, "num-decode-surfaces", config->num_decode_surfaces,
        NULL);
    if (config->Intra_decode)
      g_object_set (object, "Intra-decode", config->Intra_decode, NULL);
  }
  if (g_strstr_len (name, -1, "omx") == name) {
    if (config->Intra_decode)
      g_object_set (object, "skip-frames", 2, NULL);
    g_object_set (object, "disable-dvfs", TRUE, NULL);
  }
  if (g_strstr_len (name, -1, "jpegdec") == name) {
    g_object_set (object, "DeepStream", TRUE, NULL);
  }
  if (g_strstr_len (name, -1, "nvv4l2decoder") == name) {
    if (config->Intra_decode)
      g_object_set (object, "skip-frames", 2, NULL);
#ifdef __aarch64__
    g_object_set (object, "enable-max-performance", TRUE, NULL);
#else
    g_object_set (object, "gpu-id", config->gpu_id, NULL);
    g_object_set (G_OBJECT (object), "cudadec-memtype",
        config->cuda_memory_type, NULL);
#endif
    g_object_set (object, "drop-frame-interval", config->drop_frame_interval, NULL);
    g_object_set (object, "num-extra-surfaces", config->num_extra_surfaces,
        NULL);

    /* Seek only if file is the source. */
    if (config->loop && g_strstr_len(config->uri, -1, "file:/") == config->uri) {
      NVGSTDS_ELEM_ADD_PROBE (bin->src_buffer_probe, GST_ELEMENT(object),
          "sink", restart_stream_buf_prob,
          (GstPadProbeType) (GST_PAD_PROBE_TYPE_EVENT_BOTH |
              GST_PAD_PROBE_TYPE_EVENT_FLUSH | GST_PAD_PROBE_TYPE_BUFFER),
          bin);
    }
  }
done:
  return;
}
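A note on the idiom used throughout decodebin_child_added: comparisons like g_strrstr (name, "decodebin") == name and g_strstr_len (name, -1, "jpegdec") == name are prefix tests, true only when the match starts at the first character of the element name (so "nvjpegdec" does not match "jpegdec"). A stdlib-only sketch of an equivalent check, for readers unfamiliar with the GLib calls:

```cpp
#include <cstring>

// True when `name` begins with `prefix`, mirroring the
// g_strstr_len(name, -1, prefix) == name pattern above
// (match must sit at offset 0 of the element name).
static bool name_has_prefix(const char* name, const char* prefix) {
  return std::strncmp(name, prefix, std::strlen(prefix)) == 0;
}
```

This is why setting a property on "jpegdec" does not touch an "nvjpegdec" child: the leading "nv" makes the prefix test fail, so each decoder type can be configured separately.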

but it does not solve the problem. Can you help me?

How can I do that in the deepstream-test5 app?

@DaneLLL, can you help me?

Bus error with gstreamer and opencv - Jetson & Embedded Systems / Jetson TX2 - NVIDIA Developer Forums is about Jetson accelerated GStreamer, not DeepStream. That method cannot be used in a DeepStream pipeline.

Back to your question: even after you commented out the imencode line, the pipeline is not correct for a JPEG RTSP source.


I checked the camera settings; it only supports the output formats JPEG and MJPEG stream, which I think means a JPEG RTSP source.

Can you tell me why the graph has no encoding_name with this source, and what the relationship is between this problem and cv::imencode?

There is actually no data in GstBuffer when the source bin is not connected.
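Building on that point: a lightweight guard can turn the bus error into a clean failure by refusing to treat an empty or undersized buffer as a frame. The sketch below is a hypothetical helper, not code from deepstream-test5; the idea is to validate the mapped buffer size against width * height * channels before wrapping the bytes in a cv::Mat and calling cv::imencode:

```cpp
#include <cstddef>

// Reject a mapped buffer that cannot hold a full frame before it is
// wrapped in a cv::Mat and passed to cv::imencode. Hypothetical guard,
// not code from the attached file.
static bool frame_buffer_ok(const void* data, size_t size,
                            int width, int height, int channels) {
  if (data == nullptr || width <= 0 || height <= 0 || channels <= 0)
    return false;  // no data at all, e.g. the source bin never connected
  // The mapping must cover at least one tightly-packed frame.
  return size >= static_cast<size_t>(width) * height * channels;
}
```

With a check like this in enc_object2base64, the MJPEG case would log a validation failure instead of crashing inside cv::imencode, which also makes the missing-source condition easy to spot.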