DeepStream 5.0 SDK Memory Leak

• Hardware Platform (Jetson / GPU)
Jetson Nano
• DeepStream Version
DeepStream 5.0
• JetPack Version (valid for Jetson only)
JetPack 4.4.1[L4T 32.4.4]
• TensorRT Version
7.1.3.0
• Issue Type( questions, new requirements, bugs)
DeepStream 5.0 Memory Leak
• How to reproduce the issue ? (This is for bugs. Including which sample app is using, the configuration files content, the command line used and other details for reproducing)

Hi,

I built an application with DeepStream 4.0 and then migrated to DS 5.0.
The application is based on the deepstream-app sample code.
It receives RTSP input from an IP camera in MJPEG format.
While the program runs, memory gradually increases, and the program terminates after about 1 hour and 10 minutes.

An error message like the one in the picture below occurs.

I’ve seen another person’s question with an error similar to the one above.
(DeepStream v5 bug | NvDufSurface memory leakage)

How can I track down this memory leak?

There is a known memory leak issue with deepstream-app; can you try this solution first? DeepStream SDK FAQ - Intelligent Video Analytics / DeepStream SDK - NVIDIA Developer Forums
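Whether the patched library actually helps can be checked objectively by sampling the process resident set size (RSS) over time. A minimal Linux sketch follows; `rss_kb` is a helper name made up for this example, not part of DeepStream:

```shell
# Report the resident set size (RSS) of a process, in kB, from /proc.
# A steadily increasing RSS across many samples suggests a leak.
rss_kb() {
    # VmRSS is reported in kB in /proc/<pid>/status
    awk '/VmRSS/ {print $2}' "/proc/$1/status"
}

# Example: sample the deepstream-app process every 5 seconds, 10 times
# (replace <pid> with the process id of the app under test):
# for i in $(seq 10); do echo "$(date +%T) $(rss_kb <pid>) kB"; sleep 5; done
```

Logging a sample every few seconds for the full 1 hour 10 minutes makes it easy to see whether the growth is linear (a per-frame leak) or steps up occasionally (a per-event leak).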

Hi,
I followed the steps in the solution you gave me.
I replaced the libgstbase-1.0.so.0.1405.0 file, but I noticed that memory is still accumulating.

Are you using DeepStream 5.0.1? Are you using deepstream-app or your own app?

Yes I am using DeepStream 5.0.1 SDK.

I used Qt Creator to import the code from /opt/nvidia/deepstream/deepstream-5.0/sources/apps/sample_apps/deepstream-app and modified it for my app.

Can deepstream-app be run directly to reproduce the increasing memory consumption?

Is the deepstream-app you are referring to the sample application built and provided by NVIDIA?
That is, deepstream-app run with deepstream-app -c xxx.txt in a terminal window?

Yes. Yes.

Yes, I thought it was an internal deepstream-app problem.
However, I found that there is no memory leak when running deepstream-app itself.

If the problem cannot be reproduced with deepstream-app, it is hard to say the leak is caused by DeepStream.

Thinking it over, it is hard to say that it is a deepstream-app problem.
There must be a problem in how I applied the deepstream-app code to my application, right?
I think it is similar to the error message in the image I uploaded and to the other person’s problem I linked to.
The error message suggests something is wrong with the NvBufSurface API.
Is there any way to solve this?

Do you have customized code like the (DeepStream v5 bug | NvDufSurface memory leakage) topic mentioned? If so, please upload your code.

If you are using the original deepstream-app, DeepStream SDK FAQ - Intelligent Video Analytics / DeepStream SDK - NVIDIA Developer Forums can fix the known memory leak in DeepStream 5.0.1. Otherwise, you need to tell us how to reproduce the problem with deepstream-app.

I call NvBufSurface in the all_bbox_generated function.

Is this function properly configured in relation to NvBufSurface?

void Deepstream_Main::all_bbox_generated (AppCtx * appCtx, GstBuffer * buf, NvDsBatchMeta * batch_meta, guint index)
{
cv::Mat frame;
std::vector<TrackingBox> rect_vec;
std::vector<TrackingBox> tracking_result_vec;
bool isCount = false;

bool isSensing = false;
bool isSensing1 = false;
bool isSensing2 = false;
int direction = 0; // 0 : nothing, 1 : line1, 2 : line2, 3 : line1 & line2 // 0 : nothing, 1 : Enter, 2: Exit

guint num_objects[128];

GstMapInfo in_map_info;
NvBufSurface* surface = NULL;

NvDsFrameMeta *frame_meta = NULL;

//NvBufSurface* copy_surface = NULL;
//NvBufSurfaceCopy(surface,copy_surface);

NvDsDisplayMeta *display_meta = nvds_acquire_display_meta_from_pool(batch_meta);
NvOSD_RectParams *rect_params = display_meta->rect_params;
NvOSD_TextParams *text_params = display_meta->text_params;
NvOSD_LineParams *line_params = display_meta->line_params;

memset (num_objects, 0, sizeof (num_objects));

if(gst_buffer_map(buf, &in_map_info, GST_MAP_READWRITE))
{
    surface = (NvBufSurface *) in_map_info.data;

    NvBufSurfaceMap(surface, -1, -1, NVBUF_MAP_READ_WRITE);
    NvBufSurfaceSyncForCpu(surface, -1, -1);

    //NvBufSurfaceSyncForDevice(copy_surface,0,0);
    //NvBufSurfaceCopy(surface, )

    for (NvDsMetaList * l_frame = batch_meta->frame_meta_list; l_frame != NULL; l_frame = l_frame->next)
    {
        frame_meta = (NvDsFrameMeta *)(l_frame->data);

        gint frame_width = (gint)surface->surfaceList[frame_meta->batch_id].width;
        gint frame_height = (gint)surface->surfaceList[frame_meta->batch_id].height;
        void *frame_data = surface->surfaceList[frame_meta->batch_id].mappedAddr.addr[0];
        size_t frame_step = surface->surfaceList[frame_meta->batch_id].pitch;

        frame = cv::Mat(frame_height, frame_width, CV_8UC4, frame_data, frame_step);
        cv::Mat detectionFrame = cv::Mat::zeros(frame_height, frame_width, CV_8UC1);
        cv::cvtColor(frame, frame, cv::COLOR_BGR2RGB);

        if(dM->setStart)
        {
            for (NvDsMetaList * l_obj = frame_meta->obj_meta_list; l_obj != NULL; l_obj = l_obj->next)
            {
                NvDsObjectMeta *obj = (NvDsObjectMeta *) l_obj->data;

                cv::Rect rect;
                TrackingBox box;
                box.type = 10;

                QMapIterator<QString, float> iter(dM->globalData->jobInfo->detectionClass);
                while(iter.hasNext())
                {
                    iter.next();
                    QString name = iter.key();
                    float confidence = 0;

                    if (obj->text_params.display_text == name
                            && obj->confidence >= confidence)
                    {
                        if (name == "Person" || name == "person"){ box.type = 0;}
                        else if (name == "Car"){ box.type = 1; }
                        else if (name == "Truck"){ box.type = 2; }
                        else if (name == "Bus"){ box.type = 3; }
                        else{ box.type = 3; }

                        rect.width = obj->rect_params.width;
                        rect.height = obj->rect_params.height;
                        rect.x = obj->rect_params.left;
                        rect.y = obj->rect_params.top;

                        box.box = rect;
                        box.confidence = obj->confidence;

                        rect_vec.push_back(box);
                    }
                }
            }
            tracking_result_vec = dM->hanaTracker(rect_vec);

            //Draw Object Information
            for(int i=0; i<tracking_result_vec.size(); i++)
            {
                // Class Name Start
                QString s_id = QString::number(tracking_result_vec[i].id);
                QString s_type = "Default"; //Type Of Object
                switch(tracking_result_vec[i].type)
                {
                case 0:
                    s_type = "Person";
                    break;
                case 1:
                    s_type = "Car";
                    break;
                case 2:
                    s_type = "Truck";
                    break;
                case 3:
                    s_type = "Bus";
                    break;
                default:
                    s_type = "Default";
                    break;
                }
                s_type += s_id;

                cv::putText(frame, s_type.toUtf8().constData(), tracking_result_vec[i].box.tl(), FONT_HERSHEY_COMPLEX, 0.7, cv::Scalar(255,255,255), 1, LINE_AA);
                cv::putText(frame, s_type.toUtf8().constData(), tracking_result_vec[i].box.tl(), FONT_HERSHEY_COMPLEX, 0.7, cv::Scalar(62,200,71), 1, LINE_AA);
                // Class Name End

                // StandardPoint
                //     vertical_horizontal
                // 0 : bottom_center
                // 1 : center_left
                // 2 : top_center
                // 3 : center_right
                // 4 : center_center
                // 5 : Left_Bottom = Rect(SignalPed)
                cv::Point2f center;
                cv::Rect rect;
                cv::Point2f rect_center; //ROI center point
                float distance_ratio;   //ROI weight by distance from camera (based X-axis)
                //float

                if (dM->globalData->jobInfo->standardPoint == 0)
                {
                    center.x = tracking_result_vec[i].box.x + (tracking_result_vec[i].box.width / 2);
                    center.y = tracking_result_vec[i].box.y + (tracking_result_vec[i].box.height);
                }
                else if (dM->globalData->jobInfo->standardPoint == 1)
                {
                    center.x = tracking_result_vec[i].box.x;
                    center.y = tracking_result_vec[i].box.y + (tracking_result_vec[i].box.height / 2);
                }
                else if (dM->globalData->jobInfo->standardPoint == 2)
                {
                    center.x = tracking_result_vec[i].box.x + (tracking_result_vec[i].box.width / 2);
                    center.y = tracking_result_vec[i].box.y;
                }
                else if (dM->globalData->jobInfo->standardPoint == 3)
                {
                    center.x = tracking_result_vec[i].box.x + (tracking_result_vec[i].box.width);
                    center.y = tracking_result_vec[i].box.y + (tracking_result_vec[i].box.height / 2);
                }
                else if (dM->globalData->jobInfo->standardPoint == 4)
                {
                    center.x = tracking_result_vec[i].box.x + (tracking_result_vec[i].box.width / 2);
                    center.y = tracking_result_vec[i].box.y + (tracking_result_vec[i].box.height / 2);
                }
                else if (dM->globalData->jobInfo->standardPoint == 5)
                {
                    center.x = tracking_result_vec[i].box.x + (tracking_result_vec[i].box.width / 2);
                    center.y = tracking_result_vec[i].box.y + (tracking_result_vec[i].box.height / 2);

                    //ROI Area : Rectangle
                    rect.x = tracking_result_vec[i].box.x + (tracking_result_vec[i].box.width * 1 / 12);
                    rect.y = tracking_result_vec[i].box.y + (tracking_result_vec[i].box.height * 1 / 12);
                    rect.width = tracking_result_vec[i].box.width * 1 / 12;
                    rect.height = tracking_result_vec[i].box.height * 8 / 12;

                    //Ellipse Center & ROI Center
                    rect_center.x = rect.x + (rect.width/2);
                    rect_center.y = rect.y + (rect.height/2);

                    // x's  Per 1 pixel => 0.015625 // 1280 =>
                    distance_ratio = (100.0 - (rect_center.x * (1.0/(frame_width/20.0))))/100.0;
                }
                // StandardPoint End

                // draw tracking item
                cv::Rect rect_trk = tracking_result_vec[i].box;
                cv::circle(frame, center, 2, cv::Scalar(0,128,255), 5);
                if (dM->globalData->jobInfo->standardPoint == 5)
                {
                    cv::ellipse(detectionFrame,rect_center,Size(rect.width/2,rect.height/2),0,0,360,cv::Scalar(255),-1);
                    cv::ellipse(frame,rect_center,Size(rect.width/2,rect.height/2),0,0,360,cv::Scalar(0,255,0),-1);

                    //Ellipse ROI (applied distance ratio)
                    cv::ellipse(detectionFrame,rect_center,Size(rect.width/2 * distance_ratio,rect.height/2* distance_ratio),0,0,360,cv::Scalar(255),-1);
                    cv::ellipse(frame,rect_center,Size(rect.width/2* distance_ratio,rect.height/2* distance_ratio),0,0,360,cv::Scalar(166,97,243),-1);

                    //Ellipse ROI
                 //cv::ellipse(detectionFrame,rect_center,Size(rect.width/2,rect.height/2),0,0,360,cv::Scalar(255),-1);
                    //cv::ellipse(frame,rect_center,Size(rect.width/2,rect.height/2),0,0,360,cv::Scalar(0,255,0),-1);


                    //Rect ROI
                    //cv::rectangle(detectionFrame, rect, cv::Scalar(255), -1);
                    //cv::rectangle(frame, rect, cv::Scalar(166,97,243), -1);
                }
                cv::rectangle(frame, rect_trk, cv::Scalar(166,97,243), 2);

                // RTSP drawing
                if (dM->isRTSPOriginal == false)
                {
                    if(display_meta->num_rects < 10)
                    {
                        rect_params[(display_meta->num_rects)].left = tracking_result_vec[i].box.x;
                        rect_params[(display_meta->num_rects)].top = tracking_result_vec[i].box.y;
                        rect_params[(display_meta->num_rects)].width = tracking_result_vec[i].box.width;
                        rect_params[(display_meta->num_rects)].height = tracking_result_vec[i].box.height;
                        rect_params[(display_meta->num_rects)].border_width = 1;

                        NvOSD_ColorParams color;
                        color.alpha = 1.0;
                        color.blue = 1.0;
                        color.green = 0.0;
                        color.red = 1.0;
                        rect_params[(display_meta->num_rects)].border_color = color;

                        display_meta->num_rects++;
                    }
                    text_params[display_meta->num_labels].display_text = (gchar*)g_malloc0(6);
                    // Note: a 6-byte buffer truncates labels longer than 5 characters.
                    snprintf(text_params[display_meta->num_labels].display_text, 6, "%s", s_type.toUtf8().constData());
                    text_params[display_meta->num_labels].x_offset = tracking_result_vec[i].box.x;
                    text_params[display_meta->num_labels].y_offset = tracking_result_vec[i].box.y-10;

                    text_params[display_meta->num_labels].font_params.font_name = (gchar*)"Serif";

                    text_params[display_meta->num_labels].font_params.font_color.red = 1.0;
                    text_params[display_meta->num_labels].font_params.font_color.green = 1.0;
                    text_params[display_meta->num_labels].font_params.font_color.blue = 1.0;
                    text_params[display_meta->num_labels].font_params.font_color.alpha = 1.0;

                    text_params[display_meta->num_labels].font_params.font_size = 7;

                    text_params[display_meta->num_labels].set_bg_clr = 1;
                    text_params[display_meta->num_labels].text_bg_clr.red = 0.0;
                    text_params[display_meta->num_labels].text_bg_clr.green = 0.0;
                    text_params[display_meta->num_labels].text_bg_clr.blue = 0.0;
                    text_params[display_meta->num_labels].text_bg_clr.alpha = 1.0;
                    display_meta->num_labels++;
                }
                // Check - inside
                if (dM->globalData->jobInfo->type == "Vehicle")
                {
                    if (dM->globalData->jobInfo->roi->SensingVehicle(Point(center.x, center.y)))
                    {
                        isSensing = true;
                    }
                }
                else if (dM->globalData->jobInfo->type == "NonSignalPed")
                {
                    if (dM->globalData->jobInfo->roi->SensingNonSignalPed(Point(center.x, center.y)))
                    {
                        isSensing = true;
                    }
                }
            }
            // Check - inside End
            // end tracking
            if (dM->globalData->jobInfo->type == "SignalPed")
            {
                // check intersection
                for (int i = 0; i < dM->globalData->jobInfo->roi->lines.size(); i++)
                {
                    Mat bitwise = (dM->globalData->jobInfo->roi->lines[i] & detectionFrame);
                    //cv::imshow("bitwise", bitwise);
                    //cv::waitKey(0);
                    int count = cv::countNonZero(bitwise);
                    if (count != 0)
                    {
                        isSensing = true;
                        if (i == 0)
                        {
                            isSensing1 = true;
                        }
                        else if (i == 1)
                        {
                            isSensing2 = true;
                        }
                    }
                }
            }
            // Draw ROI
            //Draw Line End
            NvOSD_ColorParams roi_line_color;
            if (isSensing)
            {
                roi_line_color.alpha = 1.0;
                roi_line_color.blue = 0.0;
                roi_line_color.green = 1.0;
                roi_line_color.red = 0.0;
            }
            else
            {
                roi_line_color.alpha = 1.0;
                roi_line_color.blue = 0.0;
                roi_line_color.green = 0.0;
                roi_line_color.red = 1.0;
            }
            if (dM->globalData->jobInfo->type == "Vehicle")
            {
                //Draw Line
                for (int i = 0; i < dM->globalData->jobInfo->roi->rois[0].size() - 1; i++)
                {
                    if (dM->isRTSPOriginal == false)
                    {
                        line_params[display_meta->num_lines].x1 = dM->globalData->jobInfo->roi->rois[0][i].x;
                        line_params[display_meta->num_lines].y1 = dM->globalData->jobInfo->roi->rois[0][i].y;
                        line_params[display_meta->num_lines].x2 = dM->globalData->jobInfo->roi->rois[0][i + 1].x;
                        line_params[display_meta->num_lines].y2 = dM->globalData->jobInfo->roi->rois[0][i + 1].y;
                        line_params[display_meta->num_lines].line_width = 7;
                        line_params[display_meta->num_lines].line_color = roi_line_color;
                        display_meta->num_lines++;
                    }
                }
                dM->globalData->jobInfo->roi->DrawROI_Vehicle(frame, isSensing);
            }
            else if (dM->globalData->jobInfo->type == "SignalPed")
            {
                if (dM->isRTSPOriginal == false)
                {
                    for (int i = 0; i < dM->globalData->jobInfo->roi->rois.size(); i++)
                    {
                        for (int j = 0; j <dM->globalData->jobInfo->roi->rois[i].size() - 1; j++)
                        {
                            line_params[display_meta->num_lines].x1 = dM->globalData->jobInfo->roi->rois[i][j].x;
                            line_params[display_meta->num_lines].y1 = dM->globalData->jobInfo->roi->rois[i][j].y;
                            line_params[display_meta->num_lines].x2 = dM->globalData->jobInfo->roi->rois[i][j + 1].x;
                            line_params[display_meta->num_lines].y2 = dM->globalData->jobInfo->roi->rois[i][j + 1].y;
                            line_params[display_meta->num_lines].line_width = 7;
                            line_params[display_meta->num_lines].line_color = roi_line_color;
                            display_meta->num_lines++;
                        }
                    }
                }

                dM->globalData->jobInfo->roi->DrawROI_SignalPed(frame, isSensing1, isSensing2);
            }

            else if (dM->globalData->jobInfo->type == "NonSignalPed")
            {
                if (dM->isRTSPOriginal == false)
                {
                    for (int i = 0; i < dM->globalData->jobInfo->roi->rois.size(); i++)
                    {
                        int vector_size = dM->globalData->jobInfo->roi->rois[i].size();
                        for (int j = 0; j < vector_size; j++)
                        {
                            line_params[display_meta->num_lines].x1 = dM->globalData->jobInfo->roi->rois[i][j].x;
                            line_params[display_meta->num_lines].y1 = dM->globalData->jobInfo->roi->rois[i][j].y;
                            line_params[display_meta->num_lines].x2 = dM->globalData->jobInfo->roi->rois[i][(j + 1) % vector_size].x;
                            line_params[display_meta->num_lines].y2 = dM->globalData->jobInfo->roi->rois[i][(j + 1) % vector_size].y;
                            line_params[display_meta->num_lines].line_width = 7;
                            line_params[display_meta->num_lines].line_color = roi_line_color;
                            display_meta->num_lines++;
                        }
                    }
                }
                dM->globalData->jobInfo->roi->DrawROI_NonSignalPed(frame, isSensing);
            }
        }
        else
        {
        }
        if (dM->globalData->jobInfo->type == "SignalPed")
        {
            if (isSensing1 == true && isSensing2 == true)
            {
                direction = 3;
            }
            else if (isSensing1 == true && isSensing2 == false)
            {
                direction = 1;
            }
            else if (isSensing1 == false && isSensing2 == true)
            {
                direction = 2;
            }
            else if (isSensing1 == false && isSensing2 == false)
            {
                direction = 0;
            }
           //qDebug() << direction;
        }

        cv::resize(frame, frame, cv::Size(640,360), 0, 0, cv::INTER_LINEAR);

        //Video output Save
        //dM->writer.write(frame);

        emit dM->image_Load_signals(frame, isSensing, direction, dM->countInLine, isCount);
        frame.release();
        detectionFrame.release();
        rect_vec.clear();
        tracking_result_vec.clear();
        NvBufSurfaceUnMap(surface, -1, -1);
    }
gst_buffer_unmap(buf, &in_map_info);
}

}

Can the memory leak be reproduced with the original deepstream-app? If not, the leak is caused by the code you modified. Can you upload all your modifications? I can’t find where “dM” comes from.

Are you referring to all the code in the project,
or
only the code related to deepstream-app?

You need to provide code that lets us reproduce the leak.

Please make sure the code can prove that the leak is related to NvBufSurface, and that the app can be built and run on our platform. Please remove code that has nothing to do with DeepStream as much as possible.
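One way to gather evidence while trimming the code down is GStreamer's built-in "leaks" tracer, which logs objects and buffers that are still alive when the pipeline shuts down. A sketch, reusing the thread's xxx.txt config placeholder:

```shell
# Enable GStreamer's "leaks" tracer; on a clean shutdown it logs GstObjects
# and GstBuffers that were never freed. The tracer is available in
# GStreamer 1.8+ (JetPack 4.4 ships GStreamer 1.14).
export GST_TRACERS=leaks
export GST_DEBUG="GST_TRACER:7"
# Run the app, stop it cleanly (Ctrl+C / EOS), then inspect the trace output:
# ./deepstream-app -c xxx.txt
```

If the tracer reports growing numbers of unreleased buffers tied to the probe, that points back at the callback rather than the DeepStream plugins.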

Okay, thank you.
I will let you know after removing the code that is not related to DeepStream.