Memory leaks in DeepStream 5 applications

There appears to be a problem running cuda-memcheck against DeepStream applications. If I run with the NvDCF tracker, I see the following error:

cuda-memcheck ./deepstream-nvdsanalytics-test file:///opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-nvdsanalytics-sagar/sample_1080p_h264.mp4
========= CUDA-MEMCHECK
Now playing: file:///opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-nvdsanalytics-sagar/sample_1080p_h264.mp4,
gstnvtracker: Loading low-level lib at /opt/nvidia/deepstream/deepstream-5.0/lib/libnvds_nvdcf.so
gstnvtracker: Optional NvMOT_RemoveStreams not implemented
gstnvtracker: Batch processing is ON
========= Internal Memcheck Error: Initialization failed
=========     Saved host backtrace up to driver entry point at error
=========     Host Frame:/usr/lib/x86_64-linux-gnu/libcuda.so.1 [0x13e08c]
=========     Host Frame:/usr/local/cuda/lib64/libcufft.so.10 [0x3d887a]
=========     Host Frame:/usr/local/cuda/lib64/libcufft.so.10 [0x3cb9a0]
=========     Host Frame:/usr/local/cuda/lib64/libcufft.so.10 [0x3d7bca]
=========     Host Frame:/usr/local/cuda/lib64/libcufft.so.10 [0x3db8cf]
=========     Host Frame:/usr/local/cuda/lib64/libcufft.so.10 [0x3dc03a]
=========     Host Frame:/usr/local/cuda/lib64/libcufft.so.10 [0x3cf66c]
=========     Host Frame:/usr/local/cuda/lib64/libcufft.so.10 [0x3bf16e]
=========     Host Frame:/usr/local/cuda/lib64/libcufft.so.10 [0x3f138c]
=========     Host Frame:/usr/local/cuda/lib64/libcufft.so.10 [0x37b82]
=========     Host Frame:/usr/local/cuda/lib64/libcufft.so.10 [0x38186]
=========     Host Frame:/usr/local/cuda/lib64/libcufft.so.10 [0x39cd2]
=========     Host Frame:/usr/local/cuda/lib64/libcufft.so.10 (cufftXtMakePlanMany + 0x63a) [0x4d2aa]
=========     Host Frame:/usr/local/cuda/lib64/libcufft.so.10 (cufftMakePlanMany64 + 0xfd) [0x4e20d]
=========     Host Frame:/usr/local/cuda/lib64/libcufft.so.10 (cufftMakePlanMany + 0x193) [0x4acd3]
=========     Host Frame:/opt/nvidia/deepstream/deepstream-5.0/lib/libnvds_nvdcf.so (_ZN5NvDCF10initializeEN2cv5Size_IiEEm + 0x1149) [0x610c9]
=========     Host Frame:/opt/nvidia/deepstream/deepstream-5.0/lib/libnvds_nvdcf.so (_ZN12NvMOTContextC1ERK12_NvMOTConfigR20_NvMOTConfigResponse + 0x35f) [0x8688f]
=========     Host Frame:/opt/nvidia/deepstream/deepstream-5.0/lib/libnvds_nvdcf.so (NvMOT_Init + 0x3b) [0x87c2b]
=========     Host Frame:/usr/lib/x86_64-linux-gnu/gstreamer-1.0/deepstream/libnvdsgst_tracker.so (_ZN13NvTrackerProc18initTrackerContextEv + 0x29b) [0x52c45]
=========     Host Frame:/usr/lib/x86_64-linux-gnu/gstreamer-1.0/deepstream/libnvdsgst_tracker.so (_ZN13NvTrackerProc4initERK13TrackerConfig + 0xd1) [0x51627]
=========     Host Frame:/usr/lib/x86_64-linux-gnu/gstreamer-1.0/deepstream/libnvdsgst_tracker.so [0x4c3b3]
=========     Host Frame:/usr/lib/x86_64-linux-gnu/libgstbase-1.0.so.0 [0x44270]
=========     Host Frame:/usr/lib/x86_64-linux-gnu/libgstbase-1.0.so.0 [0x44505]
=========     Host Frame:/usr/lib/x86_64-linux-gnu/libgstreamer-1.0.so.0 [0x7a69b]
=========     Host Frame:/usr/lib/x86_64-linux-gnu/libgstreamer-1.0.so.0 (gst_pad_set_active + 0xe6) [0x7b116]
=========     Host Frame:/usr/lib/x86_64-linux-gnu/libgstreamer-1.0.so.0 [0x58f0d]
=========     Host Frame:/usr/lib/x86_64-linux-gnu/libgstreamer-1.0.so.0 (gst_iterator_fold + 0x94) [0x6b874]
=========     Host Frame:/usr/lib/x86_64-linux-gnu/libgstreamer-1.0.so.0 [0x59a16]
=========     Host Frame:/usr/lib/x86_64-linux-gnu/libgstreamer-1.0.so.0 [0x5b95e]
=========     Host Frame:/usr/lib/x86_64-linux-gnu/libgstreamer-1.0.so.0 [0x5bc8f]
=========     Host Frame:/usr/lib/x86_64-linux-gnu/libgstreamer-1.0.so.0 (gst_element_change_state + 0x3e) [0x5dd5e]
=========     Host Frame:/usr/lib/x86_64-linux-gnu/libgstreamer-1.0.so.0 [0x5e499]
=========     Host Frame:/usr/lib/x86_64-linux-gnu/libgstreamer-1.0.so.0 [0x3ba02]
=========     Host Frame:/usr/lib/x86_64-linux-gnu/libgstreamer-1.0.so.0 (gst_element_change_state + 0x3e) [0x5dd5e]
=========     Host Frame:/usr/lib/x86_64-linux-gnu/libgstreamer-1.0.so.0 (gst_element_change_state + 0x325) [0x5e045]
=========     Host Frame:/usr/lib/x86_64-linux-gnu/libgstreamer-1.0.so.0 [0x5e499]
=========     Host Frame:./deepstream-nvdsanalytics-test [0x3bca]
=========     Host Frame:/lib/x86_64-linux-gnu/libc.so.6 (__libc_start_main + 0xe7) [0x21b97]
=========     Host Frame:./deepstream-nvdsanalytics-test [0x249a]
=========
[NvDCF] Initialized
Deserialize yoloLayerV3 plugin: yolo_168
Deserialize yoloLayerV3 plugin: yolo_176

If I replace it with the KLT tracker, everything runs fine. I also used heaptrack to look for memory leaks, and it reports about 230 MB leaked in the deepstream-nvdsanalytics-test application. Is this a known issue? Should I trust the tool, or is this a false positive?
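For context on how heaptrack was used: the commands below are a typical invocation (the file name and PID shown are placeholders, not taken from this thread). One caveat worth keeping in mind when interpreting its report is that heaptrack's "leaked" figure counts everything still allocated when the process exits, so long-lived one-time allocations (driver state, plugin caches) inflate it even when nothing is actually growing over time.

```shell
# Record an allocation profile; heaptrack writes a compressed trace
# (heaptrack.<app>.<pid>.gz or .zst, depending on version) into the
# current directory.
heaptrack ./deepstream-nvdsanalytics-test \
    file:///opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-nvdsanalytics-sagar/sample_1080p_h264.mp4

# Summarise the recording (12345 is a placeholder PID). "Leaked" here
# means "still allocated at process exit", which includes one-time
# allocations that the OS reclaims anyway -- not only runaway leaks.
heaptrack_print heaptrack.deepstream-nvdsanalytics-test.12345.gz | tail -n 5
```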

• Hardware Platform (Jetson / GPU): T4
• DeepStream Version: 5
• TensorRT Version: 7
• NVIDIA GPU Driver Version (valid for GPU only): 440.33

I have reproduced the cuda-memcheck initialization error, but I need time to investigate.
Regarding heaptrack: I am not familiar with that tool. Below is what I used to check whether a memory leak exists, along with the log produced while the deepstream-nvdsanalytics-test sample was running in parallel. From the log, after stabilizing, the RSS holds at 2371720 kB, so there was no memory leak over the two-hour run. You may run it for longer to see how it goes.

dump_RSS_mem_runningtime_pid.sh.log (365 Bytes)
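The script itself is not reproduced in this thread, only its log. Judging from the output format, a minimal equivalent might look like the sketch below (my guess at its shape, not the actual attachment):

```shell
#!/bin/sh
# Sample the resident set size (VmRSS) of a process once per minute
# until it exits -- a rough stand-in for dump_RSS_mem_runningtime_pid.sh,
# whose contents are not shown in the thread.
# Usage: ./rss_sampler.sh <pid>
PID="$1"
while [ -r "/proc/$PID/status" ]; do
    echo "Getting RSS details for process"
    # VmRSS is reported in kB in /proc/<pid>/status
    rss_kb=$(awk '/^VmRSS:/ {print $2}' "/proc/$PID/status")
    # Elapsed running time of the process, as ps reports it
    etime=$(ps -o etime= -p "$PID" | tr -d ' ')
    echo "$rss_kb, $etime"
    sleep 60
done
```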

nvidia@tegra-ubuntu:/opt/nvidia/deepstream/deepstream-5.0$ sudo ~/dump_RSS_mem_runningtime_pid.sh 13509
Getting RSS details for process
2271204, 00:14
Getting RSS details for process
2365908, 01:14
Getting RSS details for process
2367796, 02:14
Getting RSS details for process
2368060, 03:14
Getting RSS details for process
2368060, 04:14
Getting RSS details for process
2368060, 05:14
Getting RSS details for process
2368060, 06:14
Getting RSS details for process
2368060, 07:15
Getting RSS details for process
2368324, 08:15
Getting RSS details for process
2368324, 09:15
Getting RSS details for process
2368324, 10:15
Getting RSS details for process
2369020, 11:15
Getting RSS details for process
2369044, 12:15
Getting RSS details for process
2369060, 13:15
Getting RSS details for process
2369072, 14:16
Getting RSS details for process
2369076, 15:16
Getting RSS details for process
2369108, 16:16
Getting RSS details for process
2369144, 17:16
Getting RSS details for process
2369180, 18:16
Getting RSS details for process
2369224, 19:16
Getting RSS details for process
2369252, 20:16
Getting RSS details for process
2369272, 21:17
Getting RSS details for process
2369304, 22:17
Getting RSS details for process
2369320, 23:17
Getting RSS details for process
2369328, 24:17
Getting RSS details for process
2369684, 25:17
Getting RSS details for process
2369872, 26:17
Getting RSS details for process
2371720, 27:18
Getting RSS details for process
2371720, 28:18
[... repeated samples elided: RSS holds at 2371720 every minute from 29:18 through 02:26:35 ...]
Getting RSS details for process
2371720, 02:27:36
Getting RSS details for process
cat: /proc/13509/status: No such file or directory

Besides, cuda-memcheck tends to crash when the application gets heavy. You can try valgrind instead, but it will not show CUDA memory leaks. Do you want to find CUDA memory leaks or CPU memory leaks?
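To illustrate the distinction, these are the usual invocations for each side (the sample path is the one from the original command; results for this particular app are untested here):

```shell
# CPU-side (host heap) leaks: valgrind's Memcheck leak checker.
# CUDA device allocations are invisible to it, and the CUDA driver
# typically produces many benign "still reachable" reports.
valgrind --leak-check=full \
    ./deepstream-nvdsanalytics-test \
    file:///opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-nvdsanalytics-sagar/sample_1080p_h264.mp4

# CUDA-side (device memory) leaks: cuda-memcheck's own leak checker,
# which reports device allocations not freed before the CUDA context
# is destroyed.
cuda-memcheck --leak-check full \
    ./deepstream-nvdsanalytics-test \
    file:///opt/nvidia/deepstream/deepstream/sources/apps/sample_apps/deepstream-nvdsanalytics-sagar/sample_1080p_h264.mp4
```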