[MMAPI-28.1] V4L2 decoder memory leak - TX1

I have modified the 00_video_decode demo provided with MultiMedia API 28.1 to loop on the same video, cleaning up allocations and creating a new video_decoder at every iteration.

Both top and tegrastats show a leak of ~38 MB of virtual memory per loop iteration, which matches what was requested for the capture plane via VIDIOC_REQBUFS (10 buffers * 4,000,000 bytes).

Looping a short video hangs the TX1 within two minutes (swapping, then OOM).

It seems that VIDIOC_REQBUFS with {.memory = V4L2_MEMORY_MMAP, .count = 0} on /dev/nvhost-nvdec does not release the buffers currently allocated on the FD.

This is a blocker for our resident application which uses the MMAPI for video decoding.

Hfrappier,
I think this is a known issue.
Could you please wait for the rel28.2 release?
This issue will be fixed in that version.

Hi waynezhu,

I have been following MMAPI issues with interest in order to choose which MMAPI version to use.
My application (a streaming server using MMAPI R28.1/R28.2) crashes (core dump) when transmitting 100 or more streams.
Running tegrastats in another console window shows RAM usage increasing until it reaches the upper limit (8 GB).
→ available memory decreases by 5-130 MB per stream (R28.2, Encoder/Converter, TX2)
I suspect that deinitPlane() / v4l2_ioctl(fd, VIDIOC_REQBUFS, &reqbufs:count=0) is leaking memory.
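To confirm a per-stream leak like this, one option besides tegrastats (which reports system-wide memory) is to sample the suspect process's own resident set size from /proc across iterations; a leak shows up as a steady climb. A minimal sketch, with a hypothetical helper name (`mem_of`) of my own:

```shell
#!/bin/sh
# Hypothetical helper: print a process's resident set size (VmRSS, in kB)
# as reported by /proc/<pid>/status.
mem_of() {
    awk '/^VmRSS/ {print $2}' "/proc/$1/status"
}

# Example: sample this shell's own RSS once. In practice, point it at the
# streaming server's PID and sample in a loop (e.g. every few seconds)
# while streams are started and torn down.
mem_of $$
```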

The release notes contain no "Known Issues" or "Fixed Issues" section.
Where can I find the list of known issues?

Has this issue been fixed in R28.2?

Best Regards,

Hi mynaemi,
Could you try rel28.2? This issue was fixed in that release.

Thanks
wayne zhu

Hi waynezhu,

Sorry for the long delay in my reply.
I appreciate your comment; it was very useful.


However, I have another issue with the R28.2 H.264 decoder on TX2,
so I am still using the R28.1 H.264 decoder on TX2.
[url]https://devtalk.nvidia.com/default/topic/1032037/jetson-tx2/-mmapi-r28-1-r28-1-to-reduce-dpb-delay-of-nvvideodecoder/post/5252265/#5252265[/url]

I am also facing a memory leak issue with the R28.2/R28.1 H.264 encoder on TX2.
(I have not yet posted a topic about that issue.)

Best Regards