I am experiencing seemingly unbounded growth of memory consumption when using
dwSensorSerializer_serializeImageAsync to record videos in MP4 or H265 format on a DRIVE AGX. I can reproduce the behavior with the
sample_camera_gmsl binary that ships with the DRIVE AGX. For example, this is the output of tegrastats shortly after starting the program:
RAM 4246/27924MB (lfb 5663x4MB) CPU [20%,17%,8%,8%,11%,6%] GR3D_FREQ 28% AUX@38C CPU@40C Tdiode@38.25C AO@39.5C GPU@39C
and this after a few hours of recording:
RAM 5957/27924MB (lfb 166x4MB) CPU [15%,12%,9%,9%,7%,5%] GR3D_FREQ 52% AUX@40C CPU@42.5C Tdiode@40C AO@41C GPU@41C
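For reference, this is how I quantify the growth: I log tegrastats to a file and diff the used-RAM figure between samples. A minimal Python sketch, assuming only the `RAM used/totalMB` field format shown in the output above:

```python
import re

# Matches the leading RAM field of a tegrastats line, e.g.
# "RAM 4246/27924MB (lfb 5663x4MB) CPU [20%,...] ..."
RAM_RE = re.compile(r"RAM (\d+)/(\d+)MB")

def used_ram_mb(line):
    """Return (used_mb, total_mb) from a tegrastats line, or None if absent."""
    m = RAM_RE.search(line)
    if m is None:
        return None
    return int(m.group(1)), int(m.group(2))

start = used_ram_mb("RAM 4246/27924MB (lfb 5663x4MB) CPU [20%,17%,8%,8%,11%,6%]")
later = used_ram_mb("RAM 5957/27924MB (lfb 166x4MB) CPU [15%,12%,9%,9%,7%,5%]")
print(later[0] - start[0])  # → 1711 (MB gained over the recording session)
```

Running this over a long tegrastats log (one call per line, plotting the used value against time) is what shows the growth as roughly proportional to the amount of data written, rather than flattening out.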
The increase in allocated memory does not appear to slow down or plateau over time. While debugging this behavior I varied the
file-buffer-size that is passed to
dwSensorSerializer_initialize, and it looks like memory consumption grows every time data is written to the file. With a very large
file-buffer-size the memory consumption grows at a much lower rate, but it continues to grow steadily as the file size increases. Given that I can reproduce this behavior with an official sample application, I think a misuse of the DriveWorks API on my part is unlikely.
I have not been able to determine the root cause of this behavior and would like to know (1) whether anyone else has experienced this, and (2) whether there is a way to prevent the memory allocation from growing over time.
Hardware Platform: [Example: DRIVE AGX Xavier™ Developer Kit]
Software Version: [DRIVE Software 10]
Host Machine Version: [native Ubuntu 18.04]
SDK Manager Version: [126.96.36.19938]