Getting memory usage from trtexec output on Jetson

Hi all,

I’m trying to use trtexec to profile the memory usage of a TensorRT engine on a Jetson board (with TensorRT version 8.5.2).

I’m running trtexec on the engine with this command:

/usr/src/tensorrt/bin/trtexec --loadEngine=model.dnn --verbose

I then see the following memory-related output:

[03/09/2023-11:16:34] [I] [TRT] Loaded engine size: 581 MiB
[03/09/2023-11:16:35] [V] [TRT] Deserialization required 282096 microseconds.
[03/09/2023-11:16:35] [I] [TRT] [MemUsageChange] TensorRT-managed allocation in engine deserialization: CPU +0, GPU +580, now: CPU 0, GPU 580 (MiB)
[03/09/2023-11:16:35] [I] Engine deserialized in 0.803108 sec.
[03/09/2023-11:16:35] [V] [TRT] Total per-runner device persistent memory is 2048
[03/09/2023-11:16:35] [V] [TRT] Total per-runner host persistent memory is 3232
[03/09/2023-11:16:35] [V] [TRT] Allocated activation device memory of size 6324736
[03/09/2023-11:16:36] [I] [TRT] [MemUsageChange] TensorRT-managed allocation in IExecutionContext creation: CPU +0, GPU +6, now: CPU 0, GPU 586 (MiB)

I’m wondering how to interpret all these values, and whether any of them reflect the actual memory footprint the model will have when it is loaded in C++ and used in production (I’m guessing it’s the 586 MiB, but I want to make sure). Another method I tried is running tegrastats both before and during the trtexec session and taking the difference between the reported memory usages, but of course that difference also includes TensorRT’s own overhead, so it gives much higher values than one would expect for the model alone.
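For reference, this is roughly how I would measure it from C++ (a minimal, untested sketch assuming the TensorRT 8.5 C++ API; the file name and the bare-bones logger are just placeholders):

#include <NvInferRuntime.h>
#include <cuda_runtime_api.h>
#include <fstream>
#include <iostream>
#include <iterator>
#include <vector>

class Logger : public nvinfer1::ILogger {
    void log(Severity severity, const char* msg) noexcept override {
        if (severity <= Severity::kWARNING) std::cout << msg << std::endl;
    }
};

int main() {
    Logger logger;
    std::ifstream file("model.dnn", std::ios::binary);
    std::vector<char> blob((std::istreambuf_iterator<char>(file)),
                           std::istreambuf_iterator<char>());

    cudaFree(nullptr);  // force CUDA context creation so its overhead is excluded from the diff
    size_t freeBefore = 0, freeAfter = 0, total = 0;
    cudaMemGetInfo(&freeBefore, &total);

    nvinfer1::IRuntime* runtime = nvinfer1::createInferRuntime(logger);
    nvinfer1::ICudaEngine* engine =
        runtime->deserializeCudaEngine(blob.data(), blob.size());
    nvinfer1::IExecutionContext* context = engine->createExecutionContext();

    cudaMemGetInfo(&freeAfter, &total);
    std::cout << "Engine + context device memory: "
              << (freeBefore - freeAfter) / (1024.0 * 1024.0) << " MiB\n";

    // Activation (scratch) memory needed per execution context; this should
    // match the "Allocated activation device memory" line in the trtexec log.
    std::cout << "Activation memory: " << engine->getDeviceMemorySize()
              << " bytes\n";
    return 0;
}

The diff is taken after cudaFree(nullptr) on purpose, so the CUDA context created on startup is excluded; note that on Jetson the GPU shares system memory, so the numbers are approximate.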

Thanks in advance

Hi,

[MemUsageChange] TensorRT-managed allocation in engine deserialization: CPU +0, GPU +580, now: CPU 0, GPU 580 (MiB)

The first pair (CPU +0, GPU +580) is the memory newly allocated in that step, and the values after “now:” are the accumulated totals of TensorRT-managed allocations.
Based on your log, TensorRT manages 586 MiB of device memory in total: 580 MiB for the engine’s weights at deserialization, plus about 6 MiB (the 6324736-byte activation buffer and the small persistent allocations) at IExecutionContext creation. This total does not include the CUDA context itself, which is why a tegrastats diff reports a larger number.
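If you want that number from your own C++ code instead of parsing the log, one option is to attach a counting allocator to the runtime so every TensorRT-managed device allocation passes through it. A minimal sketch, assuming the TensorRT 8.5 IGpuAllocator interface (CountingAllocator is a name made up for this example):

#include <NvInferRuntime.h>
#include <cuda_runtime_api.h>
#include <atomic>
#include <cstdint>

class CountingAllocator : public nvinfer1::IGpuAllocator {
public:
    std::atomic<uint64_t> totalBytes{0};

    void* allocate(uint64_t size, uint64_t /*alignment*/,
                   nvinfer1::AllocatorFlags /*flags*/) noexcept override {
        void* ptr = nullptr;
        if (cudaMalloc(&ptr, size) != cudaSuccess) return nullptr;
        totalBytes += size;  // running total of TensorRT-managed device memory
        return ptr;
    }

    void free(void* memory) noexcept override {
        cudaFree(memory);
    }
};

// Usage: call runtime->setGpuAllocator(&myCountingAllocator) before
// deserializing the engine; after createExecutionContext(), totalBytes
// should match the accumulated GPU value in the [MemUsageChange] lines.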

Thanks.
