Memory and GPU memory estimation

I have a YOLOv8 model running on TensorRT 8. The model file is only 2.2 MB, and GPU MemOps average 3.6 MB HtoD plus 0.028 + 0.028 MB for memset and DtoH. However, jtop monitoring shows about 600 MB of memory and 800 MB of GPU memory. How can I find out or estimate where the 600 MB and 800 MB come from? Is there any tool that shows this in detail, or any way to understand it?
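Much of the gap between a 2.2 MB engine file and the hundreds of MB shown by jtop typically comes from the CUDA context, TensorRT workspace, and library allocations rather than the weights themselves. As a rough cross-check of the host-side number, a process's resident memory can be read straight from `/proc` (a minimal Linux-only sketch; `jtop` reports a similar per-process figure):

```python
# Read a process's resident set size (VmRSS) from /proc -- roughly the
# host-memory figure that per-process monitors like jtop report.
def rss_mb(pid="self"):
    with open(f"/proc/{pid}/status") as f:
        for line in f:
            if line.startswith("VmRSS:"):
                # Line looks like: "VmRSS:   612345 kB"
                return int(line.split()[1]) / 1024.0
    return None

print(f"resident memory: {rss_mb():.1f} MB")
```

Running this inside the inference process (or passing the process PID) shows how much of the 600 MB is resident host memory, as opposed to GPU allocations.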
Many Thanks!!!

Hi, @phamthanhdat270198

Please check whether the “Memory Workload Analysis” section in Nsight Compute meets your needs.

I also have a Jetson Orin Nano, but NVIDIA does not support Nsight on this platform. So how can I use Nsight or another tool to profile my model? Thanks.

Hi, @phamthanhdat270198

Do you mean you can’t open the Nsight Compute GUI on the Jetson Orin Nano?
You can use the command-line tool `ncu` to generate a report and open that report on another device.
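The workflow described above can be sketched as follows. The binary name, report name, and remote host are placeholders, not values from this thread:

```shell
# On the Jetson: collect a full metric set into a report file.
# "./trt_inference_app" is a placeholder for your TensorRT application.
ncu --set full -o my_report ./trt_inference_app

# Copy the resulting report to a desktop machine.
scp my_report.ncu-rep user@desktop:/tmp/

# On the desktop: open the report in the Nsight Compute GUI.
# ncu-ui /tmp/my_report.ncu-rep
```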

I saw this

Supported Platforms

Depending on your OS, different GPUs are supported

L4T (Linux for Tegra)

  • Jetson AGX Orin
  • Jetson AGX Xavier
  • Jetson TX2
  • Jetson TX2i
  • Jetson TX
  • Jetson Nano
  • Jetson Xavier NX

So you mean `ncu` is the Nsight Compute CLI? I only knew about nsys for profiling models.

Hi, @phamthanhdat270198

Since you are collecting data with jtop and want to analyze it in more detail, Nsight Systems (nsys) may be better suited for this.
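For reference, a typical nsys session for this kind of memory question might look like the sketch below. The binary name is a placeholder, and the `stats` report name can vary between nsys versions:

```shell
# On the Jetson: trace CUDA, NVTX, and OS runtime activity.
# "./trt_inference_app" is a placeholder for your TensorRT application.
nsys profile --trace=cuda,nvtx,osrt -o my_trace ./trt_inference_app

# Summarize GPU memory operations (HtoD/DtoH copies, memsets) from the
# captured trace; the report name may differ in older nsys releases.
nsys stats --report cuda_gpu_mem_time_sum my_trace.nsys-rep
```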

But nsys does not support the Jetson Orin Nano. That is why I asked you about a tool like nsys, to profile timing and memory information the way nsys does.

“But nsys does not support the Jetson Orin Nano”
I am not sure this is correct. Moving this topic to the “Nsight Systems” forum to get support.

Yes, we support Orin; we just had not updated the supported-platforms list. Please use the correct JetPack version for your Orin.