Can the trtexec command generate an op count for an ONNX model and the final engine on an Orin eval box?

Hi there,

I'm relatively new to NVIDIA tools. I need to know how many ops are in my original ONNX model and in the final engine on the Orin platform. What's the right command option? I don't see this in the profile file. Is there any way to generate it with trtexec? I'm interested in the INT8 model for now.

For now I'm running the command like this:
/usr/src/tensorrt/bin/trtexec --onnx=./mobilenetv2_224x224_N.onnx --saveEngine=mobilenetv2_224x224_batch2.engine --exportProfile=mobilenetv2_224x224.json --int8 --useDLACore=0 --shapes=data:2x3x224x224 --allowGPUFallback --useSpinWait --separateProfileRun > mobilenetv2_224x224.log
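As a side note, the JSON written by `--exportProfile` describes the layers of the built engine rather than an op count for the original ONNX model, but counting its per-layer entries gives the layer count of the final (fused) engine. A minimal sketch, assuming the profile is a JSON array whose first element holds the iteration count and whose remaining elements each carry a `name` field (the sample data below is hypothetical, not taken from a real run):

```python
import json

# Hypothetical excerpt mimicking a trtexec --exportProfile JSON file.
# In a real workflow you would do: profile = json.load(open("mobilenetv2_224x224.json"))
sample_profile = [
    {"count": 100},  # number of profiling iterations (first entry, no "name")
    {"name": "conv1", "timeMs": 1.2, "averageMs": 0.012, "percentage": 40.0},
    {"name": "conv2 + relu2", "timeMs": 0.9, "averageMs": 0.009, "percentage": 30.0},
    {"name": "fc + softmax", "timeMs": 0.9, "averageMs": 0.009, "percentage": 30.0},
]

# Layer entries are the ones carrying a "name" field; fused layers
# (e.g. "conv2 + relu2") count as one layer in the engine.
layers = [entry for entry in sample_profile if "name" in entry]
print(f"Engine layers profiled: {len(layers)}")  # prints: Engine layers profiled: 3
```

Note that because TensorRT fuses ops during engine building, this count will usually be smaller than the node count of the source ONNX graph.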

Hi, please refer to the links below on performing inference in INT8.


I don't see that it can produce the number of ops/FLOPs inside the models. And this is the lib. I assume you have a ready-to-use tool.


You can run polygraphy inspect model <model> on both the ONNX model and the TRT engine using the Polygraphy tool.
Please refer to TensorRT/tools/Polygraphy/polygraphy/tools at main · NVIDIA/TensorRT · GitHub for more details.
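As a sketch, the two inspections might look like the commands below (file names are taken from the trtexec command earlier in the thread; exact flags may vary between Polygraphy versions, so check `polygraphy inspect model -h` on your install):

```shell
# List the nodes (ops) of the original ONNX model.
polygraphy inspect model mobilenetv2_224x224_N.onnx --show layers

# List the layers of the built TensorRT engine;
# --model-type engine tells Polygraphy this is a serialized engine, not ONNX.
polygraphy inspect model mobilenetv2_224x224_batch2.engine --model-type engine --show layers
```

Comparing the two listings also shows which ONNX ops were fused into single engine layers.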

Thank you.