Unknown format in TREx


When converting a model to TensorRT with trtexec and specifying minShapes, optShapes, and maxShapes, TREx fails to report the precisions of all layers/tensors. However, when I convert without specifying dynamic shapes (even though the ONNX file supports them), TREx successfully displays the precisions.

Note that the --exportLayerInfo flag in trtexec returns unknown formats when converting with dynamic shapes.


NVIDIA Jetson Nano Developer Kit - JetPack 4.6.4 [L4T 32.7.4]
TensorRT Version: 8.2.1
NVIDIA Corporation Device 0faf (rev a1)
NVIDIA Driver Version:
CUDA Version: 10.2.300
Ubuntu 18.04.6 LTS
Python 3.8.18
R32 (release), REVISION: 7.4, GCID: 33514132, BOARD: t210ref, EABI: aarch64

Relevant Files

models.zip (4.9 MB)
Refer to the models.zip file in the attachments section; it contains backbone.onnx.

Steps To Reproduce

I try to convert backbone.onnx to TensorRT with the following command:

/usr/src/tensorrt/bin/trtexec --onnx=backbone.onnx --saveEngine=backbone.engine --device=0 --workspace=4000 --exportProfile=backbone_profile.json --exportLayerInfo=backbone_layerinfo.json --exportTimes=backbone_timing.json  --profilingVerbosity=detailed --verbose --noDataTransfers --useCudaGraph --separateProfileRun --useSpinWait --minShapes=img:1x1x128x128 --optShapes=img:4x1x256x256 --maxShapes=img:4x1x320x320

and then use this tutorial on TREx to visualize my engine. The generated backbone.engine.svg file (see models.zip) shows "unknown format" in the graph for all layers/tensors.

However, if I convert with

/usr/src/tensorrt/bin/trtexec --onnx=backbone.onnx --saveEngine=backbone.engine --device=0 --workspace=4000 --exportProfile=backbone_profile.json --exportLayerInfo=backbone_layerinfo.json --exportTimes=backbone_timing.json  --profilingVerbosity=detailed --verbose --noDataTransfers --useCudaGraph --separateProfileRun --useSpinWait

then the same tutorial shows FP32 precision instead of "unknown format".
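As a sanity check independent of TREx, the per-tensor formats can be read directly from the --exportLayerInfo JSON. Below is a minimal sketch; the "Layers" / "Outputs" / "Format/Datatype" field names are my assumption based on the detailed-verbosity layer-info layout of TRT 8.x-era trtexec, so adjust them if your file differs:

```python
import json

def summarize_formats(layerinfo_path):
    """Collect (layer name, output format) pairs from a trtexec
    --exportLayerInfo JSON (assumed 'Layers' -> 'Outputs' layout)."""
    with open(layerinfo_path) as f:
        info = json.load(f)
    rows = []
    for layer in info.get("Layers", []):
        for out in layer.get("Outputs", []):
            # Field name assumed; typical values look like
            # "Row major linear FP32" or "Unknown format"
            rows.append((layer.get("Name", "?"), out.get("Format/Datatype", "?")))
    return rows

if __name__ == "__main__":
    for name, fmt in summarize_formats("backbone_layerinfo.json"):
        print(f"{name}: {fmt}")
```

Running this against the layer-info files from both commands should show whether the "unknown format" strings are already present in the trtexec export itself, i.e. whether the problem is upstream of TREx.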

Hi @davit.papikyan ,
Could you please share the verbose logs?


Sure, here are the log files for the two commands above, respectively:

Thank you!

@AakankshaS do you have any idea?