Please provide the following info (check/uncheck the boxes after creating this topic):
Software Version
DRIVE OS Linux 5.2.6
DRIVE OS Linux 5.2.6 and DriveWorks 4.0
DRIVE OS Linux 5.2.0
DRIVE OS Linux 5.2.0 and DriveWorks 3.5
NVIDIA DRIVE™ Software 10.0 (Linux)
NVIDIA DRIVE™ Software 9.0 (Linux)
other DRIVE OS version
other
I have an ONNX model and optimised it using the TensorRT optimisation tool.
Now I would like to compare the ONNX model with the TensorRT-optimised file.
How can I compare them?
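(For context, the conversion was done along the lines of the command below; this is only a sketch, since the exact flags depend on the trtexec build shipped with the TensorRT/DRIVE release, and --saveEngine may not exist in older builds.)

```
trtexec --onnx=model2cnn.onnx --saveEngine=model2cnn.trt
```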
Thanks @SivaRamaKrishnaNV.
I am attaching the log file of the profiling options and sharing both the ONNX model and the TRT-optimised model.
I just want to compare these two models in terms of accuracy or inference time, by whatever means of comparison are possible. model2cnn.trt (4.0 MB) model2cnn.onnx (1.9 MB)
Dear @alksainath.medam,
The --dumpProfile option of trtexec reports per-layer performance information, which seems to be missing from your log. Could you check whether the --dumpProfile option works on the DRIVE SW 10.0 release?
Regarding accuracy and inference-time comparison, we don't have any tools/scripts for comparing ONNX vs TRT. You may have to write your own scripts to compare the performance on a dataset.
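For example, on a trtexec build that supports these options (flag availability varies by TensorRT release, so treat this as a sketch rather than a confirmed command for your DRIVE SW version):

```
trtexec --loadEngine=model2cnn.trt --iterations=100 --avgRuns=10 --dumpProfile
```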
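As a starting point for such a script, the sketch below compares the ONNX model (via onnxruntime) against the serialized engine (via the TensorRT Python API with pycuda) on a random input, reporting average latency and the output difference. It assumes a single-input/single-output model with binding 0 as input and binding 1 as output, and a TensorRT version that provides execute_v2; adapt it to your release, and feed it real samples from your dataset for a meaningful accuracy comparison.

```python
# Minimal sketch (not an official NVIDIA tool) for comparing an ONNX model
# against its TensorRT engine. Assumes onnxruntime, tensorrt, pycuda and
# numpy are installed; file names are the ones attached in this thread.
import time

import numpy as np
import onnxruntime as ort
import pycuda.autoinit  # noqa: F401  (creates a CUDA context)
import pycuda.driver as cuda
import tensorrt as trt

ONNX_PATH = "model2cnn.onnx"
TRT_PATH = "model2cnn.trt"
N_RUNS = 100

# --- ONNX Runtime ---------------------------------------------------------
sess = ort.InferenceSession(ONNX_PATH)
inp = sess.get_inputs()[0]
# Replace dynamic dimensions with 1; adjust to your real input shape.
shape = [d if isinstance(d, int) else 1 for d in inp.shape]
x = np.random.rand(*shape).astype(np.float32)

sess.run(None, {inp.name: x})  # warm-up
start = time.time()
for _ in range(N_RUNS):
    onnx_out = sess.run(None, {inp.name: x})[0]
print("ONNX Runtime avg latency: %.3f ms" % ((time.time() - start) / N_RUNS * 1000))

# --- TensorRT -------------------------------------------------------------
logger = trt.Logger(trt.Logger.WARNING)
runtime = trt.Runtime(logger)
with open(TRT_PATH, "rb") as f:
    engine = runtime.deserialize_cuda_engine(f.read())
context = engine.create_execution_context()

# Allocate host/device buffers for every binding.
# Assumption: binding 0 is the input, binding 1 the output.
host_bufs, dev_bufs = [], []
for i in range(engine.num_bindings):
    size = trt.volume(engine.get_binding_shape(i))
    dtype = trt.nptype(engine.get_binding_dtype(i))
    host = np.empty(size, dtype=dtype)
    dev = cuda.mem_alloc(host.nbytes)
    host_bufs.append(host)
    dev_bufs.append(dev)

np.copyto(host_bufs[0], x.ravel())
cuda.memcpy_htod(dev_bufs[0], host_bufs[0])

context.execute_v2([int(d) for d in dev_bufs])  # warm-up
start = time.time()
for _ in range(N_RUNS):
    context.execute_v2([int(d) for d in dev_bufs])
print("TensorRT avg latency: %.3f ms" % ((time.time() - start) / N_RUNS * 1000))

cuda.memcpy_dtoh(host_bufs[1], dev_bufs[1])
trt_out = host_bufs[1].reshape(onnx_out.shape)

# --- Accuracy comparison --------------------------------------------------
print("max abs diff :", np.max(np.abs(onnx_out - trt_out)))
print("mean abs diff:", np.mean(np.abs(onnx_out - trt_out)))
```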
Okay @SivaRamaKrishnaNV.
I checked the --dumpProfile option, but I get an "unknown argument" error.
Attaching the log file.
Is there any code available to check TRT performance?