Inference on TensorRT engine with trtexec


I’m using a publicly available computer vision model called UIU-Net. When I run the authors’ inference code on the pretrained weights they share, I get sensible results. But when I convert this pretrained PyTorch model to a TensorRT engine, the inference pipeline I’m building doesn’t work as expected and the results don’t match at all.
To convert the PyTorch model to a TensorRT engine, I first export it to an ONNX model, and the resulting ONNX model also works as expected. However, converting that ONNX model to a TensorRT engine and running inference with “trtexec” does not.
You can find my scripts and the steps to reproduce below. I suspect that my way of saving the output of “trtexec” and then converting the resulting JSON back into an image is faulty, but I’m open to any advice.
Finally, keep in mind that my question is about building an inference pipeline rather than anything specific to this model; I believe you can reproduce this with any CNN model on any image.

Thanks in advance


TensorRT Version: 8.5.2-1+cuda11.4
GPU Type: Jetson AGX Orin
Nvidia Driver Version: NVIDIA UNIX Open Kernel Module for aarch64 35.4.1
CUDA Version: 11.4
CUDNN Version: 8.6
Operating System + Version: 5.10.120-tegra - Ubuntu 20.04
Python Version (if applicable): 3.8.2
TensorFlow Version (if applicable):
PyTorch Version (if applicable): 2.1.0a0+41361538.nv23.06
Baremetal or Container (if container which image + tag):

Relevant Files (2.5 KB)

  • Script I use for torch-to-ONNX model conversion
  • Script that runs inference on the resulting ONNX model as a sanity check
  • Script that converts a .png image to a binary .dat file for inference
  • Script that converts the output .json file back to a .png image

Steps To Reproduce

  • Download the pretrained weights from here
  • Run the torch-to-ONNX conversion script
  • Download any .png image (8-bit, 512x512 preferred)
  • (Optional) Run the ONNX inference script as a sanity check
  • Run the script that converts the .png image to a binary .dat file
  • Execute “trtexec --onnx=uiu-net.onnx --saveEngine=uiu-net-fp32.trt”
  • Execute “trtexec --loadEngine=uiu-net-fp32.trt --loadInputs=input.1:input_tensor.dat --exportOutput=frame1000.json”
  • Run the script that converts the output .json back to a .png image
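For reference, “trtexec --loadInputs” expects the raw tensor bytes exactly as the engine’s input binding consumes them: for this model presumably a 1x3x512x512 float32 tensor in NCHW order, with no header. A minimal sketch of the .png-to-.dat step is below, assuming the input is an RGB image normalized to [0, 1] — the actual scaling and layout must match whatever UIU-Net’s original preprocessing does, or the engine will silently see garbage:

```python
import numpy as np

def to_trtexec_dat(img_hwc: np.ndarray, dat_path: str) -> None:
    """Write an HxWxC uint8 image as the raw float32 NCHW blob that
    `trtexec --loadInputs` consumes: just the tensor bytes, no header.

    Assumes the engine input is 1xCxHxW float32 scaled to [0, 1];
    adjust the normalization to match the model's own preprocessing.
    """
    arr = img_hwc.astype(np.float32) / 255.0   # assumed [0, 1] scaling
    arr = arr.transpose(2, 0, 1)[None, ...]    # HWC -> 1xCxHxW
    arr.tofile(dat_path)                       # raw little-endian float32

# Example with a synthetic 512x512 RGB frame (in practice, load the
# .png with PIL or OpenCV first and pass that array instead).
frame = np.random.randint(0, 256, (512, 512, 3), dtype=np.uint8)
to_trtexec_dat(frame, "input_tensor.dat")
```

A quick sanity check: the resulting .dat file must be exactly 1 x 3 x 512 x 512 x 4 bytes; if the size doesn’t match the input binding’s byte count, trtexec is reading a misaligned buffer.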
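On the output side, “trtexec --exportOutput” writes a JSON array with one entry per output binding. In TensorRT 8.x each entry carries the binding name, its dimensions, and a flat list of values, but the exact key names and layout can differ between versions, so verify them against your own frame1000.json. A sketch of turning that JSON back into an 8-bit image, assuming a single 1x1x512x512 output that is already a [0, 1] probability map (UIU-Net ends in a sigmoid; if your exported graph cuts off before it, the sigmoid must be applied here too):

```python
import json
import numpy as np

def trtexec_json_to_image(json_path: str) -> np.ndarray:
    """Parse a trtexec --exportOutput JSON file into a uint8 image.

    Assumes each entry has a "dimensions" field (an 'x'-separated
    string such as "1x1x512x512") and a flat "values" list -- check
    these keys against your actual JSON, as the layout is not
    guaranteed to be identical across TensorRT versions.
    """
    with open(json_path) as f:
        outputs = json.load(f)
    entry = outputs[0]  # assumed single output binding
    dims = [int(d) for d in str(entry["dimensions"]).split("x")]
    arr = np.asarray(entry["values"], dtype=np.float32).reshape(dims)
    arr = arr.squeeze()  # 1x1xHxW -> HxW
    # Map the (assumed) [0, 1] probability map to 8-bit grayscale.
    return (np.clip(arr, 0.0, 1.0) * 255.0).astype(np.uint8)
```

Common failure modes at this step are reshaping with the wrong axis order (NCHW vs. NHWC), forgetting the squeeze, or scaling raw logits as if they were probabilities — any of which produces an image that “does not match at all” even though the engine itself ran correctly.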

I am trying to reproduce this on my end and will update you.