Samples to run a TLT-based ResNet18 classification model TensorRT engine file

I am trying to run a ResNet18 classification model .trt engine file using a standalone Python script. When I load and run the model, the results are quite different from those of the .tlt-based tlt-infer. I have optimized and generated .trt engines for FP32, FP16, and INT8 precisions, but all of these engines give me different results.
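For reference, a common cause of such mismatches is preprocessing that differs from what tlt-infer applies. Below is a minimal sketch of the kind of preprocessing I believe the model expects; the BGR channel order, CHW layout, and Caffe-style per-channel mean values are assumptions based on typical TLT classification models, not values confirmed from my training spec:

```python
import numpy as np

def preprocess(image_bgr: np.ndarray,
               input_shape=(3, 224, 224)) -> np.ndarray:
    """Prepare one image for a TensorRT classification binding.

    Assumptions (verify against your training spec): input is a
    resized BGR uint8 image in HWC layout, and the network expects
    CHW float32 with Caffe-style per-channel mean subtraction.
    """
    c, h, w = input_shape
    img = image_bgr.astype(np.float32)
    # HWC -> CHW, as TensorRT classification engines usually expect
    img = img.transpose(2, 0, 1)
    # Per-channel BGR mean subtraction (ImageNet means; an assumption)
    means = np.array([103.939, 116.779, 123.68], dtype=np.float32)
    img -= means.reshape(3, 1, 1)
    # Contiguous 1-D buffer, ready to copy into the input binding
    return np.ascontiguousarray(img.ravel())
```

If the engine was exported with different normalization (for example, scaling to [0, 1] or [-1, 1] instead of mean subtraction), this step alone can explain large differences between precisions and between tlt-infer and a standalone script.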

Can someone suggest a sample or blog post, and a way to understand this model's behavior?

The same topic as Inferring resnet18 classification etlt model with python
