TAO inference on INT8 image classifier

Please provide the following information when requesting support.

• Hardware (RTX3090)
• Network Type (Classification)
• TLT Version (latest)

Running the image classification Jupyter notebook, inference is done on the trained and retrained model. I want to evaluate how my model performs after exporting it to INT8. When changing the Jupyter notebook from

!tao classification inference -e $SPECS_DIR/classification_retrain_spec.cfg \
-m $USER_EXPERIMENT_DIR/output_retrain/weights/resnet_$EPOCH.tlt \

to

!tao classification inference -e $SPECS_DIR/classification_retrain_spec.cfg \
-m $USER_EXPERIMENT_DIR/export/final_model.etlt \ (or the created final_model.trt)

I get this error: “Invalid model file extension. /workspace/tao-experiments/classification/export/final_model.tlt”

Is it possible to run inference on an .etlt or .trt model, and if so, how?

You can run inference with DeepStream (Image Classification — TAO Toolkit 3.22.02 documentation),
or with the Triton app (GitHub - NVIDIA-AI-IOT/tao-toolkit-triton-apps: Sample app code for deploying TAO Toolkit trained models to Triton),
or you can search the TAO forum, where several topics cover this. For example:

Inferring resnet18 classification etlt model with python - #40 by Morganh
Error while running inference, model generated through TLT using Opencv-Python - #3 by Morganh
TAO tensorRT model inferencing using python
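Along the lines of the Python topics above, here is a minimal sketch of running a classification .trt engine directly with the TensorRT Python API. It assumes TensorRT and pycuda are installed on the machine where the engine was built, that the engine has a single input of shape 1x3x224x224 and a single softmax output, and that the file name `final_model.trt` matches your export; adjust shapes, preprocessing, and paths to your own spec file.

```python
# Hedged sketch: deserialize a TensorRT engine exported from TAO and run one
# inference. Input shape (1, 3, 224, 224) and the single-output assumption
# must match your classification_retrain_spec.cfg / export settings.
import numpy as np
import tensorrt as trt
import pycuda.autoinit  # noqa: F401 -- creates a CUDA context on import
import pycuda.driver as cuda

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def load_engine(path):
    # Deserialize the pre-built .trt engine (built on this same GPU/TRT version)
    with open(path, "rb") as f, trt.Runtime(TRT_LOGGER) as runtime:
        return runtime.deserialize_cuda_engine(f.read())

def infer(engine, image):
    # image: float32 array already preprocessed (resized, normalized) to the
    # engine's input shape, NCHW layout
    with engine.create_execution_context() as context:
        h_input = np.ascontiguousarray(image, dtype=np.float32)
        h_output = np.empty(trt.volume(engine.get_binding_shape(1)),
                            dtype=np.float32)
        d_input = cuda.mem_alloc(h_input.nbytes)
        d_output = cuda.mem_alloc(h_output.nbytes)
        cuda.memcpy_htod(d_input, h_input)          # host -> device
        context.execute_v2([int(d_input), int(d_output)])
        cuda.memcpy_dtoh(h_output, d_output)        # device -> host
        return h_output  # class probabilities from the softmax output

engine = load_engine("final_model.trt")
probs = infer(engine, np.random.rand(1, 3, 224, 224).astype(np.float32))
print("predicted class:", int(np.argmax(probs)))
```

Note that a .trt engine is tied to the GPU and TensorRT version it was built with, so run this where the engine was generated (e.g. inside the TAO container on the RTX 3090).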