Use TensorRT model with TAO Toolkit inference

For inference, there are usually three ways.

  1. TAO inference (the `tao <network> inference` task). Currently it can only run against a `.tlt` model, not a TensorRT engine.

  2. With DeepStream. Refer to
    Issue with image classification tutorial and testing with deepstream-app - #21 by Morganh
    (a rough nvinfer config sketch is also shown below this list).

  3. With Python inference. Refer to tao-toolkit-triton-apps/configuring_the_client.md at main · NVIDIA-AI-IOT/tao-toolkit-triton-apps (github.com) and Issue with image classification tutorial and testing with deepstream-app - #25 by dzmitry.babrovich (a standalone TensorRT Python sketch is also shown below this list).
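
For option 2, the exported model (or a prebuilt TensorRT engine) is consumed by the gst-nvinfer element through its config file. The snippet below is only a minimal sketch for a TAO classification model: the file names, key, input dimensions, preprocessing values, and thresholds are placeholders and must match your own training/export settings; see the gst-nvinfer section of the DeepStream documentation for the full property list.

```
[property]
gpu-id=0
# Preprocessing must match what was used during TAO training/export (placeholder values)
net-scale-factor=1.0
offsets=103.939;116.779;123.68
model-color-format=1
# Either point to the exported .etlt plus its key, or to an already-built TensorRT engine
tlt-encoded-model=./final_model.etlt
tlt-model-key=<your_key>
model-engine-file=./final_model.etlt_b1_gpu0_fp16.engine
labelfile-path=./labels.txt
infer-dims=3;224;224
batch-size=1
network-mode=2
# network-type=1 selects classifier mode
network-type=1
process-mode=1
```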
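For option 3, besides the Triton client in tao-toolkit-triton-apps, a TensorRT engine exported from TAO can also be run directly with the TensorRT Python API. The sketch below is only an illustration under these assumptions: a TensorRT 8.x environment with pycuda installed, a static-shape engine with one input and one output binding, and an input array that has already been preprocessed exactly as during training. `model.engine` and the dummy input are placeholders.

```python
import numpy as np
import pycuda.autoinit  # noqa: F401 -- creates a CUDA context on import
import pycuda.driver as cuda
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)


def load_engine(engine_path):
    """Deserialize a TensorRT engine file (e.g. built by tao-converter or trtexec)."""
    with open(engine_path, "rb") as f, trt.Runtime(TRT_LOGGER) as runtime:
        return runtime.deserialize_cuda_engine(f.read())


def infer(engine, input_array):
    """Run one batch through the engine; assumes one input and one output binding."""
    with engine.create_execution_context() as context:
        bindings, host_bufs, dev_bufs = [], [], []
        for binding in engine:
            shape = engine.get_binding_shape(binding)
            dtype = trt.nptype(engine.get_binding_dtype(binding))
            host_mem = cuda.pagelocked_empty(trt.volume(shape), dtype)
            dev_mem = cuda.mem_alloc(host_mem.nbytes)
            bindings.append(int(dev_mem))
            host_bufs.append(host_mem)
            dev_bufs.append(dev_mem)

        # Copy the preprocessed input in, execute synchronously, copy the output back out.
        np.copyto(host_bufs[0], input_array.ravel())
        cuda.memcpy_htod(dev_bufs[0], host_bufs[0])
        context.execute_v2(bindings)
        cuda.memcpy_dtoh(host_bufs[1], dev_bufs[1])
        return host_bufs[1]


if __name__ == "__main__":
    engine = load_engine("model.engine")              # placeholder path
    dummy = np.zeros((1, 3, 224, 224), np.float32)    # replace with real preprocessed data
    print(infer(engine, dummy))
```

The output buffer holds the raw network output (for classification, typically the softmax scores), so any post-processing such as class-map lookup still has to be applied on top.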