Custom parser for a model converted from TLT to a TensorRT engine

Please provide complete information as applicable to your setup.

• Hardware Platform (Jetson / GPU) T4
• DeepStream Version 6.
• JetPack Version (valid for Jetson only)
• TensorRT Version 11.4
• NVIDIA GPU Driver Version (valid for GPU only)

Can I get a parser example for TLT models? We are using Triton 6.01 to load the models and run inference.

TAO (TLT) model parser = TAO model description + UFF/ONNX model parser

The TAO model description is closed source.
The UFF/ONNX model parser is open source; you can find it in TensorRT/parsers at main · NVIDIA/TensorRT · GitHub. But for this, I think you can just use the parser built into TensorRT.
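Since the encrypted TAO description layer is closed source, the usual workflow is to convert the exported model offline and let TensorRT's own parser build the engine. A hedged sketch (file names, the model key, and the input dimensions below are placeholders for your own model):

```shell
# Sketch only: model.etlt, <your_model_key>, and the dims are placeholders.
# tao-converter (NVIDIA's closed-source binary) decrypts an exported .etlt
# model and builds a TensorRT engine; -k is the key used at export time,
# -d is the network input size in CHW order, -e is the engine output path:
tao-converter model.etlt -k <your_model_key> -d 3,544,960 -e model.engine

# For a plain ONNX model, trtexec (bundled with TensorRT) builds an engine
# using the open-source ONNX parser internally:
trtexec --onnx=model.onnx --saveEngine=model.engine --fp16
```

The resulting `model.engine` file can then be pointed to from the Triton/DeepStream model configuration, so no custom model parser is needed at inference time.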

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.