Convert .etlt model to .plan to deploy to Triton server

Please provide the following information when requesting support.

• Hardware (T4/V100/Xavier/Nano/etc)
Ubuntu, x64, RTX3090
• Network Type (Detectnet_v2/Faster_rcnn/Yolo_v4/LPRnet/Mask_rcnn/Classification/etc)
• TLT Version (Please run “tlt info --verbose” and share “docker_tag” here)
• Training spec file (if you have one, please share it here)
• How to reproduce the issue ? (This is for errors. Please share the command line and the detailed log here.)

I’m following tao-toolkit-triton-apps to try to convert my custom .etlt model (retrained from detectnet_v2) into a .plan model file.

I’m using the tao-converter downloaded from the link, but the file can’t be executed; it fails with this error:

./tao-converter
bash: ./tao-converter: cannot execute binary file: Exec format error
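An "Exec format error" usually means the binary was built for a different CPU architecture than the host (for example, the Jetson/aarch64 build of tao-converter on an x86_64 machine). A quick way to check, assuming the binary is in the current directory:

```shell
# "Exec format error" is usually an architecture mismatch.
# Compare the binary's target architecture with the host's:
file ./tao-converter   # reports the ELF target, e.g. x86_64 or aarch64
uname -m               # reports the host architecture
# Also make sure the file has the execute bit set:
chmod +x ./tao-converter
```

If the two architectures don't match, download the build of tao-converter that corresponds to your platform.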

Could you help me check? Also, is this tool the same as the `tao converter ...` command in the TAO Jupyter notebook?

If you follow GitHub - NVIDIA-AI-IOT/tao-toolkit-triton-apps: Sample app code for deploying TAO Toolkit trained models to Triton, there is no need to download tao-converter separately.

Also, can you share the link you mentioned for downloading tao-converter?

I already have a Triton server running that is a different version (21.12) from the sample, so I prefer to do the model conversion outside the Docker container. The tao-converter was downloaded from the link.
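For reference, once the tao-converter binary matches the platform, a detectnet_v2 conversion outside the container typically looks like the sketch below; the encoding key, input dimensions, and file names are placeholders to replace with your own values:

```shell
# Sketch: build a TensorRT .plan engine from a detectnet_v2 .etlt model.
# $KEY, the 3,544,960 input dims (C,H,W), and the file names are placeholders.
# output_cov/Sigmoid,output_bbox/BiasAdd are the usual detectnet_v2 output nodes.
./tao-converter resnet18_detector.etlt \
  -k $KEY \
  -d 3,544,960 \
  -o output_cov/Sigmoid,output_bbox/BiasAdd \
  -t fp16 \
  -e model.plan
```

Note that the resulting engine is tied to the TensorRT version and GPU it was built on, so it should be generated in an environment matching the Triton server that will load it.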

Can you share the link?
Did you download the correct version?

developer.nvidia.com/tao-converter

Please download from TensorRT — TAO Toolkit 3.21.11 documentation
Make sure to download the correct version for your platform.
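For context, once the .plan engine is generated, it goes into the Triton model repository that tao-toolkit-triton-apps points the server at. A minimal layout looks like this (the model name here is illustrative):

```
model_repository/
└── detectnet_v2/
    ├── config.pbtxt
    └── 1/
        └── model.plan
```

Triton loads version subdirectory `1/` and uses `config.pbtxt` for the input/output tensor configuration.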

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.