tlt-converter command not found on Jetson Nano

I installed tlt-converter and its supporting packages, and updated my paths according to https://docs.nvidia.com/metropolis/TLT/tlt-getting-started-guide/text/deploying_to_deepstream.html#instructions-for-jetson. I moved the .etlt file to the tlt_model_repo folder, but when I try to run tlt-converter I get a "command not found" error.

The TensorRT version on the Nano is 7.1.

Please run
$ chmod +x tlt-converter
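
The chmod fix works because a freshly downloaded binary usually lacks the execute bit, so the shell refuses to run it. A minimal sketch of the behavior, using a hypothetical stand-in script (not the real tlt-converter binary) in a temp directory:

```shell
# Stand-in for a downloaded binary: a tiny script without the execute bit.
tmpdir=$(mktemp -d)
printf '#!/bin/sh\necho ok\n' > "$tmpdir/tlt-converter"

# Not executable yet: invoking it fails (Permission denied).
"$tmpdir/tlt-converter" 2>/dev/null || echo "cannot run yet"

# Grant the execute bit, as suggested above, and it runs.
chmod +x "$tmpdir/tlt-converter"
"$tmpdir/tlt-converter"      # prints: ok

rm -rf "$tmpdir"
```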


I did that just now. Thank you, @Morganh, that issue is solved.

However, the conversion itself now fails:

sudo ./tlt-converter /opt/nvidia/deepstream/deepstream-5.0/samples/trtis_model_repo/barnet_models/resnet18_detector.etlt -k MS00MWFkLWFkY2ItZjM1NTZiN2U4MmFjNmk4Mzl1ZW4zcXNqZHZqM3AwbmRoOThl -c /opt/nvidia/deepstream/deepstream-5.0/samples/trtis_model_repo/barnet_models/calibration.bin -o output_bbox/BiasAdd,output_cov/Sigmoid -d 3,1072,1920 -i nchw -m 64 -t int8 -e /opt/nvidia/deepstream/deepstream-5.0/samples/trtis_model_repo/barnet_models/resnet18_detector.trt -b 4
[ERROR] UffParser: Could not parse MetaGraph from /tmp/fileZ8ovo6
[ERROR] Failed to parse the model, please check the encoding key to make sure it’s correct
[ERROR] Network must have at least one output
[ERROR] Network validation failed.
[ERROR] Unable to create engine
Segmentation fault
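
The "check the encoding key" error above typically means the -k value does not match the key the .etlt model was exported with. As a sketch only (not runnable as-is: it needs the tlt-converter binary and model files; `<NGC_KEY>` is a placeholder for whatever key was actually used at export time, and `MODEL_DIR` is just a shorthand for the paths in the command above):

```shell
# -k must be the exact key used when exporting the .etlt model.
# -e takes only the output engine path, and -b and its value are separate tokens.
MODEL_DIR=/opt/nvidia/deepstream/deepstream-5.0/samples/trtis_model_repo/barnet_models

sudo ./tlt-converter "$MODEL_DIR/resnet18_detector.etlt" \
    -k <NGC_KEY> \
    -c "$MODEL_DIR/calibration.bin" \
    -o output_bbox/BiasAdd,output_cov/Sigmoid \
    -d 3,1072,1920 \
    -i nchw \
    -m 64 \
    -t int8 \
    -b 4 \
    -e "$MODEL_DIR/resnet18_detector.trt"
```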