TLT-Converter on Jetson Nano error

Hello, I tried to run tlt-converter on my Jetson Nano, but I got this error:

[ERROR] UffParser: Could not open /tmp/filesY6zOT
[ERROR] Failed to parse uff model
[ERROR] Network must have at least one output
[ERROR] Unable to create engine
Segmentation fault (core dumped)

This is the command I ran:

./tlt-converter -k dWhrajZsbWtobW8wZ2UycmhnaDdqZmw3cGg6MWNhZGU2NTYtNjA5Yy00ZWQ0LTgxZTktYzE4ZmZkOWI4NWI1 -o output_cov/Sigmoid,output_bbox/BiasAdd -d 3,720,1280 -e /home/deepstream/Desktop/TA/resnet18_detector_baru.engine /home/deepstream/Desktop/TA/resnet18_detector_fp16.etlt

Do you have any idea?

Hi m.billson16,
Please run the command below and paste the result here.
$ md5sum ./tlt-converter

Hello Morganh, this is the result of md5sum ./tlt-converter:

46de17fd216e0364a1fccead6d68707b ./tlt-converter

Please go through the two tickets below for pointers, and confirm the following:
https://devtalk.nvidia.com/default/topic/1066887/transfer-learning-toolkit/tlt-converter-error-uffparser-and-nbsp-nvdsinfer-error-nvdsinfer_custom_lib_failed-deepstream/post/5403416/#5403416
https://devtalk.nvidia.com/default/topic/1065680/transfer-learning-toolkit/tlt-converter-uff-parser-error/post/5397071/#5397071

  1. $KEY was actually set when you trained the etlt model, and it is correct.
  2. The key you pass to tlt-converter is correct. It must be exactly the same key used in the TLT training phase.
  3. The etlt model file exists and is readable.
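One way to avoid a key mismatch between the two phases is to export the key once and reference the same variable in both commands. The sketch below is illustrative only; the model paths and the `tlt-train` invocation are placeholders, not taken from this thread:

```shell
# Set the NGC API key once, in one place.
export KEY='<your NGC API key>'

# Training phase (TLT): the same $KEY encrypts the exported .etlt model.
# tlt-train detectnet_v2 ... -k $KEY ...

# Conversion on Jetson: pass the identical $KEY to decrypt the model.
# ./tlt-converter \
#     -k $KEY \
#     -o output_cov/Sigmoid,output_bbox/BiasAdd \
#     -d 3,720,1280 \
#     -e resnet18_detector.engine \
#     resnet18_detector.etlt
```

Since the UffParser error above is what tlt-converter reports when decryption of the etlt file fails, a wrong or unset key is the first thing to rule out.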

Hello Morganh, I have solved my problem; it was because I entered the wrong API key.
Thank you so much for the help!