[ERROR] UffParser: Could not parse MetaGraph from /tmp/fileXd5o6R
[ERROR] Failed to parse the model, please check the encoding key to make sure it’s correct
[ERROR] Network must have at least one output
[ERROR] Network validation failed.
[ERROR] Unable to create engine
Segmentation fault (core dumped)
I have built TensorRT OSS on Jetson (ARM64) as instructed for use with YOLO, but I can't get past these errors.
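For context, this is roughly how I try to get tlt-converter to pick up the OSS-built plugin library on the Jetson. The path below is an assumption based on my build directory, so adjust it to wherever your TensorRT OSS build put libnvinfer_plugin:

```shell
# Preload the OSS-built plugin library (it carries BatchTilePlugin/BatchedNMS)
# so tlt-converter uses it instead of the stock JetPack libnvinfer_plugin.
# NOTE: the path is an assumption from my build; point it at your own output.
export TRT_OSS_PLUGIN=/home/nx/TensorRT/build/out/libnvinfer_plugin.so.7

# Invocation sketch (not executed here), e.g.:
#   LD_PRELOAD="$TRT_OSS_PLUGIN" ./tlt-converter -k <KEY> -d 3,384,1248 \
#     -o BatchedNMS -t fp16 -i nchw yolo_resnet18_epoch_100.etlt
echo "Preloading: $TRT_OSS_PLUGIN"
```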
I forgot to mention that I get the same errors when I run tlt-converter on my Ubuntu x86 machine from inside the Transfer Learning Toolkit Docker container, using the TLT YOLO example.
I am still having issues when running tlt-converter with a YOLO model retrained with TLT v2. I cleared out all Docker containers on my development machine, pulled the TLT Docker container, got a new key, and then ran the YOLO example in the TLT Jupyter notebook. The tlt-converter works on my development machine but not on the NX. I know the key is correct because the same key works fine with DetectNet and tlt-converter.
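The reset sequence above, as a dry-run shell sketch; the container image tag is an assumption on my part (check NGC for the current one), so the commands are only echoed here rather than executed:

```shell
# Dry-run sketch of clearing containers and re-pulling the TLT container.
# The image tag is an assumption; substitute the one listed on NGC.
run() { echo "+ $*"; }   # swap for: run() { "$@"; } to actually execute

run docker rm -f '$(docker ps -aq)'                           # clear all containers
run docker pull nvcr.io/nvidia/tlt-streamanalytics:v2.0_py3   # re-pull TLT image
run docker run --gpus all -it nvcr.io/nvidia/tlt-streamanalytics:v2.0_py3
# ...then re-run the YOLO example notebook inside the container with the new key.
```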
~/tlt_7.1$ ./tlt-converter -k ajdqdnVicTU4Mm0wcGg0OWoyMDI0NmJrMTQ6NjE1OGViN2ItOGY2My00ZTMzLWE3OWYtYWZmODBjN2VhYjU2 -d 3,384,1248 -o BatchedNMS -e /home/nx/tlt_7.1/export/trt.engine -m 1 -t fp16 -i nchw /home/nx/tlt_7.1/export/yolo_resnet18_epoch_100.etlt
[ERROR] UffParser: Validator error: FirstDimTile_2: Unsupported operation _BatchTilePlugin_TRT
[ERROR] Failed to parse the model, please check the encoding key to make sure it’s correct
[ERROR] Network must have at least one output
[ERROR] Network validation failed.
[ERROR] Unable to create engine
Segmentation fault (core dumped)