Unable to convert TrafficCamNet pruned model with tlt-converter

Hello there, I have been trying to use the pruned version of TrafficCamNet that I downloaded from NGC. As per the official documentation, I attempted to convert the .etlt model with the TLT converter. Initially I assumed that the key I had to provide would be my NGC API key, but I later learned from the forums that the default key for pruned models is 'nvidia_tlt'. However, neither the in-house DeepStream TRT converter nor the TLT converter was able to convert it.

Following is the command I used, with the argument values:
./tlt-converter -k nvidia_tlt -o output_bbox/BiasAdd,output_cov/Sigmoid -d 3,544,960 -e resnet18_trafficcamnet.engine resnet18_trafficcamnet_pruned.etlt

I have been getting this error output:
[ERROR] UffParser: Could not parse MetaGraph from /tmp/fileJby2ib
[ERROR] Failed to parse the model, please check the encoding key to make sure it’s correct
[ERROR] Network must have at least one output
[ERROR] Network validation failed.
[ERROR] Unable to create engine
Segmentation fault (core dumped)

Am I doing something wrong?

Your key is not correct.
See How to run tlt-converter

The unpruned and pruned models are encrypted and will only operate with the following key:

Model load key: tlt_encode
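For reference, here is what the corrected invocation would look like with the tlt_encode key. This is a sketch based on the command quoted above; the file names and output paths are the ones from the original post, and the input dimensions (3,544,960 in CHW order) match the TrafficCamNet model card.

```shell
# Convert the pruned TrafficCamNet .etlt model to a TensorRT engine.
# -k : model load key for NGC pretrained models (tlt_encode, not nvidia_tlt)
# -o : output node names for DetectNet_v2 (bbox regression and coverage heads)
# -d : input dimensions in C,H,W order
# -e : path where the generated TensorRT engine will be written
./tlt-converter \
  -k tlt_encode \
  -o output_bbox/BiasAdd,output_cov/Sigmoid \
  -d 3,544,960 \
  -e resnet18_trafficcamnet.engine \
  resnet18_trafficcamnet_pruned.etlt
```

Note that the "UffParser: Could not parse MetaGraph" error is the typical symptom of a wrong decryption key: the converter cannot decode the .etlt payload, so parsing fails before any network outputs are found.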


Thanks a ton Morgan. It worked.