TLT to TensorRT engine

I am trying to convert the peoplenet.tlt model to a TensorRT engine so I can add it to a preexisting Python pipeline. My understanding is that I have to convert my .tlt file to a .etlt file with the tlt-export tool, and then use tlt-converter to turn the .etlt into a TensorRT engine. Is that right?
At this point, running inside the TLT docker image, I keep getting error messages complaining about the encoding key (I did register the API key earlier):

When using tlt-converter:
[ERROR] Failed to parse the model, please check the encoding key to make sure it’s correct

When using tlt-export:
OSError: Invalid decryption. Unable to open file (file signature not found). The key used to load the model is incorrect.

What am I missing?

According to https://ngc.nvidia.com/catalog/models/nvidia:tlt_peoplenet, the key should be:

  • Model load key: tlt_encode
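For reference, here is a sketch of the two-step conversion as I understand it. The file names are placeholders, and the input dimensions and output node names (`output_cov/Sigmoid`, `output_bbox/BiasAdd`) are what I gathered from the PeopleNet model card for the DetectNet_v2 architecture, so they may need adjusting:

```shell
KEY=tlt_encode

# Step 1: .tlt -> .etlt (run inside the TLT container).
# The same key must be passed here and at conversion time.
tlt-export detectnet_v2 \
    -m resnet34_peoplenet.tlt \
    -k $KEY \
    -o resnet34_peoplenet.etlt

# Step 2: .etlt -> TensorRT engine (run on the deployment machine,
# since the engine is specific to the target GPU).
tlt-converter resnet34_peoplenet.etlt \
    -k $KEY \
    -d 3,544,960 \
    -o output_cov/Sigmoid,output_bbox/BiasAdd \
    -e resnet34_peoplenet.engine
```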