tlt-converter not working on Jetson Nano with JetPack 4.3 Image

For the last couple of days I was struggling to convert a model from .etlt, trained by following the Transfer Learning Toolkit Jupyter notebook. I searched everywhere for my problem and just couldn't make it work.

I was able to get tlt-converter to work when I downgraded my JetPack version to 4.2.3 with TensorRT 5 installed.
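For reference, this is the kind of invocation that worked for me on JetPack 4.2.3. It is a sketch only: the key, file names, and input dimensions are placeholders, and the output node names assume a DetectNet_v2 resnet10 model as used in the TLT notebook.

```
# Convert the .etlt model to a TensorRT engine on the Jetson
# (run on the target device, since engines are not portable).
# -k : encoding key used during TLT training (placeholder here)
# -d : input dimensions as C,H,W
# -o : comma-separated output node names (DetectNet_v2 resnet10 assumed)
# -e : path of the TensorRT engine file to write
./tlt-converter \
  -k YOUR_TLT_KEY \
  -d 3,320,480 \
  -o output_cov/Sigmoid,output_bbox/BiasAdd \
  -e resnet10_detector.engine \
  resnet10_detector.etlt
```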

Hopefully this helps anybody struggling, or helps to get tlt-converter working with TensorRT 6.

Dear NVIDIA TLT-Team,

Please provide a tlt-converter build based on TensorRT 6 so that it works on JetPack 4.3. JetPack 4.3 ships TensorRT 6 and therefore does not provide the TensorRT 5 libraries required to run the current version of tlt-converter.
I would like to use the engines with DeepStream 4.0.2 (TensorRT 6).

Thanks

Using the following parameters in the nvinfer configuration file of a DeepStream 4.0.2 application replaces tlt-converter, because the TLT-encoded model is automatically converted into a TensorRT engine.

Example values only:

tlt-encoded-model=./resnet10_detector.etlt
tlt-model-key=#### (key intentionally removed)
uff-input-dims=3;320;480;0
uff-input-blob-name=input_1
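To show where these lines belong, here is a sketch of a minimal [property] group for nvinfer. The file names, key, class count, and output blob names are assumptions (a DetectNet_v2 resnet10 detector, as in the TLT notebook); model-engine-file is optional but lets DeepStream reuse the generated engine on later runs instead of rebuilding it.

```ini
[property]
gpu-id=0
# TLT-encoded model and the key it was exported with (placeholders)
tlt-encoded-model=./resnet10_detector.etlt
tlt-model-key=YOUR_TLT_KEY
# Input dims as C;H;W;input-order and the UFF input blob name
uff-input-dims=3;320;480;0
uff-input-blob-name=input_1
# Output blobs assumed for a DetectNet_v2 resnet10 model
output-blob-names=output_cov/Sigmoid;output_bbox/BiasAdd
# If this file exists it is loaded; otherwise it is built from the .etlt
model-engine-file=./resnet10_detector.etlt_b1_fp16.engine
batch-size=1
# 0=FP32, 1=INT8, 2=FP16
network-mode=2
num-detected-classes=3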

If there is any TLT issue, please open a topic in the TLT forum: https://devtalk.nvidia.com/default/board/417/transfer-learning-toolkit/