TLT to Python

Hello,
I am currently working with TLT and want to export my model for use in Python, then in C++ (with TensorRT).
The model I am using is YOLOv3.

I am having trouble finding documentation about that export step.
Is there a page or blog post on the topic?
Thanks

The tlt-export tool generates an .etlt model from your .tlt model. In INT8 mode, a calibration cache (cal.bin) is also generated.
https://docs.nvidia.com/metropolis/TLT/tlt-getting-started-guide/index.html#exporting_models
The tlt-converter tool then generates a TensorRT engine from the .etlt model.
https://docs.nvidia.com/metropolis/TLT/tlt-getting-started-guide/index.html#gen_eng_tlt_converter
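If it helps, once tlt-converter has produced an engine file you can sanity-check it from Python with the TensorRT Python API before moving to C++. A minimal sketch, assuming TensorRT 7.x is installed and using a hypothetical engine path:

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def load_engine(engine_path):
    """Deserialize a TensorRT engine file produced by tlt-converter."""
    with open(engine_path, "rb") as f, trt.Runtime(TRT_LOGGER) as runtime:
        return runtime.deserialize_cuda_engine(f.read())

# Path is hypothetical; use the -e path you passed to tlt-converter.
engine = load_engine("yolov3_resnet18.engine")

# List the input/output bindings so you know what the engine expects.
for i in range(engine.num_bindings):
    kind = "input" if engine.binding_is_input(i) else "output"
    print(i, kind, engine.get_binding_name(i),
          engine.get_binding_shape(i), engine.get_binding_dtype(i))
```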

From what I read, those engines don’t work with TRT 7.1?

What is the “UI” you mention?

typo

It should work with TRT 7.1.

To be clear, I was planning to use the engine in code adapted for YOLOv3, based on this post: https://developer.nvidia.com/blog/speeding-up-deep-learning-inference-using-tensorflow-onnx-and-tensorrt/

The blog you mention is not about TLT.
When I said “It should work with TRT 7.1”, I meant that if you install TRT 7.1 on your Jetson device, the tlt-converter tool will work.

Yep, I was just referring to the part about loading the engine in TRT; TLT gives me that engine.

Refer to Import tlt model in python code
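For reference, a minimal sketch of running inference on that engine from Python, assuming TensorRT 7.x with pycuda, a single input binding, and hypothetical file names and input data (decoding the YOLOv3 outputs is omitted):

```python
import numpy as np
import pycuda.autoinit  # noqa: F401 - creates a CUDA context
import pycuda.driver as cuda
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

# Deserialize the engine produced by tlt-converter (path is hypothetical).
with open("yolov3_resnet18.engine", "rb") as f, trt.Runtime(TRT_LOGGER) as runtime:
    engine = runtime.deserialize_cuda_engine(f.read())

context = engine.create_execution_context()
stream = cuda.Stream()

# Allocate pinned host buffers and device buffers for every binding.
host_bufs, dev_bufs, bindings = [], [], []
for i in range(engine.num_bindings):
    shape = engine.get_binding_shape(i)
    dtype = trt.nptype(engine.get_binding_dtype(i))
    host = cuda.pagelocked_empty(trt.volume(shape), dtype)
    dev = cuda.mem_alloc(host.nbytes)
    host_bufs.append(host)
    dev_bufs.append(dev)
    bindings.append(int(dev))

# Copy a preprocessed image into the input binding (assumed to be binding 0);
# random data is used here as a placeholder.
image = np.random.random(host_bufs[0].shape).astype(host_bufs[0].dtype)
np.copyto(host_bufs[0], image.ravel())
cuda.memcpy_htod_async(dev_bufs[0], host_bufs[0], stream)

# Run inference; engines built with an explicit batch dimension would use
# context.execute_async_v2 instead.
context.execute_async(batch_size=1, bindings=bindings, stream_handle=stream.handle)

# Copy the raw outputs back to the host.
for i in range(1, engine.num_bindings):
    cuda.memcpy_dtoh_async(host_bufs[i], dev_bufs[i], stream)
stream.synchronize()

print([host_bufs[i].shape for i in range(1, engine.num_bindings)])
```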