Please provide complete information as applicable to your setup.
• Hardware Platform (Jetson / GPU) agx orin
• DeepStream Version 6.2
• JetPack Version (valid for Jetson only) 5.1
• TensorRT Version 8.5.2
Hello, I understand that DeepStream config files only support .etlt files. I have a custom model that I successfully converted to a .trt engine, but is there a specific way to convert this model from .trt to .etlt so it is supported by DeepStream?
As far as I know, the conversion from an .etlt model file to a TensorRT engine file is one-way; you cannot convert it back.
I have a model.onnx file; is there a way to convert it to .etlt? Following the NVIDIA documentation generates a .trt file, not an .etlt file.
Why do you want to convert an ONNX model to an etlt model? An etlt model is a specially encrypted TAO model, and the encryption method is not public.
If you want to use an ONNX model in DeepStream, you can use it directly (refer to the onnx-file option in the nvinfer config).
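A minimal sketch of the relevant nvinfer config entries, assuming a hypothetical model file name (model.onnx) and a single-class FP16 detector; adjust paths and values to your setup:

```ini
[property]
gpu-id=0
# Point nvinfer at the ONNX model directly; DeepStream/TensorRT
# builds an engine from it on first run.
onnx-file=model.onnx
# Optional: where the generated engine is cached and reused on later runs.
model-engine-file=model.onnx_b1_gpu0_fp16.engine
batch-size=1
# 0=FP32, 1=INT8, 2=FP16
network-mode=2
num-detected-classes=1
```

On the first run DeepStream will spend time building the engine; subsequent runs load the cached model-engine-file directly.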
But won’t using the TRT version of the model make it more optimized? Or will I get the same performance with the .onnx file?
ONNX and TLT are model formats; the TensorRT engine is created from the model, and performance is evaluated on the engine file. When you give nvinfer an ONNX file, it builds a TensorRT engine from it on first run, so the runtime performance is the same as with a pre-built engine.
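If you prefer to build the engine yourself rather than letting nvinfer do it on first run, the trtexec tool that ships with TensorRT can convert an ONNX model to an engine; a sketch, assuming the file names model.onnx and model.engine:

```
# Build a TensorRT engine from an ONNX model (FP16 precision).
# Run this on the same device (e.g. the AGX Orin) that will do inference,
# since engines are specific to the GPU and TensorRT version.
trtexec --onnx=model.onnx --saveEngine=model.engine --fp16
```

The resulting engine can then be referenced via model-engine-file in the nvinfer config.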
This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.