DeepStream .etlt to .engine and TAO Toolkit

Hi,

Recently, I have been experimenting with model conversion in DeepStream. My understanding is that Triton and the TAO Toolkit are used for this task. However, I have started to question the role of the TAO Toolkit in converting .etlt models to .engine files, since I have seen that this conversion can be performed automatically through a config file.

I tried building a Docker image without the TAO Toolkit (I believe I succeeded, as I do not see any files or references to the TAO Toolkit in it), and I was able to complete the conversion successfully using the config file below.

config_etlt.txt (1.5 KB)
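For context, the setup follows the usual Gst-nvinfer pattern for an encrypted .etlt model, roughly like the sketch below. The paths, model key, class count, and blob names here are placeholders for illustration only, not the exact contents of the attached file:

```
# Illustrative Gst-nvinfer config for an encrypted .etlt model
# (paths, key, and dimensions are placeholders, not my actual values)
[property]
gpu-id=0
net-scale-factor=0.0039215697906911373
# encrypted TAO/TLT model and the key used to decode it (placeholders)
tlt-encoded-model=/models/resnet18_detector.etlt
tlt-model-key=nvidia_tlt
# engine file that nvinfer builds automatically on first run if missing
model-engine-file=/models/resnet18_detector.etlt_b1_gpu0_fp16.engine
labelfile-path=/models/labels.txt
batch-size=1
# 0=FP32, 1=INT8, 2=FP16
network-mode=2
num-detected-classes=3
interval=0
gie-unique-id=1
uff-input-blob-name=input_1
output-blob-names=output_bbox/BiasAdd;output_cov/Sigmoid
```

With a config like this, the nvinfer plugin appears to build the .engine file itself (via TensorRT) the first time the pipeline runs, which is exactly why I am wondering whether the TAO Toolkit is needed at all for this step.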

Can I use only the Triton Inference Server and TensorRT for this task, without the TAO Toolkit?