With the new NVIDIA TAO Toolkit v5.0.0, when I retrain a .tlt model the exported model is an .onnx file. In previous versions the export produced an .etlt file, but now when I try to set the output to .etlt the model still comes out as etlt.onnx. Should I downgrade my TAO Toolkit version, or is there something I can do in the newer toolkit to obtain an .etlt file as the final model?
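For clarity, this is the kind of downgrade I had in mind. I am assuming the launcher is the `nvidia-tao` pip package and that pinning a 4.x launcher would bring back the .etlt export; the exact version number below is only an example, not something I have verified:

```
# Assumption: the TAO launcher is the pip package "nvidia-tao" and a 4.x
# launcher maps to the 4.x containers that still exported .etlt files.
# The version number is an example -- I have not confirmed which one I need.
pip3 install nvidia-tao==4.0.1

# Check which container versions the installed launcher maps to
tao info --verbose
```

Is this downgrade path reasonable, or is there a supported way in 5.0.0 to get an .etlt instead?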
I managed to run Docker on the EC2 instance and then imported the notebooks into a virtual environment inside the Docker container. However, I'm now facing an issue: I can't access the notebooks from the browser the way I could before.
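In case it helps, this is roughly the setup I am aiming for; the image name and port are placeholders rather than my exact commands, so please treat it as a sketch:

```
# Publish the Jupyter port when starting the container
# (image name is a placeholder; I'm assuming the default port 8888).
docker run -it --rm --gpus all -p 8888:8888 <my-tao-image> /bin/bash

# Inside the container, bind Jupyter to all interfaces so it is
# reachable from outside the container:
jupyter notebook --ip=0.0.0.0 --port=8888 --no-browser --allow-root

# From my laptop I then open http://<ec2-public-ip>:8888
# (port 8888 also has to be allowed in the EC2 security group).
```

Is there anything obviously missing in that flow (port mapping, the --ip binding, or the security group) that would explain why the browser can no longer reach the notebooks?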