How to deploy an etlt file in NVIDIA TAO Toolkit v5

• Network Type: DetectNet_v2

With the new NVIDIA TAO Toolkit v5.0.0, retraining a .tlt model now produces an .onnx output. In previous versions the model was exported as .etlt, but when I try to export to .etlt now the file comes out as etlt.onnx. Should I downgrade the TAO Toolkit version, or is there something I can do in the newer toolkit to obtain an .etlt as the final model?

If you want to obtain the .etlt model, you can run the previous docker container directly:
$ docker run --runtime=nvidia -it --rm nvcr.io/nvidia/tao/tao-toolkit:4.0.1-tf1.15.5 /bin/bash
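Inside that container you can then run the export as in earlier releases. A minimal sketch for DetectNet_v2, assuming hypothetical paths under /workspace and your encryption key in $KEY (flags can vary between releases, so check detectnet_v2 export -h inside the container):

$ detectnet_v2 export \
      -m /workspace/experiments/weights/resnet18_detector.tlt \
      -k $KEY \
      -o /workspace/experiments/export/resnet18_detector.etlt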

When I tried to run the docker container I got the following error:

docker: Error response from daemon: Unknown runtime specified nvidia.

Please install nvidia-docker2 first:
$ sudo apt-get install -y nvidia-docker2
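For reference, a minimal sketch of the usual fix on an Ubuntu host, assuming the NVIDIA container repository is already configured: install the package, restart the Docker daemon, and confirm the nvidia runtime is registered.

$ sudo apt-get update
$ sudo apt-get install -y nvidia-docker2
$ sudo systemctl restart docker
$ docker info | grep -i runtimes   # "nvidia" should appear in the list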

I managed to run Docker on the EC2 instance and then imported the notebooks into a virtual environment within Docker. However, I’m currently facing an issue where I can’t access the notebooks via the browser as I could previously.
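In case it helps, the browser access problem is usually port forwarding rather than TAO itself. A minimal sketch, assuming the default Jupyter port 8888, that the EC2 security group allows inbound traffic on that port, and that Jupyter is available inside the container (pip install notebook if it is not):

$ docker run --runtime=nvidia -it --rm -p 8888:8888 \
      nvcr.io/nvidia/tao/tao-toolkit:4.0.1-tf1.15.5 /bin/bash
# inside the container:
$ jupyter notebook --ip 0.0.0.0 --port 8888 --allow-root --no-browser

Then open http://<EC2-public-IP>:8888 in the browser, using the token printed in the container logs.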
