Cannot convert .etlt to .engine when trying to use FaceDetect

I am using deepstream7.1-multiarch with TensorRT 10. I am trying to integrate the FaceDetect model, but I cannot find a way to convert the .etlt file. I tried TAO Deploy, but the TensorRT versions weren't compatible. Also, the nvidia-tao-deploy Python package fails to install.

Actually, the .etlt model of FaceDetect (which is really a detectnet_v2 network) was encrypted from a .uff file instead of an .onnx file, so after decryption it yields a .uff file.
So, for the detectnet_v2 network, you cannot convert NGC's .etlt file to an .onnx file.
Instead, you can download the .tlt model (note: not the .etlt model), then use the TAO 5.0 docker to export the .tlt file to an .onnx file.
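The export step above can be sketched as follows. This is a minimal sketch, assuming a TAO 5.0 container tag, local paths, model filename, experiment spec, and encryption key ($KEY) that are all placeholders you would replace with your own; check NGC for the exact container tag and the FaceDetect model card for the key.

```shell
# Pull a TAO 5.0 toolkit container (tag is an example; verify the current one on NGC)
docker pull nvcr.io/nvidia/tao/tao-toolkit:5.0.0-tf1.15.5

# Run the detectnet_v2 export inside the container.
# /path/to/models, facedetect.tlt, experiment_spec.txt, and $KEY are placeholders.
docker run --rm --gpus all -v /path/to/models:/workspace \
  nvcr.io/nvidia/tao/tao-toolkit:5.0.0-tf1.15.5 \
  detectnet_v2 export \
    -m /workspace/facedetect.tlt \
    -k "$KEY" \
    -e /workspace/experiment_spec.txt \
    -o /workspace/facedetect.onnx
```

The resulting .onnx file can then be converted to a TensorRT engine on the target device (for example via trtexec or by letting DeepStream build the engine at startup).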

Please refer to Cannot use TAO Deploy in Jetson AGX Orin - #5 by Morganh.

There has been no update from you for a while, so we assume this is no longer an issue and are closing this topic. If you need further support, please open a new one. Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.