I am training my yolo_v3 model in the TAO framework and
deploying it in the DeepStream framework.
After training, the model is saved in .hdf5 format.
The documentation has a section on converting the .hdf5 model to .onnx format and then to a trt.engine file, which can be used in DeepStream.
I cannot see any mention of .tlt or .etlt model formats anywhere.
How do I generate or convert a .tlt or .etlt model?
Are the .tlt and .etlt model formats still supported in the latest TAO version?
I am using TAO - 5.0.0.
You can deploy the .onnx model or the trt.engine file in DeepStream.
There is no need to convert to an .etlt model.
The .tlt file is actually an encrypted hdf5 file, and the .etlt file is an encrypted onnx file. In TAO 5.0 and later versions the source code is open, so .tlt and .etlt are deprecated.
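As a rough sketch of that workflow under TAO 5.0 (command names, flags, and paths below are illustrative assumptions; check the export section of your TAO version's docs for the exact invocation):

```shell
# Export the trained .hdf5 checkpoint to ONNX.
# Paths and the spec file are placeholders for your own setup.
tao model yolo_v3 export \
    -m /workspace/results/yolov3_model.hdf5 \
    -e /workspace/specs/yolo_v3_train.txt \
    -o /workspace/export/yolov3_model.onnx

# Build a TensorRT engine from the ONNX file on the deployment machine,
# e.g. with trtexec (shipped with TensorRT).
trtexec --onnx=/workspace/export/yolov3_model.onnx \
        --saveEngine=/workspace/export/yolov3_model.engine \
        --fp16
```

Alternatively, DeepStream's nvinfer plugin can build the engine itself on first run if the config points `onnx-file` at the exported model.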
I have trained my yolo_v3 model in the TAO framework and copied it into the DeepStream container. Now I am getting the error below when I try to run the deepstream-app:
ERROR: yoloV3 output layer.size: 4 does not match mask.size: 3
0:00:02.780454259 337 0x5952eedbb000 ERROR nvinfer gstnvinfer.cpp:640:gst_nvinfer_logger:<primary_gie> NvDsInferContext[UID 1]: Error in NvDsInferContextImpl::fillDetectionOutput() <nvdsinfer_context_impl_output_parsing.cpp:726> [UID = 1]: Failed to parse bboxes using custom parse function
Segmentation fault (core dumped)
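This error usually means the generic DeepStream YOLOv3 bbox parser is in use: it checks that the number of output layers matches the number of anchor mask groups (3), but a TAO-exported YOLOv3 ONNX model typically ends in a BatchedNMS layer with 4 output tensors, so the check fails. A hedged sketch of the relevant nvinfer config properties, assuming the custom parser from the deepstream_tao_apps repository is built and available (the library path, class count, and file names below are placeholders):

```ini
# Hypothetical nvinfer config fragment for a TAO-exported YOLOv3 model.
# parse-bbox-func-name/custom-lib-path assume the TAO parser from
# deepstream_tao_apps, not the objectDetector_Yolo sample parser.
[property]
onnx-file=yolov3_model.onnx
model-engine-file=yolov3_model.engine
network-type=0
num-detected-classes=3
parse-bbox-func-name=NvDsInferParseCustomBatchedNMSTLT
custom-lib-path=/opt/nvidia/deepstream/libnvds_infercustomparser_tao.so
```

If your config currently sets a mask/anchor-based parse function from the darknet YOLO sample, switching to the TAO parser is the usual fix; verify the exact function and library names against the deepstream_tao_apps sources for your release.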