Converting a TensorFlow model to TensorRT takes very long

Hi,

I am using the code from GitHub - theAIGuysCode/tensorflow-yolov4-tflite (YOLOv4, YOLOv4-tiny, YOLOv3, YOLOv3-tiny implemented in TensorFlow 2.0 and Android; converts YOLOv4 .weights to TensorFlow, TensorRT, and TFLite).

I want to try the TensorRT model, since the regular TensorFlow model has high latency and poor FPS.

When I convert the model to the TensorRT format, it takes about 4 hours and doesn't even finish correctly. The last message from running this command:

python convert_trt.py --weights ./checkpoints/yolov4.tf --quantize_mode float16 --output ./checkpoints/yolov4-trt-fp16-416

is “done converting tf trt”, but the process kept running after that; after another 2 hours I just stopped it manually.
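For reference, my understanding is that convert_trt.py does something along the lines of the TF-TRT conversion below (this is only a sketch assuming it uses TensorFlow's TrtGraphConverterV2; the paths mirror my command, but the actual script may differ):

# Sketch of a TF-TRT FP16 conversion (assumption: convert_trt.py does something similar)
from tensorflow.python.compiler.tensorrt import trt_convert as trt

SAVED_MODEL_DIR = "./checkpoints/yolov4.tf"       # input SavedModel (as in my command)
OUTPUT_DIR = "./checkpoints/yolov4-trt-fp16-416"  # converted output

params = trt.DEFAULT_TRT_CONVERSION_PARAMS._replace(
    precision_mode="FP16",             # corresponds to --quantize_mode float16
    max_workspace_size_bytes=1 << 30,  # 1 GB workspace; the Jetson GPU shares RAM with the CPU
)

converter = trt.TrtGraphConverterV2(
    input_saved_model_dir=SAVED_MODEL_DIR,
    conversion_params=params,
)
converter.convert()         # replaces supported subgraphs with TRT engine ops
converter.save(OUTPUT_DIR)  # TensorRT engines may still be built lazily at first inference

If the script also pre-builds the engines (converter.build()), that step alone can take a long time on the Xavier NX, which might explain part of the delay.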

I am using:
- Jetson Xavier NX Dev Kit
- TensorFlow 2.2.0
- TensorRT 7.1.3

Is there something that I'm doing wrong?

Hi,

Sorry, we are not familiar with the convert_trt.py implementation.
But have you tried running the command with python3?
Our TensorFlow package is only available for Python 3.
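For example, you could first confirm that TensorFlow is importable under python3, then rerun the conversion with the same arguments as your original command:

python3 -c "import tensorflow as tf; print(tf.__version__)"
python3 convert_trt.py --weights ./checkpoints/yolov4.tf --quantize_mode float16 --output ./checkpoints/yolov4-trt-fp16-416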

Thanks.