"Incompatible with expected resource" error when converting a TensorFlow model to TensorRT

Environment

TensorRT Version: 8.0.1
CUDA Version: 10.2
Operating System + Version: Jetson NX (JetPack 4.6)
TensorFlow Version (if applicable): 2.5.0

I followed the steps in this repo:

and ran the following commands:

python3 save_model.py --weights ./data/custom.weights --output ./checkpoints/custom.tf --input_size 416 --model yolov4
python3 convert_trt.py --weights ./checkpoints/custom.tf --quantize_mode float16 --output ./checkpoints/custom-trt-fp16-416
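For context, my understanding is that convert_trt.py wraps the standard TF-TRT TrtGraphConverterV2 API. A minimal equivalent sketch (precision mode and paths taken from the command above; the remaining parameters are left at their defaults and are assumptions on my part) would look like this:

import tensorflow as tf
from tensorflow.python.compiler.tensorrt import trt_convert as trt

# Minimal TF-TRT conversion sketch (TF 2.5): convert the SavedModel exported
# by save_model.py to a TF-TRT FP16 SavedModel.
params = trt.TrtConversionParams(precision_mode=trt.TrtPrecisionMode.FP16)
converter = trt.TrtGraphConverterV2(
    input_saved_model_dir='./checkpoints/custom.tf',
    conversion_params=params)
converter.convert()   # the ValueError below is raised during this step
converter.save('./checkpoints/custom-trt-fp16-416')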

but got the error:

ValueError: Input 0 of node StatefulPartitionedCall/model/batch_normalization/AssignNewValue was passed float from Func/StatefulPartitionedCall/input/_4:0 incompatible with expected resource.

Any idea on this issue?
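For reference, one way to isolate whether the problem lies in the exported SavedModel or only in the TF-TRT conversion step is to load and run the SavedModel directly. A minimal sketch (the 'serving_default' signature name and the 1x416x416x3 float input are assumptions based on YOLOv4 at input size 416):

import numpy as np
import tensorflow as tf

# Load the SavedModel produced by save_model.py and run one dummy inference.
model = tf.saved_model.load('./checkpoints/custom.tf')
infer = model.signatures['serving_default']   # assumed signature name

# Assumed input: one 416x416 RGB image, NHWC, float32 in [0, 1].
dummy = tf.constant(np.random.rand(1, 416, 416, 3).astype(np.float32))
input_name = list(infer.structured_input_signature[1].keys())[0]
outputs = infer(**{input_name: dummy})
print({k: v.shape for k, v in outputs.items()})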

Hi,
We recommend checking the TF-TRT sample links below for TF-TRT integration issues.

If the issue persists, we recommend reaching out to the TensorFlow forum.
Thanks!