@spolisetty @NVES I tried this repo to convert to onnx model.
By default it uses a batch size of 64, so it takes 1.5 - 2 hours to build the engine, and inference runs at roughly 0.7 FPS (far slower than expected).
Normally it should use batch size 1, but that is not working properly.
I changed the batch size to 1 in the config file, regenerated the ONNX model, and then used trtexec to create an engine.
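For reference, the trtexec invocation I used was along these lines (the file names and input tensor name/shape here are placeholders, not the exact ones from my setup):

```shell
# Build a TensorRT engine from the batch-size-1 ONNX model.
# "input" and 1x3x608x608 are placeholders for the actual
# input tensor name and shape of the network.
trtexec --onnx=model_bs1.onnx \
        --saveEngine=model_bs1.engine \
        --shapes=input:1x3x608x608
```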
Then I ran inference with the script given above.
But I get too many boxes, and they are far too small (the labels and probabilities are correct; only the box dimensions are wrong). The boxes are not around the objects at all; instead they are tiny and scattered across the input image.
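In case it helps narrow things down: my guess is that the issue is in how my script rescales the box coordinates. A minimal sketch of what I believe the rescaling should look like (the function name is mine, and I am assuming the network outputs normalized [0, 1] corner coordinates):

```python
def scale_boxes(boxes, img_w, img_h):
    """Map normalized [0, 1] (x1, y1, x2, y2) boxes back to pixel
    coordinates of the original image. Assumes the network emits
    normalized corner coordinates; if it emits coordinates in the
    network input resolution instead, you must divide by that
    resolution first."""
    return [(x1 * img_w, y1 * img_h, x2 * img_w, y2 * img_h)
            for (x1, y1, x2, y2) in boxes]

# Example: a box covering the center half of a 640x480 image.
print(scale_boxes([(0.25, 0.25, 0.75, 0.75)], 640, 480))
# → [(160.0, 120.0, 480.0, 360.0)]
```

If the engine's output layout changed when I switched from batch size 64 to 1 (e.g. a flattened vs. batched output tensor), a mismatch like this would explain tiny, misplaced boxes while leaving labels and scores correct.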
Is something wrong with the calculation part, the engine creation, or the ONNX conversion with batch size 1?
Could you provide the trtexec command to create the engine, and the ONNX conversion script required for batch size 1?