I have a custom AI model that does image classification, and it runs fine on a Jetson Nano B01 board. The issue I am facing: with an input image size of 512x512, it takes almost 3 seconds to load an image and run inference on it. I am using the TFLite framework to load the image and the model file and to run inference.
I would like to know whether there is a better way to improve the inference time. My target is to process around 30 images within a 6-second timeframe.
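For context, here is a small sketch of what that target implies (the 3-second figure is the per-image time quoted above; the rest is simple arithmetic):

```python
# Target: 30 images processed within a 6-second window.
num_images = 30
window_ms = 6_000.0

# Per-image time budget implied by the target.
per_image_budget_ms = window_ms / num_images  # 200 ms per image

# Current per-image time (load + inference) is ~3 seconds.
current_per_image_ms = 3_000.0
speedup_needed = current_per_image_ms / per_image_budget_ms  # ~15x faster

print(per_image_budget_ms, speedup_needed)
```

So roughly a 15x speedup is needed, which suggests looking at both the image-loading path and the inference itself rather than only one of them.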
Thank you in advance.