Faster inference on Jetson Nano B01 board

Hello,

I have a custom AI model that performs image classification. It runs fine on a Jetson Nano B01 board. The issue I am facing is that, with a 512x512 input image, it takes almost 3 seconds to load each image and run inference. I am using the TFLite framework to load the image and the model file and to run inference.

Is there a better way to improve the inference time? I would like to process around 30 images within a 6-second timeframe.

Thank you in advance.
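For reference, the target above implies roughly a fifteen-fold speedup over the current per-image time; a quick sanity check (all numbers taken from the post itself):

```python
# Back-of-the-envelope check of the stated requirement.
# 3 s per image now, 30 images in 6 s as the target (figures from the post above).

current_per_image_s = 3.0        # current load + inference time per image
target_images = 30               # images to process
target_window_s = 6.0            # allowed total time

per_image_budget_s = target_window_s / target_images   # time allowed per image
required_speedup = current_per_image_s / per_image_budget_s

print(f"Per-image budget: {per_image_budget_s:.2f} s")
print(f"Required speedup: {required_speedup:.0f}x")
```

A ~15x speedup is usually beyond what minor code tweaks can deliver, which is why a faster runtime (such as TensorRT, suggested below in the thread) is the typical recommendation.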

Hi,

It’s recommended to convert the model to TensorRT for acceleration.
Please find a sample below:
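For illustration, a minimal sketch of one common conversion route on JetPack: export the model to ONNX with tf2onnx, then build an FP16 TensorRT engine with trtexec. The file and directory names here are assumptions for the sketch, not taken from the original sample:

```shell
# Export the TensorFlow SavedModel (assumed directory name) to ONNX.
# Requires the tf2onnx package: pip3 install tf2onnx
python3 -m tf2onnx.convert \
    --saved-model my_classifier_savedmodel \
    --output model.onnx

# Build a serialized TensorRT engine with FP16 precision.
# trtexec ships with TensorRT at this path on standard JetPack installs.
/usr/src/tensorrt/bin/trtexec \
    --onnx=model.onnx \
    --saveEngine=model_fp16.engine \
    --fp16
```

FP16 mode is generally a good fit for the Jetson Nano's GPU; the saved engine can then be loaded with the TensorRT runtime for inference instead of TFLite.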

Thanks.

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.