Running tflite models on Orin Nano

I was trying to run an int8 quantized TFLite model on the Orin Nano GPU, but realized that TFLite does not support this GPU. I converted the model to ONNX and tried to run it with TensorRT, but got an error that only signed INT8 is supported. Is there a way to run uint8 quantized TFLite models on the Orin Nano GPU? Are there other frameworks I could try?
I am benchmarking the EdgeTPU and the Orin Nano GPU, so I need the model to be exactly the same on both.
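For context on the TensorRT error: uint8 and int8 asymmetric quantization encode exactly the same real values; the raw codes and the zero point just differ by a constant shift of 128, with an unchanged scale. A minimal sketch with hypothetical scale/zero-point values (this illustrates the relationship, not a TensorRT workaround by itself):

```python
def dequant(q, scale, zero_point):
    """Affine dequantization: real = scale * (q - zero_point)."""
    return scale * (q - zero_point)

# Hypothetical uint8 quantization parameters, as a tflite model might carry.
scale, zp_u8 = 0.05, 128
q_u8 = [0, 64, 128, 200, 255]

# Shift both the codes and the zero point by 128 to get the signed-int8
# encoding TensorRT expects; the scale stays the same.
zp_i8 = zp_u8 - 128
q_i8 = [q - 128 for q in q_u8]

# The decoded real values are identical, so the shift is lossless.
for u, s in zip(q_u8, q_i8):
    assert dequant(u, scale, zp_u8) == dequant(s, scale, zp_i8)
```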

Hi,

TensorRT doesn’t support unsigned INT8 operations.
You can find the details below:

Does your model work with standard TensorFlow?
If yes, you can install TensorFlow for Jetson with the document below:

Thanks.

Thanks for your reply. Unfortunately, we are using a quantized model and therefore need TFLite. Do you suggest using ONNX Runtime with CUDA?

Hi,

If your model can run with ONNX Runtime, you can give it a try.
The package built for Jetson can be found at the link below:

https://www.elinux.org/Jetson_Zoo#ONNX_Runtime
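With the converted model and the Jetson ONNX Runtime wheel installed, inference can be sketched like this. The model path and uint8 input dtype are assumptions, and the import is guarded so the snippet degrades gracefully on a machine without the wheel:

```python
def concrete_shape(shape):
    # Replace symbolic/dynamic dimensions (None or strings) with 1.
    return [d if isinstance(d, int) else 1 for d in shape]

try:
    import numpy as np
    import onnxruntime as ort  # Jetson wheel from the link above

    # "model.onnx" is a hypothetical path for the converted tflite model.
    sess = ort.InferenceSession(
        "model.onnx",
        providers=["CUDAExecutionProvider", "CPUExecutionProvider"],
    )
    inp = sess.get_inputs()[0]
    # Dummy uint8 input matching the quantized model's expected shape.
    x = np.zeros(concrete_shape(inp.shape), dtype=np.uint8)
    outputs = sess.run(None, {inp.name: x})
except ImportError:
    pass  # onnxruntime/numpy not installed on this machine
```

Listing `CPUExecutionProvider` after `CUDAExecutionProvider` lets ONNX Runtime fall back to the CPU for any operators the CUDA provider does not cover.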

Thanks.

Thanks, that worked.
