Where can I find a project where the GPU inside a mobile device has been optimised using TensorRT?


I would like to know if there are any projects where the GPUs inside tablets or cellphones have been optimised using TensorRT.

Thank you

Hi @Aizzaac, I don’t believe there are tablets or cellphones on the market that have a CUDA-capable GPU, hence they would not be using TensorRT.


But is it possible to do inference/prediction on those mobile devices at all? Maybe not optimised for speed, but at least functional?

I’m not familiar with the current state of ML framework support (e.g. PyTorch, TensorFlow, etc.) on Android/iOS, but in theory yes, you could run inference through mobile runtimes of the frameworks the models were trained in, such as TensorFlow Lite or PyTorch Mobile.
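As a rough illustration of that idea, here is a minimal sketch of the TensorFlow Lite path: a trivial Keras model is converted to the `.tflite` format and then executed with the TFLite interpreter. The model itself is a placeholder for illustration only; on a real phone the converted file would be bundled with the app and run via the Android/iOS TFLite runtime rather than the Python interpreter shown here.

```python
# Sketch: converting a model to TensorFlow Lite and running inference with
# the TFLite interpreter. The tiny Dense model is purely illustrative.
import numpy as np
import tensorflow as tf

# Build a trivial model and convert it to the .tflite flatbuffer format.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# On-device, the app would load the bundled .tflite file; here we feed the
# converted bytes straight into the interpreter.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

interpreter.set_tensor(inp["index"], np.zeros((1, 4), dtype=np.float32))
interpreter.invoke()
result = interpreter.get_tensor(out["index"])
print(result.shape)
```

On devices with supported accelerators, TFLite can additionally offload work via delegates (e.g. the GPU delegate or NNAPI on Android), which plays a role loosely analogous to what TensorRT does on CUDA-capable hardware.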
