Is the Tegra chip series inside mobile devices the same as the chip inside the TX2?

Hi

Ok, I have just learnt that there are Nvidia GPUs in cellphones.
To optimise models on those GPUs, can I still use TensorRT, or does TFLite work there?
Can I also use CUDA or OpenCL?
How are those chips different from the one inside the TX2?

Thank you

Hi @Aizzaac, I believe it has been some years since a Tegra chip was deployed in a mobile phone (back before Tegra was CUDA-capable). Do you know which phone(s) you are referring to?

Ok. These two are a bit old:

  • Tablet: HTC Nexus 9
  • Mobile: LG G2 mini LTE

But in 2020, which cellphones can be optimised/accelerated using TensorRT, CUDA/OpenCL, etc.?
Can Nvidia embedded GPUs (the ones in cellphones) use TFLite to optimise inference?

Thank you

No, they are not supported.
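
For what it's worth, if the goal is simply GPU-accelerated inference on an Android phone, the usual route is TFLite's GPU delegate, which runs on OpenGL ES/OpenCL rather than CUDA or TensorRT, so it does not require an NVIDIA GPU at all. A minimal Kotlin sketch, assuming the tensorflow-lite and tensorflow-lite-gpu Gradle dependencies and a model file you supply yourself:

```kotlin
// Illustrative sketch only: try TFLite's GPU delegate if the device supports it,
// otherwise fall back to multi-threaded CPU inference.
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.gpu.CompatibilityList
import org.tensorflow.lite.gpu.GpuDelegate
import java.io.File

fun buildInterpreter(modelFile: File): Interpreter {
    val options = Interpreter.Options()
    val compatList = CompatibilityList()
    if (compatList.isDelegateSupportedOnThisDevice) {
        // Run supported ops on the device GPU (OpenGL ES/OpenCL backend);
        // unsupported ops automatically fall back to the CPU.
        options.addDelegate(GpuDelegate(compatList.bestOptionsForThisDevice))
    } else {
        // No compatible GPU backend: use multi-threaded CPU inference instead.
        options.setNumThreads(4)
    }
    return Interpreter(modelFile, options)
}
```

As far as I know, TensorRT itself is only available on CUDA-capable platforms such as the Jetson family (including the TX2), not on Android handsets.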