iGPU (Tegra) on PX2: support for TensorRT INT8 (TensorRT 3.0.2)

Hi,

I have installed the latest version of DriveWorks (5.0.5.0a, February 12, 2018). TensorRT 3.0.2 works fine with the dGPU (CUDA_VISIBLE_DEVICES=0), but reports "hardware without native INT8 support" when running on the iGPU (CUDA_VISIBLE_DEVICES=1). My code runs correctly on the iGPU, but it is too slow: 48 ms/image on the iGPU compared to 6.6 ms/image on the dGPU.

I am wondering whether the speed issue is because the iGPU does not support TensorRT 3.0.2 INT8, or whether there is some other issue.

Thanks,
Lei

Dear lshi,
The iGPU supports FP32 and FP16; the dGPU supports FP32 and INT8.
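One way to confirm this at runtime is to query the builder's platform-capability flags before choosing a precision mode. Below is a minimal sketch against the TensorRT 3 C++ API; the network definition, calibrator, and engine build are omitted, and the logger class is just the standard boilerplate the builder requires:

```cpp
#include <iostream>
#include "NvInfer.h"

// Minimal logger implementation required by the TensorRT builder API.
class Logger : public nvinfer1::ILogger
{
    void log(Severity severity, const char* msg) override
    {
        if (severity != Severity::kINFO)
            std::cout << msg << std::endl;
    }
} gLogger;

int main()
{
    nvinfer1::IBuilder* builder = nvinfer1::createInferBuilder(gLogger);

    // On the PX2 these flags reflect whichever GPU was selected via
    // CUDA_VISIBLE_DEVICES: the discrete Pascal dGPU reports fast INT8,
    // while the Tegra iGPU reports fast FP16 only.
    if (builder->platformHasFastInt8())
    {
        builder->setInt8Mode(true);   // also requires an INT8 calibrator
    }
    else if (builder->platformHasFastFp16())
    {
        builder->setHalf2Mode(true);  // TensorRT 3 name for FP16 mode
    }
    // Otherwise fall back to the default FP32 build.

    builder->destroy();
    return 0;
}
```

This way the same binary can run on either GPU and silently pick the fastest precision the hardware actually supports, instead of failing with the "hardware without native INT8 support" warning.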

Hi Siva,

Thanks for replying. I am wondering whether there is a plan for the iGPU to support INT8 in a future TensorRT release. If so, when do you think that will happen?

Thanks,
Lei

Hi lshi,
TensorRT does not support INT8 on the iGPU, as the iGPU hardware has no native INT8 support.

@SivaRamaKrishna Are there any samples for int8 inference using a .uff file on PX2?