I updated my TX2 to TensorRT 2.1 and want to test the Int8 inference improvement, but when I run sampleInt8 I get errors I have never seen before:
Int8 support requested on hardware without native Int8 support, performance will be negatively affected.
ERROR LAUNCHING INT8-to-INT8 GEMM: 8