TensorRT INT8 with CPU only

If I use TensorRT INT8 optimization for TensorFlow models, will I get enhanced inference speed if I am using only a CPU?

Thanks

Hello,

TensorRT is designed to take advantage of specific NVIDIA GPU hardware and its associated software stack, so it will not accelerate inference on a CPU-only system. The generated engine files are not portable across platforms or TensorRT versions, and they are specific to the exact GPU model they were built on.
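
For reference, the usual TF-TRT INT8 conversion flow looks roughly like the sketch below. It assumes a TensorFlow 2.x installation built with TensorRT support, a SavedModel at the placeholder path `saved_model_dir`, and a visible NVIDIA GPU; the calibration data and shapes are illustrative only. On a CPU-only machine the GPU check fails and the model would simply fall back to plain TensorFlow execution.

```python
# Sketch of a TF-TRT INT8 conversion (TensorFlow 2.x with TensorRT support).
# Paths, input shapes, and calibration data are placeholders.
import numpy as np
import tensorflow as tf
from tensorflow.python.compiler.tensorrt import trt_convert as trt

# TensorRT only engages when an NVIDIA GPU is visible; on a CPU-only
# machine this list is empty and no TensorRT acceleration is possible.
if not tf.config.list_physical_devices("GPU"):
    raise RuntimeError("No NVIDIA GPU found - TF-TRT will not accelerate inference here.")

params = trt.TrtConversionParams(precision_mode=trt.TrtPrecisionMode.INT8)
converter = trt.TrtGraphConverterV2(
    input_saved_model_dir="saved_model_dir",  # placeholder SavedModel path
    conversion_params=params,
)

# INT8 needs representative calibration batches matching the model's input signature.
def calibration_input_fn():
    for _ in range(8):
        yield (np.random.random((1, 224, 224, 3)).astype(np.float32),)

converter.convert(calibration_input_fn=calibration_input_fn)
# The saved engines are tied to the GPU model and TensorRT version used to build them.
converter.save("saved_model_trt_int8")
```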

Regards,
NVIDIA Enterprise Support