Fast INT8 Inference for Autonomous Vehicles with TensorRT 3


Autonomous driving demands safety and a high-performance computing solution to process sensor data with extreme accuracy. Researchers and developers creating deep neural networks (DNNs) for self-driving must optimize their networks to ensure low-latency inference and energy efficiency. Thanks to a new Python API in NVIDIA TensorRT, this process just became easier. TensorRT optimizes trained…
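To make the idea concrete, here is a minimal sketch of building an INT8-optimized engine with the TensorRT Python API. It is not taken from the article: it uses the current `tensorrt` Python package (whose classes and calls differ from the TensorRT 3-era API described here), parses an ONNX model rather than the Caffe/UFF formats TensorRT 3 supported, and the file names and the `MyCalibrator` class are illustrative assumptions.

```python
# Sketch: build an INT8 inference engine with the TensorRT Python API.
# Assumptions: "model.onnx" is a trained network exported to ONNX, and
# MyCalibrator (not shown) implements trt.IInt8EntropyCalibrator2 to feed
# representative sample data for INT8 calibration.
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

# Parse the trained model into a TensorRT network definition.
with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        raise RuntimeError(parser.get_error(0))

config = builder.create_builder_config()
config.set_flag(trt.BuilderFlag.INT8)        # request INT8 precision
# config.int8_calibrator = MyCalibrator()    # supplies calibration batches

# Build and serialize the optimized engine for deployment.
serialized_engine = builder.build_serialized_network(network, config)
with open("model.plan", "wb") as f:
    f.write(serialized_engine)
```

The key step for INT8 is the builder flag plus a calibrator: TensorRT runs representative data through the network to choose per-tensor scaling factors, so the quantized engine keeps accuracy while gaining the latency and energy benefits of 8-bit math.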