MLPerf Inference: NVIDIA Innovations Bring Leading Performance

Originally published at: https://developer.nvidia.com/blog/nvidia-mlperf-v05-ai-inference/

New TensorRT 6 Features Combine with Open-Source Plugins to Further Accelerate Inference

Inference is where AI goes to work. Identifying diseases. Answering questions. Recommending products and services. The inference market is also diffuse, and inference will happen everywhere from the data center to the edge to IoT devices, across multiple use cases including image, speech, and recommender systems…