Inference Benchmarks - TensorRT Version?

Hi,

The above link contains the inference latency and throughput numbers. Could you please provide the TensorRT version used while recording these metrics?

Could you please also add metadata (tool versions, etc.) along with the silicon and model details to these published numbers? It would be a great help.

Cheers!

Hello, this page may contain more of the details you are looking for.
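In the meantime, if it helps as a point of comparison, one quick way to check the TensorRT version in your own environment is to query the Python bindings (this assumes the `tensorrt` Python package is installed; it does not tell you which version was used for the published benchmarks):

```python
# Print the TensorRT version installed in the local environment
# (assumes the `tensorrt` Python package is available).
import tensorrt as trt

print(trt.__version__)
```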