Hi! I am trying to transfer a TensorRT model from a server to my own computer, and I don't know whether my computer's graphics card will support the inference process.
The model was built on the server with mixed precision enabled (FP16 and INT8 modes). So I wonder what the minimum graphics-card requirement is for running this mixed-precision model for inference (the server uses a Titan V).
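As context for the question, TensorRT exposes per-platform flags (`Builder.platform_has_fast_fp16` / `Builder.platform_has_fast_int8` in the Python API) that report whether the local GPU has fast FP16/INT8 paths. Since those only run on the target machine, here is a hedged, illustrative sketch of the same check based on CUDA compute capability alone; the capability-to-feature mapping below is my own approximation, not TensorRT's exact logic:

```python
# Rough sketch: map CUDA compute capability to "fast" FP16 / INT8 support.
# On an actual machine, the authoritative check is:
#   trt.Builder(logger).platform_has_fast_fp16 / .platform_has_fast_int8

def has_fast_fp16(cc):
    """Native half-precision arithmetic: cc 5.3, 6.0, 6.2, and 7.0+."""
    return cc in ((5, 3), (6, 0), (6, 2)) or cc >= (7, 0)

def has_fast_int8(cc):
    """DP4A int8 dot-product instructions: cc 6.1 and above."""
    return cc >= (6, 1)

# Titan V is compute capability 7.0 (Volta); GTX 1050 is 6.1 (Pascal).
for name, cc in [("Titan V", (7, 0)), ("GTX 1050", (6, 1))]:
    print(f"{name} (sm_{cc[0]}{cc[1]}): "
          f"fast FP16={has_fast_fp16(cc)}, fast INT8={has_fast_int8(cc)}")
```

Under this approximation, a GTX 1050 (cc 6.1) would report fast INT8 (via DP4A) but not fast FP16, while the Titan V (cc 7.0, with Tensor Cores) reports both.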
TensorRT Version: 184.108.40.206
GPU Type: server: Titan V; PC: GTX 1050
Nvidia Driver Version: 450.51.05
CUDA Version: 11.0
CUDNN Version: 8.0.4
Operating System + Version: Ubuntu 18.04
Python Version (if applicable): 3.7
TensorFlow Version (if applicable): 1.14
PyTorch Version (if applicable):
Baremetal or Container (if container which image + tag):