Hello,
We currently have an RTX A5000 GPU, which has 8192 CUDA cores. We are investigating purchasing an A100 GPU (80 GB) for NLP inference.
The A100 datasheet specifies only Tensor Cores and says nothing about CUDA cores. Does that mean the A100 has only Tensor Cores? Is our understanding correct?
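For what it's worth, if you have access to any CUDA-capable machine you can query the SM count and compute capability directly with the runtime API; the number of FP32 CUDA cores is then SMs × cores-per-SM for that architecture (this sketch assumes device 0 and ignores error checking for brevity):

```cuda
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    cudaDeviceProp prop;
    cudaGetDeviceProperties(&prop, 0);
    // multiProcessorCount is the number of SMs on the device.
    // Total FP32 CUDA cores = SMs x FP32 cores per SM, where the
    // per-SM count depends on the compute capability (see the
    // CUDA C++ Programming Guide's compute-capability tables).
    printf("SMs: %d, compute capability %d.%d\n",
           prop.multiProcessorCount, prop.major, prop.minor);
    return 0;
}
```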
Also, is there any NVIDIA document that describes the respective advantages of CUDA cores and Tensor Cores, and when to choose which?