Is it necessary to install cuDNN after installing the Cuda Toolkit?
I am a Linux user. My Python version is 3.12.0. My video card is an RTX 3070 Ti. I finished installing CUDA 12.2 using this step-by-step guide *LINK. Is it necessary to install cuDNN as well? What about TensorRT? Or do I just need to follow this step-by-step guide *LINK?
If you only need to use CUDA, it's not necessary. But if you want to use TensorFlow, PyTorch, and/or many other deep learning (DL) frameworks, you also need to install cuDNN; it is not included in the CUDA toolkit install. Furthermore, most major DL frameworks call into cuDNN rather than working purely/directly with CUDA.
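As a quick sanity check (a minimal sketch, assuming a CUDA-enabled PyTorch build is already installed in your environment, which is not covered in this thread), you can ask the framework whether it sees CUDA and which cuDNN version it loaded:

```python
# Sketch: verify that PyTorch can see the GPU and cuDNN.
# Assumes a GPU build of PyTorch is already installed (install steps not shown here).
import torch

print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))   # e.g. an RTX 3070 Ti

print("cuDNN enabled:", torch.backends.cudnn.is_available())
print("cuDNN version:", torch.backends.cudnn.version())  # None if cuDNN was not found
```

If `cuDNN version` prints `None`, the framework found CUDA but not cuDNN, which is exactly the situation the separate cuDNN install fixes.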
TensorRT is used to accelerate DL inference. It is also not installed by the CUDA toolkit installer. If your inference pipeline uses TensorRT, you will need to install it separately from the CUDA toolkit.
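If you do end up installing TensorRT, a minimal check (assuming its Python bindings are installed as well, which this thread does not cover) is to import it and build a trivial object so the native libraries are forced to load:

```python
# Sketch: confirm the TensorRT Python bindings can be loaded.
# Assumes TensorRT and its Python package are already installed (not shown here).
import tensorrt as trt

print("TensorRT version:", trt.__version__)

# Creating a Builder loads the native TensorRT libraries, so a broken install fails early.
logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
print("Builder created OK:", builder is not None)
```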
I will be using TensorFlow, so I need to install cuDNN. Could you give me a link to a step-by-step guide for installing it? I am using Anaconda and Jupyter Notebook.
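Whichever guide you follow, a quick way to confirm the result from a Jupyter notebook (a sketch, assuming a GPU-enabled TensorFlow build is already installed in the conda environment; the install commands themselves are not shown in this thread) is to list the visible GPUs and print the CUDA/cuDNN versions TensorFlow was built against:

```python
# Sketch: run in a notebook cell after installing a GPU-enabled TensorFlow build
# inside your conda environment (install steps assumed, not shown here).
import tensorflow as tf

print("TensorFlow:", tf.__version__)
print("Visible GPUs:", tf.config.list_physical_devices("GPU"))

# Build info reports the CUDA/cuDNN versions this TensorFlow binary expects.
info = tf.sysconfig.get_build_info()
print("Built for CUDA:", info.get("cuda_version"))
print("Built for cuDNN:", info.get("cudnn_version"))
```

An empty GPU list usually means the installed cuDNN/CUDA versions do not match what the TensorFlow binary was built for, so comparing these printed versions against your installed ones is a useful first debugging step.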