Hello all,
I am in a situation where I need to build a Docker image that includes the following:
- CUDA driver
- cuDNN
- NCCL
This base image will serve as the basis for other Docker images, in which we will install both CUDA-enabled and CPU-only (non-CUDA) versions of TensorFlow. Our environment is heterogeneous, and there is no guarantee that our pods/nodes will have CUDA capabilities. So we want our ML compute nodes to transparently take advantage of CUDA if the pod happens to be running on CUDA-enabled hardware (see the sketch below for the kind of check I have in mind).
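To make the "transparent" part concrete, here is a minimal sketch of the startup check our ML code would run, assuming TensorFlow 2.x inside the container (the helper name gpu_available is just for illustration):

```python
import tensorflow as tf

def gpu_available() -> bool:
    """Return True if TensorFlow can see at least one CUDA-capable GPU."""
    return len(tf.config.list_physical_devices("GPU")) > 0

if __name__ == "__main__":
    if gpu_available():
        print("CUDA GPU detected; TensorFlow will place ops on the GPU by default.")
    else:
        print("No GPU detected; running on CPU only.")
```

The idea is that the same image runs this code unchanged on both GPU and CPU-only nodes, and TensorFlow decides the device placement at runtime.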
I am able to build this base Docker image correctly on my Linux machine, which has a GPU. I was wondering whether I can also build the Docker image on a machine that doesn't have a GPU.
Thanks in advance,
Prashanth