Difference between CUDA container and CUDA toolkit

I am just starting to use a DGX Station and am learning how to use Docker containers. I notice that the NVIDIA driver (with CUDA support) is already installed by default, but the CUDA container is not. I would like to know what the difference between them is, and whether I need to run the CUDA container every time I want to access the GPUs with

docker run --gpus all nvidia/cuda:10.0-base nvidia-smi

For example, if I want to run TensorFlow with GPU support, do I need to run this command first before starting the TensorFlow container?
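To make the question concrete, this is the kind of command I have in mind (the image tag here is just an example, not something I have confirmed works on the DGX):

```shell
# Assumed example: run the official TensorFlow GPU image directly,
# without starting the nvidia/cuda container beforehand.
# The question is whether this works on its own.
docker run --gpus all --rm tensorflow/tensorflow:latest-gpu \
    python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"
```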

Thank you very much!