Hi all,
I was testing nvidia-docker on a TX2 with JetPack v4.3. My final goal is to run the Triton inference client on this device.
So as a small test, I ran the command below.
Command:
docker run --rm --gpus all nvcr.io/nvidia/tensorrtserver:19.10-py3 nvidia-smi
Output:
docker: Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?.
See 'docker run --help'.
Are there any suggestions about this issue?
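In case it helps narrow things down, the next thing I plan to check (just a sketch, assuming systemd on Ubuntu 18.04) is whether the daemon's Unix socket actually exists:

```shell
# Quick check: is the Docker daemon reachable via its Unix socket?
if [ -S /var/run/docker.sock ]; then
    echo "socket present: daemon appears to be running"
else
    echo "socket missing: daemon is likely not running"
fi

# If the socket is missing, I would try starting the service
# (assuming systemd on Ubuntu 18.04):
#   sudo systemctl start docker
#   sudo systemctl enable docker
```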
Thank you!
BR,
Chieh
Env Info
- JetPack v4.3
- TX2 device
- Ubuntu version: 18.04
- Python 3 version: 3.6.9
- TensorFlow version: 1.15
- TensorRT version: 6.0.1.10
- CUDA version: 10.0.326
- cuDNN version: 7.6.3
- Docker version: 18.09.7, build 2d0083d