Docker issue

Hi all,

I was testing nvidia-docker on TX2 with JetPack v4.3. My final goal is to implement Triton inference client on this device.
As a small test, I ran the command below.

Command:

docker run --rm --gpus all nvcr.io/nvidia/tensorrtserver:19.10-py3 nvidia-smi

Output:

docker: Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?.
See 'docker run --help'.
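Note that this particular error is about reaching the Docker daemon at all, not about the image itself. On a systemd-based setup like Ubuntu 18.04, a quick check might look like this (a sketch; the exact service state and group setup depend on how Docker was installed):

```shell
# Check whether the Docker daemon is running (systemd-based systems)
sudo systemctl status docker

# If it is inactive, start it and enable it at boot
sudo systemctl start docker
sudo systemctl enable docker

# Optional: allow the current user to talk to the Docker socket
# without sudo (takes effect after logging out and back in)
sudo usermod -aG docker $USER
```

If `docker run` still cannot connect after the service is active, running the command with `sudo` is a quick way to rule out a socket-permission problem.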

Are there any suggestions about this issue?

Thank you!

BR,
Chieh

Env Info

  • JetPack v4.3
  • TX2 device
  • Ubuntu version: 18.04
  • Python3 version: 3.6.9
  • TensorFlow version: 1.15
  • TensorRT version: 6.0.1.10
  • CUDA version: 10.0.326
  • cuDNN version: 7.6.3
  • docker version: Docker version 18.09.7, build 2d0083d

Hi,

This image doesn’t support the TX2 environment.
Currently, we only support l4t-based containers for Jetson devices:
https://ngc.nvidia.com/catalog/containers?orderBy=modifiedDESC&pageNumber=0&query=l4t&quickFilter=containers&filters=
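For reference, once the daemon is running on a Jetson device, an l4t-based image can be launched roughly as follows. The `r32.3.1` tag below is an assumption matching the L4T release shipped with JetPack 4.3; please verify the current tag in the NGC catalog.

```shell
# Run the L4T base container with the NVIDIA container runtime on Jetson.
# The tag r32.3.1 is assumed to correspond to JetPack 4.3; check NGC for
# the tag that matches your installed L4T version.
sudo docker run -it --rm --net=host --runtime nvidia nvcr.io/nvidia/l4t-base:r32.3.1
```

Note that Jetson devices use `--runtime nvidia` rather than the `--gpus all` flag used on x86 hosts.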

Thanks.

Hi AastaLLL,

I see!! Thanks for your information!!

BR,
Chieh