Failed to run TensorRT Docker image on Jetson Nano

% sudo nvidia-docker run -it --rm nvcr.io/nvidia/tensorrt:20.03-py3
[sudo] password for ME:
docker: Error response from daemon: OCI runtime create failed: container_linux.go:345: starting container process caused "process_linux.go:430: container init caused \"process_linux.go:413: running prestart hook 1 caused \\\"error running hook: exit status 1, stdout: , stderr: exec command: [/usr/bin/nvidia-container-cli --load-kmods configure --ldconfig=@/sbin/ldconfig.real --device=all --compute --utility --video --require=cuda>=9.0 --pid=22211 /var/lib/docker/overlay2/d1e1dbbc8221597457a4a6b8b9396d4c99941a67b32202726c6f9b2e915ed5ef/merged]\\\\nnvidia-container-cli: mount error: mount operation failed: /usr/src/tensorrt: no such file or directory\\\\n\\\"\"": unknown.

Actually, I do have /usr/src/tensorrt on the Jetson Nano host.

% ls -ld /usr/src/tensorrt
drwxr-xr-x 5 root root 4096 Apr 29 19:53 /usr/src/tensorrt

I’m using nv-jetson-nano-sd-card-image-r32.4.2 as the OS image, and the nvidia-docker version is:

% sudo nvidia-docker version
NVIDIA Docker: 2.0.3
Client:
 Version:           18.09.7
 API version:       1.39
 Go version:        go1.10.1
 Git commit:        2d0083d
 Built:             Fri Aug 16 14:20:24 2019
 OS/Arch:           linux/arm64
 Experimental:      false

Server:
 Engine:
  Version:          18.09.7
  API version:      1.39 (minimum version 1.12)
  Go version:       go1.10.1
  Git commit:       2d0083d
  Built:            Wed Aug 14 19:41:23 2019
  OS/Arch:          linux/arm64
  Experimental:     false

Any help would be appreciated. Thank you.

Hi,

Please note that not all of the containers on NGC can be used on the Jetson platform.
The container you shared is for desktop users.
Please check this page for the images that support Jetson:

Thanks

Oh, I see.

BTW, I wanted to run the yolov3_onnx sample in Docker on the Jetson Nano and succeeded with the following commands.

On the Jetson Nano host:

# Copy the TensorRT Python samples to a writable working directory
cp -r /usr/src/tensorrt/samples/python/ tensorrt-samples
cd tensorrt-samples
# Start l4t-base with the samples and the host's TensorRT Python bindings bind-mounted in
sudo nvidia-docker run -it --rm -v $PWD:/tensorrt-samples -v /usr/lib/python3.6/dist-packages/tensorrt:/usr/lib/python3.6/dist-packages/tensorrt nvcr.io/nvidia/l4t-base:r32.4.2 bash

In the l4t-base container:

cd /tensorrt-samples/yolov3_onnx

apt update
apt install python3-pip cmake protobuf-compiler libprotoc-dev libopenblas-dev gfortran libjpeg8-dev libxslt1-dev libfreetype6-dev

pip3 install cython
pip3 install -r requirements.txt
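
Before running the two scripts, a quick sanity check can confirm that the host's TensorRT Python bindings mounted above are importable inside the container (a minimal sketch; it assumes python3 was pulled in by the python3-pip install above):

# Should print the TensorRT version that ships with the host's JetPack
python3 -c "import tensorrt; print(tensorrt.__version__)"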

# Convert the Darknet YOLOv3 model to ONNX, then build a TensorRT engine and run inference
python3 yolov3_to_onnx.py
python3 onnx_to_tensorrt.py
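
If both scripts finish successfully, the converted model files should appear in the sample directory (the file names below are the sample's defaults, so treat them as an assumption):

# Expected outputs: the exported ONNX model and the serialized TensorRT engine
ls -l yolov3.onnx yolov3.trt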

Thank you for your help.

Hi,

You can find a container with TensorRT installed.
In general, we recommend ‘nvcr.io/nvidia/deepstream-l4t:5.0-dp-20.04-base’ since it contains most of our SDK.
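For example, it can be started the same way as the l4t-base image above (a minimal sketch; it assumes the nvidia runtime from your nvidia-docker install):

# Pull and start the DeepStream-l4t container
sudo nvidia-docker run -it --rm nvcr.io/nvidia/deepstream-l4t:5.0-dp-20.04-base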

Thanks.