I found the TensorRT docker image on NGC, v21.12-py3, which supports two platforms (amd64 and arm64). So I tried to pull and run it on my AGX device.
If I `docker run` with `--gpus all`, it fails:
```
nvidia@AGX-00044bcc0f04:~$ docker run --gpus all -it nvcr.io/nvidia/tensorrt:21.12-py3 bash
docker: Error response from daemon: OCI runtime create failed: container_linux.go:380: starting container process caused: process_linux.go:545: container init caused: Running hook #0:: error running hook: exit status 1, stdout: , stderr: exec command: [/usr/bin/nvidia-container-cli --load-kmods configure --ldconfig=@/sbin/ldconfig.real --device=all --compute --utility --video --require=cuda>=9.0 --pid=29265 /var/lib/docker/overlay2/d3135262cb32659066b189443a9169ff227807300914a0bcee15be2cc2c0d6dc/merged]
nvidia-container-cli: mount error: mount operation failed: /usr/src/tensorrt: no such file or directory: unknown.
ERRO error waiting for container: context canceled
```
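The error says the runtime could not find `/usr/src/tensorrt` on the host to mount into the container. As a sanity check, I ran something like the sketch below (the CSV directory path is my assumption about the JetPack container-runtime layout; only `/usr/src/tensorrt` comes from the error message itself):

```shell
# Sanity check: does the mount source named in the error exist on the host?
if [ -d /usr/src/tensorrt ]; then
  echo "host has /usr/src/tensorrt"
else
  echo "host is missing /usr/src/tensorrt"
fi

# On JetPack, nvidia-container-runtime decides what to mount from CSV files
# (assumed location below); listing them shows what the runtime expects
# the host to provide.
ls /etc/nvidia-container-runtime/host-files-for-container.d/ 2>/dev/null || true
```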
If I `docker run` without `--gpus`, the container starts and I can get a shell inside it (but obviously TensorRT cannot use the GPU this way):
```
nvidia@AGX-00044bcc0f04:~$ docker run -it nvcr.io/nvidia/tensorrt:21.12-py3 bash

=====================
== NVIDIA TensorRT ==
=====================

NVIDIA Release 21.12 (build 29870938)
NVIDIA TensorRT Version 8.2.1

Copyright (c) 2016-2021, NVIDIA CORPORATION & AFFILIATES. All rights reserved.
Container image Copyright (c) 2021, NVIDIA CORPORATION & AFFILIATES. All rights reserved.

https://developer.nvidia.com/tensorrt

Various files include modifications (c) NVIDIA CORPORATION & AFFILIATES. All rights reserved.

This container image and its contents are governed by the NVIDIA Deep Learning Container License.
By pulling and using the container, you accept the terms and conditions of this license:
https://developer.nvidia.com/ngc/nvidia-deep-learning-container-license

To install Python sample dependencies, run /opt/tensorrt/python/python_setup.sh

To install the open-source samples corresponding to this TensorRT release version
run /opt/tensorrt/install_opensource.sh. To build the open source parsers, plugins,
and samples for current top-of-tree on master or a different branch, run
/opt/tensorrt/install_opensource.sh -b <branch>
See https://github.com/NVIDIA/TensorRT for more information.

WARNING: The NVIDIA Driver was not detected. GPU functionality will not be available.
Use the NVIDIA Container Toolkit to start this container with GPU support; see
https://docs.nvidia.com/datacenter/cloud-native/
```
Is there any information on why this mount error happens, or on how to run this image with GPU support on Jetson?
- Device: AGX
- JetPack: v4.6
Thank you so much!!