Deepstream Docker ERROR: failed to solve: runtime not found

Please provide complete information as applicable to your setup.

• Hardware Platform: (Jetson)
• DeepStream Version: 6.2
• JetPack Version (valid for Jetson only): 5.1.2
• TensorRT Version: r8.5.2.2-runtime
• Issue Type: error

I’m trying to build the deepstream container as described in this NVIDIA guide.

I want to build it on an x86/64 Ubuntu machine, save it as a .tar image, move it to a Jetson Nano, and run it there (the overall flow I'm attempting is sketched at the end of this post). When I start the build with the command docker build --network=host -t deepstream_image:jetson . I get the error below:

 => => transferring dockerfile: 5.22kB                                                                                                                                                0.0s
 => [internal] load .dockerignore                                                                                                                                                     0.0s
 => => transferring context: 2B                                                                                                                                                       0.0s
 => ERROR [internal] load metadata for nvcr.io/nvidia/l4t-tensorrt:r8.5.2.2-runtime                                                                                                   4.0s
 => [auth] nvidia/l4t-tensorrt:pull,push token for nvcr.io                                                                                                                            0.0s
------
 > [internal] load metadata for nvcr.io/nvidia/l4t-tensorrt:r8.5.2.2-runtime:
------
Jetson_Dockerfile_Base:16
--------------------
  14 |     # Use L4T tensorrt docker listed on https://catalog.ngc.nvidia.com/orgs/nvidia/containers/l4t-tensorrt/tags
  15 |     # Use r8.5.2.2 for DS 6.2.0
  16 | >>> FROM nvcr.io/nvidia/l4t-tensorrt:r8.5.2.2-runtime
  17 |     
  18 |     #Install vpi-dev and vpi-lib
--------------------
ERROR: failed to solve: nvcr.io/nvidia/l4t-tensorrt:r8.5.2.2-runtime: nvcr.io/nvidia/l4t-tensorrt:r8.5.2.2-runtime: not found

Why does this happen, and how can I fix it?
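For reference, the overall flow I'm attempting is roughly the following (the image and file names are just the ones I'm using):

# On the x86_64 build host
docker build --network=host -t deepstream_image:jetson .
docker save deepstream_image:jetson -o deepstream_image_jetson.tar

# Copy the .tar to the Jetson Nano (e.g. with scp), then on the Nano:
docker load -i deepstream_image_jetson.tar
docker run -it --rm --runtime nvidia deepstream_image:jetson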

I think it’s a documentation error.

You can try using l4t-tensorrt:r8.5.2-runtime.

Here are the TensorRT for L4T images.

Another option: the Docker setup for DeepStream is open source; this is the link.
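For example, only the tag in the FROM line of Jetson_Dockerfile_Base needs to change, roughly like this:

# Use a tag that is actually published on NGC
FROM nvcr.io/nvidia/l4t-tensorrt:r8.5.2-runtime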

Thanks, I did what you suggested, but another error happened:

 => [internal] load .dockerignore                                                                                                                                                     0.1s
 => => transferring context: 2B                                                                                                                                                       0.0s
 => [internal] load build definition from Jetson_Dockerfile_Base                                                                                                                      0.0s
 => => transferring dockerfile: 5.29kB                                                                                                                                                0.0s
 => [internal] load metadata for nvcr.io/nvidia/l4t-tensorrt:r8.5.2-runtime                                                                                                           1.8s
 => CACHED [ 1/28] FROM nvcr.io/nvidia/l4t-tensorrt:r8.5.2-runtime@sha256:17cd2bcce55cf42018c6d76ba71bab52387b145e015e4d2aa2747b1763c82246                                            0.0s
 => CANCELED [internal] load build context                                                                                                                                            0.2s
 => => transferring context: 59.32MB                                                                                                                                                  0.2s
 => ERROR [ 2/28] RUN apt-get update &&         DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends         libnvvpi2 vpi2-dev vpi2-samples &&         rm -rf   0.2s
------
 > [ 2/28] RUN apt-get update &&         DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends         libnvvpi2 vpi2-dev vpi2-samples &&         rm -rf /var/lib/apt/lists/* &&         apt autoremove:
#0 0.141 exec /bin/sh: exec format error
------
Jetson_Dockerfile_Base:20
--------------------
  19 |     #Install vpi-dev and vpi-lib
  20 | >>> RUN apt-get update && \
  21 | >>>         DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends \
  22 | >>>         libnvvpi2 vpi2-dev vpi2-samples && \
  23 | >>>         rm -rf /var/lib/apt/lists/* && \
  24 | >>>         apt autoremove
  25 |     
--------------------
ERROR: failed to solve: process "/bin/sh -c apt-get update &&         DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends         libnvvpi2 vpi2-dev vpi2-samples &&         rm -rf /var/lib/apt/lists/* &&         apt autoremove" did not complete successfully: exit code: 1

It seems there's a problem with libnvvpi2: it doesn't exist in the apt repository.
I'm using x86/64 Ubuntu 20.04 on a desktop machine, and I want to build the Jetson Docker image on it and then transfer the image to a real Jetson Nano device. Is the problem related to this, or am I simply missing an apt repository that contains libnvvpi2?
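In case it matters, my understanding is that the RUN steps of an aarch64 image cannot execute on an x86_64 host unless QEMU emulation is registered, so this is roughly what I would try on the desktop (assuming a missing binfmt/QEMU setup really is the cause of the exec format error):

# Register QEMU handlers so the x86_64 host can execute aarch64 binaries during the build
sudo apt-get install -y qemu-user-static binfmt-support
docker run --rm --privileged multiarch/qemu-user-static --reset -p yes

# Sanity check: an arm64 image should now run and print aarch64
docker run --rm --platform linux/arm64 arm64v8/ubuntu:20.04 uname -m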

I’ve tried to reproduce the issue and I think it’s a documentation bug.
Sorry for the trouble.

You can try building the image through the repo on GitHub.

(1) The TensorRT image version was updated after the release; NVIDIA made some changes to how the images are versioned.
(2) For the VPI install, you need to state explicitly which VPI version you need. Check the available versions with:
apt-cache policy vpi2-dev
apt-cache policy libnvvpi2
apt-cache policy vpi2-samples

The install should use the format below, matching the VPI version (should be 2.2) included in the corresponding JetPack 5.1 GA release (Index) that DS 6.2 requires.

In the Dockerfile you might have to do an install like the following, with the version being 2.2:

libnvvpi2=${version}
vpi2-dev=${version}
vpi2-samples=${version}
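Dropped into the existing RUN step, that would look roughly like the sketch below. VPI_VERSION is a placeholder build argument used here just for illustration; set it to the exact version string reported by apt-cache policy (it should start with 2.2).

# Sketch only: pin the VPI packages to the version that matches JetPack 5.1 GA
ARG VPI_VERSION
RUN apt-get update && \
        DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends \
        libnvvpi2=${VPI_VERSION} vpi2-dev=${VPI_VERSION} vpi2-samples=${VPI_VERSION} && \
        rm -rf /var/lib/apt/lists/* && \
        apt autoremove

Then pass the version at build time, e.g. docker build --build-arg VPI_VERSION=<exact-version> ...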

Alternatively, you may have to explicitly add the VPI *.deb files (from JetPack 5.1) inside the Dockerfile. These files would go in the jetson directory.

ADD vpi-dev-2.2-aarch64-l4t.deb /root
ADD vpi-lib-2.2-aarch64-l4t.deb /root

RUN  apt-get update && \
       DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends \
       /root/vpi-dev-2.2-aarch64-l4t.deb \
       /root/vpi-lib-2.2-aarch64-l4t.deb && \
       rm -rf /var/lib/apt/lists/* && \
       rm -f /root/vpi-dev-2.2-aarch64-l4t.deb /root/vpi-lib-2.2-aarch64-l4t.deb && \
       apt autoremove

Steps to Resolve the DeepStream Docker Error

Follow these steps to resolve the issue and successfully build your DeepStream container:

1. Verify the Image Version

First, ensure you are using the correct and latest version of the TensorRT image. NVIDIA occasionally updates their image versions, which might not be reflected in older guides.
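One quick way to check whether a given tag actually exists on NGC without pulling the full image (this needs a reasonably recent Docker CLI; the tag below is only an example):

docker manifest inspect nvcr.io/nvidia/l4t-tensorrt:r8.5.2-runtime

If the command prints a manifest, the tag exists; if it errors out, pick another tag from the NGC catalog page.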

2. Update Your Dockerfile

Modify your Dockerfile to use the latest TensorRT image version. For example, if the latest version is r8.5.3.0-runtime, update your Dockerfile accordingly:

FROM nvcr.io/nvidia/l4t-tensorrt:r8.5.3.0-runtime

3. Clear Docker Cache

Sometimes, Docker’s cached data can cause issues. Clear the Docker build cache to ensure you’re fetching the latest metadata:

docker builder prune

4. Explicitly Specify VPI Version

If your Dockerfile installs the Vision Programming Interface (VPI) packages, specify the VPI version explicitly to avoid compatibility issues:

RUN apt-get update && apt-get install -y \
    vpi2-dev=<specific-version> \
    libnvvpi2=<specific-version> \
    vpi2-samples=<specific-version>

Use the apt-cache policy command to find available versions:

apt-cache policy vpi2-dev

5. Rebuild the Docker Image

After making these changes, rebuild your Docker image:

docker build --network=host -t deepstream_image:jetson .

You can read the full article on this page.


This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.