I'm trying to learn how to use buildx and bake to combine multiple images…
This is a proper follow-up to my two previous attempts: https://forums.developer.nvidia.com/t/jetson-containers-keep-exiting-with-error-code-but-not-sure-what-it-means/320624
and I guess this one too, since the idea came from here: https://forums.developer.nvidia.com/t/combining-jetson-containers/292883/9
It turns out buildx bake is more useful for achieving my goal… but I could not figure out how to get TensorRT installed.
The good news: initially I thought I would somehow need to combine different images from dusty into one image, but apparently it makes more sense to build everything from scratch. Thanks still to dusty's images, I Frankenstein-ed his Dockerfiles, but got stuck at the following. Any suggestions?
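For context, the bake file for a single combined target is small. A minimal sketch (the target and tag values here are illustrative guesses, not necessarily the contents of my actual 245-byte file) can be generated and inspected like this:

```shell
# Write a minimal docker-bake.hcl (target/tag values are illustrative,
# not my real bake file).
cat > docker-bake.hcl <<'EOF'
target "combined" {
  context    = "."
  dockerfile = "Dockerfile"
  platforms  = ["linux/arm64"]
  tags       = ["docker.io/kairin/jetson-containers:combined-latest"]
}
EOF

# The build is then driven with:
#   docker buildx bake --push --file ./docker-bake.hcl
cat docker-bake.hcl
```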
❯ docker buildx bake --push --file ./docker-bake.hcl --provenance=mode=max --sbom=true
[+] Building 780.3s (30/34) docker-container:combined
=> [internal] load local bake definitions 0.0s
=> => reading ./docker-bake.hcl 245B / 245B 0.0s
=> [internal] load build definition from Dockerfile 0.0s
=> => transferring dockerfile: 5.57kB 0.0s
=> resolve image config for docker-image://docker.io/docker/buildkit-syft-scanner:stable-1 1.9s
=> [auth] docker/buildkit-syft-scanner:pull token for registry-1.docker.io 0.0s
=> [internal] load metadata for docker.io/dustynv/torchao:0.11.0-r36.4.0-cu128-24.04 1.6s
=> [auth] dustynv/torchao:pull token for registry-1.docker.io 0.0s
=> [internal] load .dockerignore 0.0s
=> => transferring context: 2B 0.0s
=> CACHED [ 1/24] FROM docker.io/dustynv/torchao:0.11.0-r36.4.0-cu128-24.04@sha256:617252e9861efc5e49e9be367bfc62770d5e510e2a6bca72bdf819ddd8e74a5c 0.0s
=> => resolve docker.io/dustynv/torchao:0.11.0-r36.4.0-cu128-24.04@sha256:617252e9861efc5e49e9be367bfc62770d5e510e2a6bca72bdf819ddd8e74a5c 0.0s
=> docker-image://docker.io/docker/buildkit-syft-scanner:stable-1 0.4s
=> => resolve docker.io/docker/buildkit-syft-scanner:stable-1 0.3s
=> [internal] load build context 0.0s
=> => transferring context: 99B 0.0s
=> [ 2/24] RUN apt update && apt upgrade -y 33.8s
=> [ 3/24] WORKDIR /opt 0.1s
=> [ 4/24] RUN set -ex && apt-get update && apt-get install -y --no-install-recommends locales locales-all tzdata && locale-gen en_US 8.5s
=> [ 5/24] COPY tarpack /usr/local/bin/ 0.1s
=> [ 6/24] RUN set -ex && wget -O - https://apt.kitware.com/keys/kitware-archive-latest.asc 2>/dev/null | gpg --dearmor - | tee /usr/share/keyrings/kitware-archive-k 29.7s
=> [ 7/24] RUN set -ex && pip3 install --upgrade --force-reinstall --no-cache-dir --verbose cmake && cmake --version && which cmake 72.5s
=> [ 8/24] RUN apt-get update && apt-get install -y --no-install-recommends unzip wget curl jq 11.9s
=> [ 9/24] RUN LATEST_VERSION=$(curl -s https://api.github.com/repos/ninja-build/ninja/releases/latest | jq -r .tag_name) && wget -q "https://github.com/ninja-build/n 7.7s
=> [10/24] RUN BAZELISK_RELEASE=$(wget -qO- https://api.github.com/repos/bazelbuild/bazelisk/releases/latest | grep -Po '"tag_name": "\K.*?(?=")') && BAZELISK_URL="ht 2.6s
=> [11/24] RUN bazel --version 9.6s
=> [12/24] RUN pip3 install -U opencv-python transformers 26.4s
=> [13/24] RUN apt update && apt upgrade -y && apt install xfe -y && apt install apt-utils -y 172.7s
=> [14/24] RUN pip3 install -U cmake numpy 6.5s
=> [15/24] RUN apt-get update && apt-get install libcairo2-dev -y 19.2s
=> [16/24] RUN pip3 install -U pycairo 16.9s
=> [17/24] COPY install-torch.sh build-torch.sh /tmp/pytorch/ 0.1s
=> [18/24] RUN /tmp/pytorch/install-torch.sh 34.8s
=> [19/24] RUN pip3 install -U xformers onnxruntime-gpu triton lm_eval 271.3s
=> [20/24] RUN pip3 install -U diffusers flash-attn sageattention torchao 48.1s
=> ERROR [21/24] RUN pip3 install tensorrt 3.9s
------
> [21/24] RUN pip3 install tensorrt:
0.727 Looking in indexes: https://pypi.jetson-ai-lab.dev/jp6/cu128
1.857 Collecting tensorrt
2.298 Downloading https://pypi.jetson-ai-lab.dev/root/pypi/%2Bf/6f1/e2beae4c161d3/tensorrt-10.8.0.43.tar.gz (35 kB)
2.332 Preparing metadata (setup.py): started
2.688 Preparing metadata (setup.py): finished with status 'done'
3.218 Collecting tensorrt_cu12==10.8.0.43 (from tensorrt)
3.486 Downloading https://pypi.jetson-ai-lab.dev/root/pypi/%2Bf/0ef/b7ba28afde082/tensorrt_cu12-10.8.0.43.tar.gz (18 kB)
3.506 Preparing metadata (setup.py): started
3.731 Preparing metadata (setup.py): finished with status 'error'
3.741 error: subprocess-exited-with-error
3.741
3.741 × python setup.py egg_info did not run successfully.
3.741 │ exit code: 1
3.741 ╰─> [6 lines of output]
3.741 Traceback (most recent call last):
3.741 File "<string>", line 2, in <module>
3.741 File "<pip-setuptools-caller>", line 34, in <module>
3.741 File "/tmp/pip-install-93mnc6ev/tensorrt-cu12_97430d339ad24026a84446fa6b24f69a/setup.py", line 71, in <module>
3.741 raise RuntimeError("TensorRT does not currently build wheels for Tegra systems")
3.741 RuntimeError: TensorRT does not currently build wheels for Tegra systems
3.741 [end of output]
3.741
3.741 note: This error originates from a subprocess, and is likely not a problem with pip.
3.743 error: metadata-generation-failed
3.743
3.743 × Encountered error while generating package metadata.
3.743 ╰─> See above for output.
3.743
3.743 note: This is an issue with the package mentioned above, not pip.
3.743 hint: See above for details.
------
Dockerfile:152
--------------------
150 |
151 | # install tensorrt
152 | >>> RUN pip3 install tensorrt
153 |
154 | # Clone the repository:
--------------------
ERROR: failed to solve: process "/bin/sh -c pip3 install tensorrt" did not complete successfully: exit code: 1
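The error comes from pip resolving `tensorrt` to the `tensorrt_cu12` source tarball, whose setup.py refuses to build on Tegra. Since some dustynv base images already ship TensorRT (a later run in this thread reports "Requirement already satisfied"), one hedged workaround is to skip the pip step when the module is already importable. A sketch, demonstrated with a stdlib module so it runs anywhere:

```shell
# Install a Python package only if its module is not already importable.
ensure_pip_pkg() {
    module="$1"
    pkg="$2"
    if python3 -c "import ${module}" 2>/dev/null; then
        echo "${module} already present, skipping pip install"
    else
        pip3 install "${pkg}"
    fi
}

# In a Dockerfile this collapses to one line:
#   RUN python3 -c "import tensorrt" || pip3 install tensorrt
# Demo with stdlib 'json' (already importable, so pip is never called):
ensure_pip_pkg json json
```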
docker-bake.hcl copy.txt (245 Bytes)
Attached are my bake file and Dockerfile. Most of the code is taken from dusty's Dockerfiles.
Dockerfile copy.txt (5.4 KB)
Wait… I'm not sure what I did, but it uploaded everything? So… that means it works, right?
[+] Building 580.9s (16/16) FINISHED docker-container:combined
=> [internal] load local bake definitions 0.0s
=> => reading ./docker-bake.hcl 245B / 245B 0.0s
=> [internal] load build definition from Dockerfile 0.0s
=> => transferring dockerfile: 5.64kB 0.0s
=> resolve image config for docker-image://docker.io/docker/buildkit-syft-scanner:stable-1 2.5s
=> [auth] docker/buildkit-syft-scanner:pull token for registry-1.docker.io 0.0s
=> [internal] load metadata for docker.io/dustynv/torch2trt:r36.4.0-cu128-24.04 1.7s
=> [auth] dustynv/torch2trt:pull token for registry-1.docker.io 0.0s
=> [internal] load .dockerignore 0.0s
=> => transferring context: 2B 0.0s
=> [combined 1/5] FROM docker.io/dustynv/torch2trt:r36.4.0-cu128-24.04@sha256:5c73fe54ec3e2acd089045935d560b744a70adcc44967f4670ecc3f1d88e85da 129.0s
=> => resolve docker.io/dustynv/torch2trt:r36.4.0-cu128-24.04@sha256:5c73fe54ec3e2acd089045935d560b744a70adcc44967f4670ecc3f1d88e85da 0.0s
=> => sha256:bb3984e7b09b4cc084a8e4d51d015c87d8cb4d43d7d9c0e87d97a97ed65b18b4 14.54kB / 14.54kB 0.3s
=> => sha256:4429afa174fcde95fc50e1856792c397f56073b3cb7c55727f4d8c2ae1dcd79f 2.63MB / 2.63MB 0.5s
=> => sha256:66b614553fd6e7a11d6b7ddb4d61247bc33fbfbbfdf366bf4858b5c427d1652b 902B / 902B 0.7s
=> => sha256:9f651f98c67e557bc142dcb7c12984020ea3fed937f37948a1b72645d21a888a 348B / 348B 0.8s
=> => sha256:bbb9080355c6b89024752c8e78a28921f9ed81d45df8a51547333557418da9f5 458.38MB / 458.38MB 16.2s
=> => sha256:0d7ea8c35d9255ccb121bc4eaf264f17372907b36a3b1e7155f510a116a8d939 1.31GB / 1.31GB 39.3s
=> => sha256:5dc003c8b2c15c6e526c8a4d145f3a82e65dd12c60c35d398e0ef4ecdea6f98b 7.28MB / 7.28MB 0.7s
=> => sha256:4a62cd4c90efd8fea79b8c0ccb6a929557ca2c86ae6ac8eba6ccf9d50174e45e 1.21MB / 1.21MB 1.0s
=> => sha256:d3cc80b0c24c741141aa58376e68682acccdcad9503ec3c789daff9a86c2b1d9 683B / 683B 0.4s
=> => sha256:9dc465cd571e78df1e7ae778e407a01dece27e3b617789cc9f29ace3fdca2a50 368.20MB / 368.20MB 10.0s
=> => sha256:95b25b82e676262dbe8ee3e42eb587eb488e8fbfad57ece3e38eb745760715d3 1.34kB / 1.34kB 0.3s
=> => sha256:2fd8c4e066b1df1e1147fc518ff33bb532d592179bb579ed3fa26652cd87659e 14.75MB / 14.75MB 0.8s
=> => sha256:e62701476730284a6ca10b48d7a5163793e20081d194dcb9d2d4f4c97c3765c3 26.82MB / 26.82MB 1.2s
=> => sha256:0122f6d45e3f1e77daafc1dd0f551467804b8672c42814011c54e96b59a411ce 18.09MB / 18.09MB 0.8s
=> => sha256:c53c4ec2962f0880a8032a30c36272fdd62873ee3e82bd128e992dffbbd2b67c 471B / 471B 0.3s
=> => sha256:7b989fe0da2d33afd35798023a01f90a0fc26a6d5455ba53fbfd41a06e1b9f65 31.16MB / 31.16MB 1.0s
=> => sha256:8d46a4cccc4b234ba5ab898afa5f12582a1e9df7759fd2ce4b1508d779689471 891B / 891B 0.3s
=> => sha256:8d5973e4da2e0981d905ec73b2e95d96c06ccc381cf4bc88b64d44ed61dc5564 120.77kB / 120.77kB 0.3s
=> => sha256:93264d872c7740dbe5cf513220a9e2507fa3cb4a1f0eb9057560bf7020643237 857.71MB / 857.71MB 30.8s
=> => sha256:db0a6f1a719234e6a42fa8cb38d61e684321d6c193147b6700806aea6f2a6f10 31.95MB / 31.95MB 1.7s
=> => sha256:292d6b336f57bfe299df313ce9238f96833506b709468dbea29768b2359f0d39 2.12GB / 2.12GB 35.4s
=> => sha256:3018bc8db61495e7c1e99caed6c655d02c5ffc8718d3344efe978699461b3897 570B / 570B 0.3s
=> => sha256:978046e63340c40d30e1b984d612d0f5ab91fbd74a2389de9840489602750ce5 827B / 827B 0.3s
=> => sha256:bea67ad17bb1e2546ea2ea86571b07be6b6603c33fd70bfc8e5674d40022d866 261.25MB / 261.25MB 9.4s
=> => extracting sha256:bea67ad17bb1e2546ea2ea86571b07be6b6603c33fd70bfc8e5674d40022d866 11.9s
=> => extracting sha256:978046e63340c40d30e1b984d612d0f5ab91fbd74a2389de9840489602750ce5 0.0s
=> => extracting sha256:3018bc8db61495e7c1e99caed6c655d02c5ffc8718d3344efe978699461b3897 0.0s
=> => extracting sha256:292d6b336f57bfe299df313ce9238f96833506b709468dbea29768b2359f0d39 26.4s
=> => extracting sha256:db0a6f1a719234e6a42fa8cb38d61e684321d6c193147b6700806aea6f2a6f10 0.4s
=> => extracting sha256:93264d872c7740dbe5cf513220a9e2507fa3cb4a1f0eb9057560bf7020643237 7.7s
=> => extracting sha256:8d5973e4da2e0981d905ec73b2e95d96c06ccc381cf4bc88b64d44ed61dc5564 0.0s
=> => extracting sha256:8d46a4cccc4b234ba5ab898afa5f12582a1e9df7759fd2ce4b1508d779689471 0.0s
=> => extracting sha256:7b989fe0da2d33afd35798023a01f90a0fc26a6d5455ba53fbfd41a06e1b9f65 2.5s
=> => extracting sha256:c53c4ec2962f0880a8032a30c36272fdd62873ee3e82bd128e992dffbbd2b67c 0.0s
=> => extracting sha256:0122f6d45e3f1e77daafc1dd0f551467804b8672c42814011c54e96b59a411ce 1.0s
=> => extracting sha256:e62701476730284a6ca10b48d7a5163793e20081d194dcb9d2d4f4c97c3765c3 1.9s
=> => extracting sha256:2fd8c4e066b1df1e1147fc518ff33bb532d592179bb579ed3fa26652cd87659e 4.6s
=> => extracting sha256:95b25b82e676262dbe8ee3e42eb587eb488e8fbfad57ece3e38eb745760715d3 0.1s
=> => extracting sha256:9dc465cd571e78df1e7ae778e407a01dece27e3b617789cc9f29ace3fdca2a50 14.6s
=> => extracting sha256:d3cc80b0c24c741141aa58376e68682acccdcad9503ec3c789daff9a86c2b1d9 0.0s
=> => extracting sha256:4a62cd4c90efd8fea79b8c0ccb6a929557ca2c86ae6ac8eba6ccf9d50174e45e 0.1s
=> => extracting sha256:5dc003c8b2c15c6e526c8a4d145f3a82e65dd12c60c35d398e0ef4ecdea6f98b 0.4s
=> => extracting sha256:0d7ea8c35d9255ccb121bc4eaf264f17372907b36a3b1e7155f510a116a8d939 12.3s
=> => extracting sha256:bbb9080355c6b89024752c8e78a28921f9ed81d45df8a51547333557418da9f5 6.7s
=> => extracting sha256:9f651f98c67e557bc142dcb7c12984020ea3fed937f37948a1b72645d21a888a 0.0s
=> => extracting sha256:66b614553fd6e7a11d6b7ddb4d61247bc33fbfbbfdf366bf4858b5c427d1652b 0.0s
=> => extracting sha256:4429afa174fcde95fc50e1856792c397f56073b3cb7c55727f4d8c2ae1dcd79f 0.3s
=> => extracting sha256:bb3984e7b09b4cc084a8e4d51d015c87d8cb4d43d7d9c0e87d97a97ed65b18b4 0.0s
=> CACHED docker-image://docker.io/docker/buildkit-syft-scanner:stable-1 0.4s
=> => resolve docker.io/docker/buildkit-syft-scanner:stable-1 0.4s
=> [combined 2/5] RUN pip3 install tensorrt torch2trt 2.3s
=> [combined 3/5] WORKDIR /opt 0.1s
=> [combined 4/5] RUN git clone https://github.com/comfyanonymous/ComfyUI.git comf && cd /opt/comf && pip3 install -r requirements.txt 121.0s
=> [combined 5/5] RUN ls -a 0.3s
=> [linux/arm64] generating sbom using docker.io/docker/buildkit-syft-scanner:stable-1 36.2s
=> exporting to image 268.5s
=> => exporting layers 26.1s
=> => exporting manifest sha256:7008fba27bda1c679f9c211c697ffddef1704e47428b46d4b0cb3b12ca85f046 0.0s
=> => exporting config sha256:44dcd6fd4a4683cf0f54c840bab8c83fb395b3549747d8cc1d7c1d915ff62410 0.0s
=> => exporting attestation manifest sha256:e635c6994a474fb6981636e78072c6848479701fa1223a1f6e6bd8ca656f6909 0.0s
=> => exporting manifest list sha256:8b129bc552dda785f86d9e9791760955739754d77e94fec694f1ee3233f75c82 0.0s
=> => pushing layers 190.5s
=> => pushing manifest for docker.io/kairin/jetson-containers:dustynv-torchao-0.11.0-r36.4.0-cu128-24.04-latest@sha256:8b129bc552dda785f86d9e9791760955739754d77e94fec694f 3.1s
=> [auth] kairin/jetson-containers:pull,push token for registry-1.docker.io 0.0s
No.
Still could not get TensorRT to install.
❯ docker buildx bake --push --file ./docker-bake.hcl
[+] Building 6.1s (28/32) docker-container:combined
=> [internal] load local bake definitions 0.0s
=> => reading ./docker-bake.hcl 189B / 189B 0.0s
=> [internal] load build definition from Dockerfile 0.0s
=> => transferring dockerfile: 5.63kB 0.0s
=> resolve image config for docker-image://docker.io/docker/buildkit-syft-scanner:stable-1 0.9s
=> [internal] load metadata for docker.io/dustynv/torchao:0.11.0-r36.4.0-cu128-24.04 0.4s
=> [internal] load .dockerignore 0.0s
=> => transferring context: 2B 0.0s
=> docker-image://docker.io/docker/buildkit-syft-scanner:stable-1 0.4s
=> => resolve docker.io/docker/buildkit-syft-scanner:stable-1 0.3s
=> [internal] load build context 0.0s
=> => transferring context: 99B 0.0s
=> [stage-1 1/24] FROM docker.io/dustynv/torchao:0.11.0-r36.4.0-cu128-24.04@sha256:617252e9861efc5e49e9be367bfc62770d5e510e2a6bca72bdf819ddd8e74a5c 0.0s
=> => resolve docker.io/dustynv/torchao:0.11.0-r36.4.0-cu128-24.04@sha256:617252e9861efc5e49e9be367bfc62770d5e510e2a6bca72bdf819ddd8e74a5c 0.0s
=> CACHED [stage-1 2/24] RUN apt update && apt upgrade -y 0.0s
=> CACHED [stage-1 3/24] WORKDIR /opt 0.0s
=> CACHED [stage-1 4/24] RUN set -ex && apt-get update && apt-get install -y --no-install-recommends locales locales-all tzdata && lo 0.0s
=> CACHED [stage-1 5/24] COPY tarpack /usr/local/bin/ 0.0s
=> CACHED [stage-1 6/24] RUN set -ex && wget -O - https://apt.kitware.com/keys/kitware-archive-latest.asc 2>/dev/null | gpg --dearmor - | tee /usr/share/keyrings/kit 0.0s
=> CACHED [stage-1 7/24] RUN set -ex && pip3 install --upgrade --force-reinstall --no-cache-dir --verbose cmake && cmake --version && which cmake 0.0s
=> CACHED [stage-1 8/24] RUN apt-get update && apt-get install -y --no-install-recommends unzip wget curl jq 0.0s
=> CACHED [stage-1 9/24] RUN LATEST_VERSION=$(curl -s https://api.github.com/repos/ninja-build/ninja/releases/latest | jq -r .tag_name) && wget -q "https://github.co 0.0s
=> CACHED [stage-1 10/24] RUN BAZELISK_RELEASE=$(wget -qO- https://api.github.com/repos/bazelbuild/bazelisk/releases/latest | grep -Po '"tag_name": "\K.*?(?=")') && B 0.0s
=> CACHED [stage-1 11/24] RUN bazel --version 0.0s
=> CACHED [stage-1 12/24] RUN pip3 install -U opencv-python transformers 0.0s
=> CACHED [stage-1 13/24] RUN apt update && apt upgrade -y && apt install xfe -y && apt install apt-utils -y 0.0s
=> CACHED [stage-1 14/24] RUN pip3 install -U cmake numpy 0.0s
=> CACHED [stage-1 15/24] RUN apt-get update && apt-get install libcairo2-dev -y 0.0s
=> CACHED [stage-1 16/24] RUN pip3 install -U pycairo 0.0s
=> CACHED [stage-1 17/24] COPY install-torch.sh build-torch.sh /tmp/pytorch/ 0.0s
=> CACHED [stage-1 18/24] RUN /tmp/pytorch/install-torch.sh 0.0s
=> CACHED [stage-1 19/24] RUN pip3 install -U xformers onnxruntime-gpu triton lm_eval 0.0s
=> CACHED [stage-1 20/24] RUN pip3 install -U diffusers flash-attn sageattention torchao 0.0s
=> ERROR [stage-1 21/24] RUN pip3 install tensorrt 4.4s
------
> [stage-1 21/24] RUN pip3 install tensorrt:
0.690 Looking in indexes: https://pypi.jetson-ai-lab.dev/jp6/cu128
2.090 Collecting tensorrt
2.598 Downloading https://pypi.jetson-ai-lab.dev/root/pypi/%2Bf/6f1/e2beae4c161d3/tensorrt-10.8.0.43.tar.gz (35 kB)
2.629 Preparing metadata (setup.py): started
2.992 Preparing metadata (setup.py): finished with status 'done'
3.694 Collecting tensorrt_cu12==10.8.0.43 (from tensorrt)
4.031 Downloading https://pypi.jetson-ai-lab.dev/root/pypi/%2Bf/0ef/b7ba28afde082/tensorrt_cu12-10.8.0.43.tar.gz (18 kB)
4.042 Preparing metadata (setup.py): started
4.258 Preparing metadata (setup.py): finished with status 'error'
4.270 error: subprocess-exited-with-error
4.270
4.270 × python setup.py egg_info did not run successfully.
4.270 │ exit code: 1
4.270 ╰─> [6 lines of output]
4.270 Traceback (most recent call last):
4.270 File "<string>", line 2, in <module>
4.270 File "<pip-setuptools-caller>", line 34, in <module>
4.270 File "/tmp/pip-install-asdr8pqg/tensorrt-cu12_76582ea2f45c4072ac849653dc278ac2/setup.py", line 71, in <module>
4.270 raise RuntimeError("TensorRT does not currently build wheels for Tegra systems")
4.270 RuntimeError: TensorRT does not currently build wheels for Tegra systems
4.270 [end of output]
4.270
4.270 note: This error originates from a subprocess, and is likely not a problem with pip.
4.274 error: metadata-generation-failed
4.274
4.274 × Encountered error while generating package metadata.
4.274 ╰─> See above for output.
4.274
4.274 note: This is an issue with the package mentioned above, not pip.
4.274 hint: See above for details.
------
Dockerfile:154
--------------------
152 | # install tensorrt
153 | # RUN pip3 install torch2trt
154 | >>> RUN pip3 install tensorrt
155 |
156 | # Clone the repository:
--------------------
ERROR: failed to solve: process "/bin/sh -c pip3 install tensorrt" did not complete successfully: exit code: 1
Hi,
Could you check if you can get TensorRT installed by following the instructions from the below Dockerfile?
#---
# name: tensorrt
# group: cuda
# depends: [cuda, cudnn, python]
# config: config.py
# test: test.sh
#---
ARG BASE_IMAGE
FROM ${BASE_IMAGE}
ARG TENSORRT_URL \
TENSORRT_DEB \
TENSORRT_PACKAGES
RUN set -ex && \
echo "Downloading ${TENSORRT_DEB}" && \
mkdir -p /tmp/tensorrt && \
cd /tmp/tensorrt && \
wget --quiet --show-progress --progress=bar:force:noscroll ${TENSORRT_URL} && \
dpkg -i *.deb && \
[file truncated]
Thanks.
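The truncated Dockerfile above installs TensorRT from NVIDIA .deb packages rather than pip, which sidesteps the Tegra wheel problem. A hedged sketch of that pattern (TENSORRT_URL and TENSORRT_PACKAGES are build arguments supplied by the jetson-containers config; their actual values are not shown here):

```shell
# Sketch of the apt/.deb-based TensorRT install the truncated Dockerfile uses.
# ${TENSORRT_URL} and ${TENSORRT_PACKAGES} must come from the build environment.
install_tensorrt_debs() {
    set -ex
    mkdir -p /tmp/tensorrt
    cd /tmp/tensorrt
    wget --quiet "${TENSORRT_URL}"
    dpkg -i ./*.deb
    apt-get update
    apt-get install -y --no-install-recommends ${TENSORRT_PACKAGES}
    rm -rf /tmp/tensorrt
}
# Not invoked here: running it requires a Jetson with matching JetPack repos.
```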
When using jetson-containers to run the following command, it builds these images:
PYTHON_VERSION=3.12 jetson-containers run --volume /media/SSD/apps:/ppp -it --rm --user root $(autotag tensorrt) /bin/bash
❯ docker images
REPOSITORY TAG IMAGE ID CREATED SIZE
tensorrt r36.4.3 4771112137c5 29 hours ago 10.1GB
tensorrt r36.4.3-tensorrt 4771112137c5 29 hours ago 10.1GB
tensorrt r36.4.3-python fcdd8098cb1a 29 hours ago 7.06GB
tensorrt r36.4.3-cudnn_9.4 2d70a839e37d 29 hours ago 6.99GB
tensorrt r36.4.3-cuda_12.6 2d4e389a49c6 29 hours ago 5.93GB
tensorrt r36.4.3-pip_cache_cu126 6cbe3685dd05 29 hours ago 744MB
tensorrt r36.4.3-build-essential e3cd8720ee6f 29 hours ago 744MB
kairin/jetson-containers-001 dustynv-torchao-0.11.0-r36.4.0-cu128-24.04 8e6ca95a2b89 30 hours ago 7.94GB
kairin/jetson-containers-001 dustynv-opencv-4.11.0-r36.4.0-cu128-24.04 c8d95d1f9bcf 30 hours ago 7.38GB
kairin/jetson-containers-001 dustynv-torch2trt-r36.4.0-cu128-24.04 cfa1687e062b 31 hours ago 11.1GB
kairin/jetson-containers-001 dustynv-cupy-r36.4.0-cu128-24.04 6aadca8f93de 32 hours ago 5.53GB
kairin/jetson-containers-001 dustynv-pycuda-r36.4.0-cu128-24.04 93016b8e53e7 32 hours ago 5.41GB
kairin/jetson-containers-001 dustynv-cuda-python-r36.4.0-cu128-24.04 b253938a86ba 32 hours ago 5.51GB
kairin/jetson-containers-001 dustynv-onnxruntime-1.22-r36.4.0-cu128-24.04 e57feadcbd34 32 hours ago 10.7GB
kairin/jetson-containers-001 dustynv-flashinfer-0.2.3-r36.4.0-cu128-24.04 41aaccd4967b 47 hours ago 8.77GB
kairin/jetson-containers-001 dustynv-bitsandbytes-0.45.4-r36.4.0-cu128-24.04 755411b19a58 47 hours ago 11.2GB
moby/buildkit buildx-stable-1 0959b055a013 2 weeks ago 209MB
ubuntu 22.04 560582227a09 5 weeks ago 69.2MB
❯ ./jetsonrun.sh
zsh: no such file or directory: ./jetsonrun.sh
❯ cd ssd
zsh: correct 'ssd' to 'SSD' [nyae]? y
❯ ls
apps docker
❯ cd apps
❯ ls
build for-comf jetsonrun.sh linux-exfat-oot rds-21.0-release.zip
build2 jetson lazygit nerd-fonts simple-digital-signage
❯ ./jetsonrun.sh
Namespace(packages=['tensorrt'], prefer=['local', 'registry', 'build'], disable=[''], user='dustynv', output='/tmp/autotag', quiet=False, verbose=False)
-- L4T_VERSION=36.4.3 JETPACK_VERSION=6.2 CUDA_VERSION=12.6
-- Finding compatible container image for ['tensorrt']
tensorrt:r36.4.3
V4L2_DEVICES:
### DISPLAY environmental variable is already set: ":0"
localuser:root being added to access control list
xauth: file /tmp/.docker.xauth does not exist
+ docker run --runtime nvidia -it --rm --network host --shm-size=8g --volume /tmp/argus_socket:/tmp/argus_socket --volume /etc/enctune.conf:/etc/enctune.conf --volume /etc/nv_tegra_release:/etc/nv_tegra_release --volume /tmp/nv_jetson_model:/tmp/nv_jetson_model --volume /var/run/dbus:/var/run/dbus --volume /var/run/avahi-daemon/socket:/var/run/avahi-daemon/socket --volume /var/run/docker.sock:/var/run/docker.sock --volume /media/SSD/apps/jetson/data:/data -v /etc/localtime:/etc/localtime:ro -v /etc/timezone:/etc/timezone:ro --device /dev/snd -e PULSE_SERVER=unix:/run/user/1000/pulse/native -v /run/user/1000/pulse:/run/user/1000/pulse --device /dev/bus/usb -e DISPLAY=:0 -v /tmp/.X11-unix/:/tmp/.X11-unix -v /tmp/.docker.xauth:/tmp/.docker.xauth -e XAUTHORITY=/tmp/.docker.xauth --device /dev/i2c-0 --device /dev/i2c-1 --device /dev/i2c-2 --device /dev/i2c-3 --device /dev/i2c-4 --device /dev/i2c-5 --device /dev/i2c-6 --device /dev/i2c-7 --device /dev/i2c-8 --device /dev/i2c-9 -v /run/jtop.sock:/run/jtop.sock --name jetson_container_20250307_061531 --volume /media/SSD/apps:/ppp -it --rm --user root tensorrt:r36.4.3 /bin/bash
root@ubuntu:/# ls
bin boot data dev etc home lib media mnt opt ppp proc root run sbin srv sys tmp usr var
root@ubuntu:/# cd opt
root@ubuntu:/opt# ls
nvidia
root@ubuntu:/opt# cd ..
root@ubuntu:/# ls
bin boot data dev etc home lib media mnt opt ppp proc root run sbin srv sys tmp usr var
root@ubuntu:/# cd tmp
root@ubuntu:/tmp# ls
argus_socket install_cuda.sh install_python.sh nv_jetson_model
root@ubuntu:/tmp# cd ,,
bash: cd: ,,: No such file or directory
root@ubuntu:/tmp# pip3 install tensorrt
Using pip 25.0.1 from /usr/local/lib/python3.10/dist-packages/pip (python 3.10)
Non-user install because site-packages writeable
Created temporary directory: /tmp/pip-build-tracker-bq3r_gvd
Initialized build tracking at /tmp/pip-build-tracker-bq3r_gvd
Created build tracker: /tmp/pip-build-tracker-bq3r_gvd
Entered build tracker: /tmp/pip-build-tracker-bq3r_gvd
Created temporary directory: /tmp/pip-install-u2q44w0z
Created temporary directory: /tmp/pip-ephem-wheel-cache-au8j5zxo
Looking in indexes: https://pypi.jetson-ai-lab.dev/jp6/cu126
Requirement already satisfied: tensorrt in /usr/local/lib/python3.10/dist-packages (10.4.0)
Created temporary directory: /tmp/pip-unpack-sxvyh1o_
Removed build tracker: '/tmp/pip-build-tracker-bq3r_gvd'
root@ubuntu:/tmp# pip show tensorrt
Name: tensorrt
Version: 10.4.0
Summary: A high performance deep learning inference library
Home-page: https://developer.nvidia.com/tensorrt
Author: NVIDIA Corporation
Author-email:
License: Proprietary
Location: /usr/local/lib/python3.10/dist-packages
Requires:
Required-by:
Metadata-Version: 2.1
Installer: pip
Classifiers:
License :: Other/Proprietary License
Intended Audience :: Developers
Programming Language :: Python :: 3
Entry-points:
Project-URLs:
root@ubuntu:/tmp#
Hi,
Do you need a python 3.12 TensorRT wrapper?
Thanks.
Well, technically jetson-containers does build (with Python 3.12 and TensorRT 10.4).
My challenge is chaining multiple containers: one container with OpenCV, TensorRT, CUDA, and PyTorch.
So right now I see that my only option is to understand how
docker buildx bake
can achieve what I want…
But something always fails when trying to add TensorRT at the end.
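One way to sketch the "chain multiple containers" idea is a multi-stage Dockerfile that starts FROM one dustynv image and copies Python packages in from another. This is only a hypothetical sketch: the dist-packages path assumes Python 3.12, native shared libraries may also need copying, and ABI compatibility has to be verified on the device:

```shell
# Generate a hypothetical multi-stage Dockerfile that combines two images.
cat > Dockerfile.combined <<'EOF'
FROM dustynv/torch2trt:r36.4.0-cu128-24.04 AS trt

FROM dustynv/torchao:0.11.0-r36.4.0-cu128-24.04
# Copy TensorRT/torch2trt Python bindings across (path assumes Python 3.12;
# native libraries under /usr/lib may also be required).
COPY --from=trt /usr/local/lib/python3.12/dist-packages /usr/local/lib/python3.12/dist-packages
EOF
cat Dockerfile.combined
```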
Hi,
Have you checked this container?
For example, nvcr.io/nvidia/pytorch:25.02-py3-igpu .
It contains the libraries you need so no need to chain multiple containers.
Thanks.
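If the NGC image covers the whole stack, a quick sanity check would be importing the libraries inside it. This sketch only assembles the command (actually running it needs a Jetson with the NVIDIA container runtime):

```shell
# Assemble (but do not run) a verification command for the NGC iGPU image.
IMAGE="nvcr.io/nvidia/pytorch:25.02-py3-igpu"
CMD="docker run --rm --runtime nvidia ${IMAGE} python3 -c 'import tensorrt, torch; print(tensorrt.__version__, torch.__version__)'"
echo "${CMD}"
```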