Issue Building a Custom PyTorch Docker Image on Nvidia Jetson AGX Orin

"I have an Nvidia Jetson AGX Orin, and I am looking for a PyTorch image with Python 3.9 and CUDA 11.8. However, it seems that NGC only has an L4T image with Python 3.8 and CUDA 11.4. I tried following the steps in this guide to build my own image, but I encountered an error: KeyError: "couldn't find package: pytorch".

The command I used is as follows:

$ CUDA_VERSION=11.8 PYTHON_VERSION=3.9 PYTORCH_VERSION=2.3 jetson-containers build --name=l4t-python39 pytorch

I’m not sure where I went wrong, and I’m quite confused. Here are the relevant details of my setup:

  • JetPack: 5.1.2-b104
  • CUDA: 11.8"

Hi,
Here are some suggestions for the common issues:

1. Performance

Please run the commands below before benchmarking a deep learning use case:

$ sudo nvpmodel -m 0
$ sudo jetson_clocks
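
To confirm the settings have taken effect, you can query the current power mode and clock state. These tools ship with JetPack, so the exact output may differ between releases:

$ sudo nvpmodel -q           # print the active power mode (mode 0 = MAXN on AGX Orin)
$ sudo jetson_clocks --show  # print the current CPU/GPU/EMC clock settings
$ tegrastats                 # live utilization/frequency monitor (Ctrl+C to stop)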

2. Installation

Installation guide for deep learning frameworks on Jetson:

3. Tutorial

Deep learning tutorial to get started:

4. Report issue

If these suggestions don’t help and you want to report an issue to us, please share the model, the commands/steps, and any customized app with us so that we can reproduce the issue locally.

Thanks!

Hi @carolyuu
I am currently using the nvcr.io/nvidia/l4t-pytorch:r35.2.1-pth2.0-py3 image to launch a container, and I’ve noticed that it includes CUDA 11.4 and Python 3.8. I would like to use CUDA 11.8 and Python 3.9, as I plan to run an AWQ-quantized LLM (hugging-quants--Meta-Llama-3.1-8B-Instruct-AWQ-INT4) in this environment. Since I haven’t found a suitable image, I’m considering building one myself. However, as I mentioned previously, I encountered a KeyError: "couldn't find package: pytorch" error during the build process. I’m not sure which step is causing this issue. Could you recommend a better approach than building the image myself?
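
For reference, this is roughly how I checked the versions inside the container (just a quick sanity check):

$ sudo docker run -it --rm --runtime nvidia nvcr.io/nvidia/l4t-pytorch:r35.2.1-pth2.0-py3 \
    python3 -c "import sys, torch; print(sys.version); print(torch.__version__, torch.version.cuda)"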

Hi,

How do you build PyTorch? Do you build it in the Dockerfile?

Based on your use case, is Python 3.10 an option for you?
If yes, please upgrade to JetPack 6.0 or 6.1; there are containers available with PyTorch pre-installed.

For example: nvcr.io/nvidia/pytorch:24.10-py3-igpu
PyTorch | NVIDIA NGC
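
A quick way to verify what it ships with (the tag below is just an example; please pick the one that matches your JetPack release):

$ sudo docker pull nvcr.io/nvidia/pytorch:24.10-py3-igpu
$ sudo docker run -it --rm --runtime nvidia nvcr.io/nvidia/pytorch:24.10-py3-igpu \
    python3 -c "import sys, torch; print(sys.version); print(torch.__version__, torch.version.cuda)"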

Thanks.

Hi @AastaLLL
I followed the steps in jetson-containers/docs/build.md at master · dusty-nv/jetson-containers · GitHub to build the image, using the command CUDA_VERSION=11.8 PYTHON_VERSION=3.9 PYTORCH_VERSION=2.3 jetson-containers build --name=l4t-python39 pytorch. My understanding is that jetson-containers should construct an image with the specified versions from the base image, according to the given CUDA_VERSION, PYTHON_VERSION, and PYTORCH_VERSION. Instead, I got the KeyError above.

As you suggested, I noticed that JetPack 6.0 indeed includes Python 3.10, but it uses CUDA version 12.x. My use case has not been tested with CUDA 12.x, so I’d like to build an image with the specific versions I need.

Hi,

Is the python3 in your environment linked to Python 3.9?
If not, could you update the build script to use python3.9 and try again?
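
For example, something like the below shows which interpreter python3 points to and switches it with update-alternatives. This is a generic Ubuntu sketch rather than a Jetson-specific procedure, so please adjust the paths and package source to your setup, and note that changing the system default python3 can affect other tools:

$ python3 --version        # interpreter the build script currently picks up
$ which python3
$ sudo apt-get install -y python3.9 python3.9-dev
$ sudo update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.9 1
$ sudo update-alternatives --config python3   # select python3.9 from the menu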

Thanks.

Hi @AastaLLL

I recently discovered the dustynv/text-generation-webui:r35.4.1-cp310 image, and by upgrading PyTorch I was able to resolve the issues I was facing. However, it is not a perfect solution: my original plan was to use an AWQ-quantized model, but unfortunately autoAWQ is not supported on JetPack 5.1.2, so I have temporarily opted for a different quantized model as a substitute.
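
For completeness, this is roughly how I launch that image at the moment (depending on the entrypoint it may drop into a shell instead of starting the server; the webui itself listens on port 7860 by default):

$ sudo docker run -it --rm --runtime nvidia --network host dustynv/text-generation-webui:r35.4.1-cp310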

While finding this ready-made image has temporarily addressed my problem, I would prefer a more fundamental solution. Therefore, I’d like to ask if you could provide more detailed guidance on how to use jetson-containers/docs/build.md at master · dusty-nv/jetson-containers · GitHub.

For example:

  • If I want to use autoAWQ, transformers, and PyTorch 2.2 together, what commands should I use to build the container? (My best guess is sketched below.)
  • Alternatively, if I use transformers:r35.4.1 as the base image and want to add autoAWQ and PyTorch 2.2, what are the proper commands to do so?
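
For reference, my best guess at such a build command is the following. The package names are only my guess from browsing the repo's packages/ directory (auto_awq in particular may not be the correct name), and PYTORCH_VERSION is the same variable I used in my earlier command:

$ PYTORCH_VERSION=2.2 jetson-containers build --name=my-llm pytorch transformers auto_awq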

So far, I haven’t been able to successfully create my own image using the instructions in the build documentation above. I’m a bit confused and wondering if I might have misunderstood something.