pcha
1
Hi all,
I’m trying to install NeMo on top of a PyTorch image as the base, running the commands found in the Dockerfile for the NeMo image.
However, the build fails because pip can’t find the triton package to install. I’m guessing that’s because there is no aarch64 wheel available for it.
- Is there somewhere else I can find a prebuilt triton wheel for Jetson?
- Ultimately, I want to combine NeMo, Ollama, and PyTorch into one image. Any advice on how I should go about it?
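For context, this is a sketch of the kind of Dockerfile I’m attempting; the base image tag and install command are placeholders for whatever matches your JetPack release, not the exact lines from the NeMo Dockerfile:

```
# Hypothetical sketch -- adjust the l4t-pytorch tag to your JetPack version.
FROM nvcr.io/nvidia/l4t-pytorch:r36.2.0-pth2.2-py3

# NeMo pulls in triton as a dependency; since PyPI has no prebuilt
# aarch64 wheel for it, this is the step where the install fails.
RUN pip3 install "nemo_toolkit[all]"
```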
Hi,
It’s recommended to start with the Triton image as the base and then install PyTorch by following our documentation.
The latest Triton image for Jetson is nvcr.io/nvidia/tritonserver:24.07-py3-igpu.
Documentation for installing PyTorch:
Thanks.
pcha
4
Is Triton Inference Server the same thing as the OpenAI Triton language?
And is that image for Jetson as well?
Hi,
They are different libraries.
Do you want the OpenAI Triton language?
If so, although we don’t have a prebuilt wheel, you should be able to build it from source.
Thanks.
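For reference, a minimal sketch of building Triton-lang from source on the device, assuming the upstream GitHub repository layout (the Python package lives in the python/ subdirectory); exact dependency names and build times vary by JetPack release:

```
# Typical build prerequisites -- adjust for your environment
pip3 install ninja cmake wheel pybind11

# Clone the upstream Triton-lang repository and build from source
git clone https://github.com/triton-lang/triton.git
cd triton/python
pip3 install .   # compiles natively for aarch64; expect a long build
```

Building natively on the Jetson avoids the missing-wheel problem entirely, at the cost of compile time.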
system
Closed
7
This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.