Failed to install mlc-llm

Hi, I’m trying to install mlc-llm on my Jetson AGX Orin.
Environment: Jetson AGX Orin, CUDA 12.2
I ran the following commands:

conda create -n mlc python=3.11
conda activate mlc

and then

python -m pip install --pre -U -f https://mlc.ai/wheels mlc-llm-nightly-cu122 mlc-ai-nightly-cu122

The installation seemed to succeed:

Installing collected packages: mlc-llm-nightly-cu122, mlc-ai-nightly-cu122
Successfully installed mlc-ai-nightly-cu122-0.1 mlc-llm-nightly-cu122-0.1

But importing mlc_llm fails:

python -c "import mlc_llm; print(mlc_llm)"
Traceback (most recent call last):
  File "<string>", line 1, in <module>
ModuleNotFoundError: No module named 'mlc_llm'
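One likely cause: the prebuilt CUDA wheels on https://mlc.ai/wheels target x86_64 Linux, while the Jetson is aarch64, so pip may not install a real binary package even though the command "succeeds". A quick (hypothetical, not from the thread) way to confirm the architecture mismatch from the same environment:

```python
# Check whether this interpreter runs on aarch64 (Jetson) or x86_64.
# Prebuilt x86_64 wheels will not provide a working binary on aarch64,
# which can leave no importable mlc_llm module despite pip reporting success.
import platform

arch = platform.machine()
print(f"CPU architecture: {arch}")
if arch == "aarch64":
    print("ARM device (e.g. Jetson): x86_64 wheels will not match this platform.")
else:
    print("x86_64 host: the prebuilt wheels should be usable here.")
```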

Hi,

Usually, the prebuilt Linux packages are built for x86 devices, which won’t run on the Jetson’s ARM (aarch64) CPU.
Could you help verify this?

You could try building it from source to get a Jetson-compatible package:

https://llm.mlc.ai/docs/install/mlc_llm#option-2-build-from-source
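The linked page walks through the build; a rough sketch of that flow (check the docs for the current, authoritative steps, and note `gen_cmake_config.py` prompts interactively, where you should enable CUDA):

```shell
# Clone mlc-llm with its submodules (TVM etc. are vendored recursively).
git clone --recursive https://github.com/mlc-ai/mlc-llm.git
cd mlc-llm

# Generate the build configuration (answer "yes" when asked about CUDA),
# then compile the native libraries.
mkdir -p build && cd build
python ../cmake/gen_cmake_config.py
cmake .. && cmake --build . --parallel "$(nproc)"
cd ..

# Install the Python package from the source tree into the active env.
cd python && pip install -e .
```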

Thanks.

Thanks for your reply. I successfully built it from source, but when I tried to run a demo, the engine failed to find a CUDA device.
I installed PyTorch and ran torch.cuda.is_available() to verify, but it returned False.
Could this be caused by installing CUDA 12.2 on top of JetPack 5?
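For context, a small diagnostic (hypothetical, assuming PyTorch is installed) that shows which CUDA toolkit the torch wheel was compiled against; a wheel built for a different CUDA version than the one on the board is a common reason torch.cuda.is_available() returns False:

```python
# Report the CUDA version the installed torch wheel was built with and
# whether a CUDA device is visible at runtime. A build-time/runtime CUDA
# mismatch (e.g. CUDA 11.4 wheel on a CUDA 12.2 system) commonly yields
# is_available() == False even when a GPU is present.
import importlib.util

if importlib.util.find_spec("torch") is None:
    print("torch is not installed")
else:
    import torch
    print("torch version:   ", torch.__version__)
    print("built with CUDA: ", torch.version.cuda)  # None for CPU-only builds
    print("cuda available:  ", torch.cuda.is_available())
```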

Hi,

If you use this package, it’s expected to work.
But note that CUDA-upgrade support on JetPack 5 is limited: there is no cuDNN/TensorRT package for CUDA 12 + JetPack 5.

For PyTorch, you will also need to build it from source, since the default package we shared here is built with CUDA 11.4.

Thanks.
