Running an LLM on the NVIDIA DRIVE AGX Orin Developer Kit

Please provide the following info (tick the boxes after creating this topic):
Software Version
[*] DRIVE OS 6.0.8.1
DRIVE OS 6.0.6
DRIVE OS 6.0.5
DRIVE OS 6.0.4 (rev. 1)
DRIVE OS 6.0.4 SDK
other

Target Operating System
[*] Linux
QNX
other

Hardware Platform
DRIVE AGX Orin Developer Kit (940-63710-0010-300)
DRIVE AGX Orin Developer Kit (940-63710-0010-200)
[*] DRIVE AGX Orin Developer Kit (940-63710-0010-100)
DRIVE AGX Orin Developer Kit (940-63710-0010-D00)
DRIVE AGX Orin Developer Kit (940-63710-0010-C00)
DRIVE AGX Orin Developer Kit (not sure of its number)
other

SDK Manager Version
[*] 1.9.3.10904
other

Host Machine Version
native Ubuntu Linux 20.04 Host installed with SDK Manager
[*] native Ubuntu Linux 20.04 Host installed with DRIVE OS Docker Containers
native Ubuntu Linux 18.04 Host installed with DRIVE OS Docker Containers
other

Is running an LLM locally on the NVIDIA DRIVE AGX Orin Developer Kit officially supported by NVIDIA?

If not, are there any other approaches that could help run an LLM locally on the NVIDIA DRIVE AGX Orin Developer Kit?

Dear @crathin,
Please see if Advancing Automotive AI With Large Language Models and Vision Language Models on NVIDIA DRIVE | NVIDIA On-Demand is helpful.


Hi @SivaRamaKrishnaNV, we went through the link you provided, and the "Supported Hardware, Models, and other Software" page in the GitHub repo states that "TensorRT-LLM requires Linux x86_64 or Windows." However, the NVIDIA DRIVE AGX Orin board has an aarch64 CPU architecture and OS. Is there any way to support aarch64?
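For reference, the architecture mismatch described above can be confirmed directly on the target. This is a minimal sketch assuming a standard Linux shell on the board (DRIVE OS Linux):

```shell
# Print the kernel's machine architecture.
# On DRIVE AGX Orin this reports "aarch64", whereas the TensorRT-LLM
# releases discussed above target "x86_64" Linux hosts.
uname -m
```

If the output is `aarch64`, prebuilt x86_64-only packages will not run on the board.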

Dear @crathin,
I have not tested building TensorRT-LLM on DRIVE Orin. Let me check and get back to you. Thanks.

Dear @crathin,
We don’t have a TensorRT-LLM release available on DevZone.


This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.