No existing base libraries for Python/CUDA?

Heyho, I'm using the NVIDIA Jetson Nano with JetPack 4.6.

My goal was simple: running a small LLM (around 2.3B parameters) with CUDA, behind an OpenAI-like REST API.

But I run from issue to issue: I can't install LangChain, PyTorch with CUDA support, Transformers, etc.

Now my question is: why is it so hard to get anything running? The Jetson is ARM, like the Raspberry Pi, and on the Raspberry everything works fine, just without CUDA.

I run into compilation errors over and over, or the system tells me the architecture is not supported, or that no CUDA card was found.

So the primary question is: how did you guys learn to get this stuff running? Or do you have the same issues?

I've been trying for around four weeks now, day after day, tutorial after tutorial, to get something useful running.

The last happy moment was getting llama.cpp running after days, but on CPU only.
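Since llama.cpp is already working: its bundled HTTP server (`llama-server`; older builds name the binary `server`) exposes an OpenAI-compatible `/v1/chat/completions` route, which covers the REST-API part of the goal even in CPU-only mode. A minimal sketch, assuming the server is listening on its default port 8080 (host/port and `max_tokens` here are assumptions, not from the post):

```python
import json
import urllib.request

# Assumption: llama-server is running locally on its default port, e.g.
#   ./llama-server -m model.gguf --port 8080
LLAMA_SERVER = "http://127.0.0.1:8080"

def build_chat_request(prompt, max_tokens=64):
    """Build an OpenAI-style chat-completions payload for llama.cpp's server."""
    return {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

def chat(prompt):
    """POST the payload to the OpenAI-compatible route and return the reply text."""
    body = json.dumps(build_chat_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        LLAMA_SERVER + "/v1/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    # Response follows the OpenAI chat-completions shape.
    return data["choices"][0]["message"]["content"]

# Usage (with llama-server running): print(chat("Say hello in one sentence."))
```

This keeps the whole stack to the standard library, so it avoids the ARM wheel problems with LangChain/PyTorch entirely; any OpenAI client library pointed at that base URL should work the same way.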

Dragony ^^


Have you checked with the OpenAI team to see if they support the Jetson environment?
If not, please file a feature request for this first.

We also have a tutorial for a small LLM which can run on the Orin Nano: