Jetson Container text-generation-webui not loading models

I’m developing an app that uses LLM endpoints to request data. I’m trying to use the text-generation-webui jetson container just as described here:

https://www.jetson-ai-lab.com/tutorial_text-generation.html

Unfortunately I cannot load any model, no matter which model or which model loader I try. See image:

I’ve been working successfully with the stable-diffusion-webui container and I would like to be able to do the same with text-generation-webui.

The specs of my Jetson AGX Orin are:

Any help would be appreciated.

Hi @esteban.gallardo, it looks like you are trying to load a GPTQ model with the llama.cpp loader - llama.cpp only works with GGUF models. IIRC, text-generation-webui expects the .gguf model file to be saved under its model directory (/data/models/text-generation-webui).
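For reference, here is a minimal sketch of downloading a GGUF build into that directory with the huggingface_hub Python package. The repo_id and filename below are just placeholders for whatever model you want, and the local_dir assumes the default /data mount from the jetson-containers tutorial:

```python
from huggingface_hub import hf_hub_download

# Rough sketch: fetch a quantized GGUF file into the directory that
# text-generation-webui scans for models. Swap repo_id/filename for the
# model you actually want; local_dir assumes the default jetson-containers
# data mount.
hf_hub_download(
    repo_id="TheBloke/Llama-2-7B-Chat-GGUF",        # placeholder repo
    filename="llama-2-7b-chat.Q4_K_M.gguf",         # placeholder quant file
    local_dir="/data/models/text-generation-webui",
)
```

Once the file is in place, it should show up in the Model tab's dropdown after refreshing the model list.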

By the way, for the llama.cpp loader you should increase the n-gpu-layers setting (typically to the maximum), otherwise it will not use the GPU.
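IIRC the llama.cpp loader wraps llama-cpp-python, so the same parameter can be sanity-checked outside the webui. A quick sketch (the model path is an assumption, adjust it to the file you downloaded):

```python
from llama_cpp import Llama

# Rough sketch: load the GGUF with all layers offloaded to the GPU.
# n_gpu_layers=-1 means "offload every layer"; leaving it at the default 0
# keeps the whole model on the CPU.
llm = Llama(
    model_path="/data/models/text-generation-webui/llama-2-7b-chat.Q4_K_M.gguf",
    n_gpu_layers=-1,
)

# Simple completion call to confirm the model loaded and runs.
out = llm("Q: What is the capital of France? A:", max_tokens=16)
print(out["choices"][0]["text"])
```

If this loads and you can see GPU memory being used (e.g. in jtop), the same model file and n-gpu-layers value should work from the webui.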

Unfortunately, that didn’t work.

This discussion continues at: Enabling API for jetson container AudioCraft - Jetson & Embedded Systems / Jetson AGX Orin - NVIDIA Developer Forums
