Unable to use local model in VS Code with the Continue extension

Hello - new here and just started using the system - it's amazing.

Anyway, I'm getting the error shown in the attached screenshot when I try to query the local Ollama model that I downloaded and configured.

See the second screenshot attached as well. This is where you choose your model, either local or public.

I know my model is working because I can query it successfully through the OpenWebUI interface.
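In case it helps with triage: Continue normally reads its local-model settings from `~/.continue/config.json`, and a minimal entry for an Ollama model looks roughly like the sketch below. The model name and `apiBase` here are placeholder examples, not necessarily my actual setup.

```json
{
  "models": [
    {
      "title": "Ollama (local)",
      "provider": "ollama",
      "model": "llama3",
      "apiBase": "http://localhost:11434"
    }
  ]
}
```

If OpenWebUI can reach the model but Continue cannot, a mismatch in the model name or the `apiBase` URL is a common culprit.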

Thanks for any help you might have.

Hi, @keithcp

Sorry for the issue you've run into.
Are you using the "Continue" extension? This forum supports NVIDIA developer tools, so your topic seems to be out of our scope.

Thank you - which forum would you suggest? Isn't this a tool that developers might use?

Maybe you can ask on GitHub.