Text to Knowledge Graph - Ollama issues

I’ve installed the Text to Knowledge Graph function on my DGX Spark and it’s not connecting to anything. When I check the container status, I see that the ollama container’s status is ‘unhealthy’. A check of the logs shows the following error repeating:

  "Start": "2025-11-05T09:17:09.001471624-05:00",

  "End": "2025-11-05T09:17:09.048296913-05:00",

  "ExitCode": -1,

  "Output": "OCI runtime exec failed: exec failed: unable to start container process: exec: \\"curl\\": executable file not found in $PATH: unknown"

Please assist. I’m spending much more time troubleshooting than expected, at almost every step with this system, and it’s getting frustrating.

I fixed this error by installing curl in the ollama container. In progress…
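For anyone hitting the same error: installing curl with `docker exec` works until the container is recreated, at which point the change is lost. One way to make it stick is a small derived image — a sketch only, since the base image tag and the package manager are assumptions (the stock Ollama images are Ubuntu-based):

```dockerfile
# Sketch: extend the ollama base image so curl is baked in and survives
# container recreation. The FROM tag is an assumption; match whatever
# your compose file's ollama-custom image was actually built from.
FROM ollama/ollama:latest
RUN apt-get update \
    && apt-get install -y --no-install-recommends curl \
    && rm -rf /var/lib/apt/lists/*
```

Alternatively, the compose healthcheck could be pointed at something already in the image (for example `ollama list`, which talks to the local server), so curl isn’t needed at all.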

Hi, what do you mean when you say “not connecting to anything”? Does the container come up and you can access http://localhost:3001?

All three containers are running:

CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES

compose-app “docker-entrypoint.s…” 24 hours ago Up 16 hours 0.0.0.0:3001->3000/tcp, [::]:3001->3000/tcp compose-app-1

arangodb:latest “/entrypoint.sh aran…” 24 hours ago Up 16 hours 0.0.0.0:8529->8529/tcp, [::]:8529->8529/tcp compose-arangodb-1

ollama-custom:latest “/entrypoint.sh” 24 hours ago Up 16 hours (healthy) 0.0.0.0:11434->11434/tcp, [::]:11434->11434/tcp ollama-compose

I can connect to my Open-WebUI instance and the AI Workstation using localhost:[port], but not these. Strangely enough, netstat on the GB10 shows that it is listening on those ports:

Active Internet connections (only servers)

Proto Recv-Q Send-Q Local Address Foreign Address State

tcp 0 0 127.0.0.1:631 0.0.0.0:* LISTEN

tcp 0 0 127.0.0.1:10000 0.0.0.0:* LISTEN

tcp 0 0 127.0.0.1:10001 0.0.0.0:* LISTEN

tcp 0 0 127.0.0.1:11002 0.0.0.0:* LISTEN

tcp 0 0 127.0.0.1:11000 0.0.0.0:* LISTEN

tcp 0 0 0.0.0.0:3001 0.0.0.0:* LISTEN

tcp 0 0 0.0.0.0:22 0.0.0.0:* LISTEN

tcp 0 0 0.0.0.0:11434 0.0.0.0:* LISTEN

tcp 0 0 0.0.0.0:12000 0.0.0.0:* LISTEN

tcp 0 0 0.0.0.0:8529 0.0.0.0:* LISTEN

I’m not sure of the next step.
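One thing that netstat output does confirm: a service bound to 127.0.0.1 is reachable only from the Spark itself, while 0.0.0.0 accepts remote connections. Ports 3001, 11434, and 8529 are all on 0.0.0.0 here, so the container bindings themselves look fine. A quick way to check this from the listing — a sketch where the netstat lines are pasted into a here-doc for illustration; on the Spark you would pipe `netstat -tln` in directly:

```shell
# Classify the app ports (3001, 11434, 8529) by bind address:
# 0.0.0.0 accepts remote connections, 127.0.0.1 is loopback-only.
awk '$4 ~ /:(3001|11434|8529)$/ {
  split($4, a, ":")
  print a[2], (a[1] == "0.0.0.0" ? "remote-reachable" : "loopback-only")
}' <<'EOF'
tcp 0 0 127.0.0.1:631 0.0.0.0:* LISTEN
tcp 0 0 0.0.0.0:3001 0.0.0.0:* LISTEN
tcp 0 0 0.0.0.0:11434 0.0.0.0:* LISTEN
tcp 0 0 0.0.0.0:8529 0.0.0.0:* LISTEN
EOF
```

Since these bindings are open, a remote “connection refused” points at the path between the Mac and the Spark (NV Sync) rather than at the containers.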

Are you trying to access it locally through Spark or through NV Sync? What error code do you see? 404?
Have you already downloaded a model? docker exec ollama-compose ollama pull <model-name>

I just tested it after a reboot. When I’m on the Spark itself and running the startup script, it works locally, but not remotely from my Mac, where I’ve installed NV Sync. I’ve gotten Open-WebUI and other apps to work through the browser and NV Sync, so I’m sure there’s a configuration issue in there somewhere; whatever help you can give would be greatly appreciated.

Oh, the message is ‘connection refused’ on the Mac, in both Chrome and Safari. It’s not a browser configuration issue as far as I can tell, since the other apps work.
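For what it’s worth, “connection refused” specifically means the TCP connection reached a machine but nothing was listening on that port (for example, a tunnel endpoint that isn’t up); a firewall drop usually surfaces as a timeout instead. A minimal probe that distinguishes the two without curl — a sketch using bash’s /dev/tcp pseudo-device, with host and port as placeholders:

```shell
# Probe a TCP port using bash's /dev/tcp (no curl required).
# Prints "open" if something accepts the connection within 2 seconds,
# otherwise "closed-or-unreachable" (covers both refusal and timeout).
probe() {
  if timeout 2 bash -c "cat < /dev/null > /dev/tcp/$1/$2" 2>/dev/null; then
    echo "$1:$2 open"
  else
    echo "$1:$2 closed-or-unreachable"
  fi
}

# Example with a placeholder port; substitute the Spark's address and 3001.
probe 127.0.0.1 1
```

Running this from the Mac against the NV Sync endpoint would show whether the tunnel is accepting connections at all.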

I did download a model.

Let me know what I can do to assist you assisting me. :-)

To help you better, can you provide these things:

  1. An image of your custom app settings in NV Sync (example below)
  2. Your NV Sync log after you attempt to connect to the endpoint through NV Sync (found in /Users/<username>/Library/Application Support/NVIDIA/Sync/logs)

And that solved it: just like with Open-WebUI, I had to launch it from the terminal for it to work, but once I did, it worked fine. Thanks for all your help.