Txt2kg Playbook: ./start.sh --complete does not start Additional Services (Complete Stack)

The following services will not start:

Additional Services (Complete Stack):
• Local Pinecone: http://localhost:5081
• Sentence Transformers: http://localhost:8000
• vLLM API: http://localhost:8001

remzi@sparkai:~/dgx-spark-playbooks/nvidia/txt2kg/assets$ ./start.sh --complete
Checking for GPU support…
✓ NVIDIA GPU detected
GPU: NVIDIA GB10, [N/A]
Using Docker Compose V2
Using complete stack (Ollama, vLLM, Pinecone, Sentence Transformers)…

Starting services…
Running: docker compose -f /home/remzi/dgx-spark-playbooks/nvidia/txt2kg/assets/deploy/compose/docker-compose.complete.yml up -d
[+] Running 7/7
✔ Container ollama-compose Running 0.0s
✔ Container vllm-service Started 0.0s
✔ Container compose-arangodb-1 Started 0.1s
✔ Container entity-embeddings Started 0.0s
✔ Container compose-sentence-transformers-1 Started 0.1s
✔ Container compose-arangodb-init-1 Started 0.1s
✔ Container compose-app-1 Started 0.2s

==========================================

txt2kg is now running!

Core Services:
• Web UI: http://localhost:3001
• ArangoDB: http://localhost:8529
• Ollama API: http://localhost:11434

Additional Services (Complete Stack):
• Local Pinecone: http://localhost:5081
• Sentence Transformers: http://localhost:8000
• vLLM API: http://localhost:8001

Next steps:

  1. Pull an Ollama model (if not already done):
    docker exec ollama-compose ollama pull llama3.1:8b

  2. Open http://localhost:3001 in your browser

  3. Upload documents and start building your knowledge graph!

Other options:
• Run frontend in dev mode: ./start.sh --dev-frontend
• Use complete stack: ./start.sh --complete
• View logs: docker compose logs -f

If you run docker ps on the command line, can you see the containers running?

remzi@sparkai:~/dgx-spark-playbooks/dgx-spark-playbooks/nvidia/txt2kg/assets$ docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
d513d9ffa0ea compose-app "docker-entrypoint.s…" 6 hours ago Up 3 hours 0.0.0.0:3001->3000/tcp, [::]:3001->3000/tcp compose-app-1
e398d0a7150b compose-vllm "/opt/nvidia/nvidia_…" 6 hours ago Restarting (1) 1 second ago vllm-service
37567d7e61a1 compose-sentence-transformers "gunicorn --bind 0.0…" 6 hours ago Up 3 hours 0.0.0.0:8000->80/tcp, [::]:8000->80/tcp compose-sentence-transformers-1
209d2704645c pinecone-index "/engine" 6 hours ago Restarting (255) 8 seconds ago entity-embeddings
47af05ab9967 ollama-custom:latest "/entrypoint.sh" 5 days ago Up 3 hours (unhealthy) 0.0.0.0:11434->11434/tcp, [::]:11434->11434/tcp ollama-compose
8b17c523f2c4 arangodb:latest "/entrypoint.sh aran…" 5 days ago Up 3 hours 0.0.0.0:8529->8529/tcp, [::]:8529->8529/tcp compose-arangodb-1

That is all I have running.

The complete stack is currently not supported, but it is planned for a future release.

Thank you

I'd better stop them; they are continuously restarting:

209d2704645c pinecone-index "/engine" 6 hours ago Restarting (255) 8 seconds ago

e398d0a7150b compose-vllm "/opt/nvidia/nvidia_…" 6 hours ago Restarting (1) 1 second ago
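A quick way to pick out the restart-looping containers is to filter the `docker ps` status column. A minimal sketch, using sample rows copied from the output above; on a live system, pipe `docker ps --format '{{.Names}}\t{{.Status}}'` in place of the sample:

```shell
#!/bin/sh
# Sample rows lifted from the thread; replace with live `docker ps` output.
sample='vllm-service        Restarting (1) 1 second ago
entity-embeddings   Restarting (255) 8 seconds ago
compose-app-1       Up 3 hours'

# Names of containers stuck in a restart loop (first whitespace-delimited field)
names=$(printf '%s\n' "$sample" | awk '/Restarting/ {print $1}')
printf '%s\n' "$names"

# How many are looping
looping=$(printf '%s\n' "$sample" | grep -c 'Restarting')
echo "restart-looping containers: $looping"
```

Once identified, `docker logs --tail 50 <name>` usually shows why a container keeps exiting, and `docker stop <name>` halts the loop.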

What about using Pinecone DB on AWS? Thank you.

Yes, that should work.

Hi @cirit, did you get Pinecone to play nice? I just ran the startup; most of it comes up, but Pinecone keeps rebooting. The most I have found so far is "exec /engine: exec format error", so it appears the Pinecone container is not built for ARM. It would be weird to have it in the DGX/ARM playbook, right?
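"exec format error" is the classic symptom of an amd64-only image on an ARM host. A small sketch to confirm the mismatch; the `docker image inspect` line is left commented because the exact failing image name varies, so substitute your own:

```shell
#!/bin/sh
# Host architecture: DGX Spark reports aarch64 (ARM64).
host=$(uname -m)
echo "host architecture: $host"

# For the failing image, compare against its build architecture, e.g.:
#   docker image inspect --format '{{.Os}}/{{.Architecture}}' <image-name>
# An image reporting linux/amd64 cannot exec natively on an aarch64 host,
# which is exactly what produces "exec /engine: exec format error".
```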

Please post a reply if you have a fix for the reboot loop.

No luck; even though all the values are correct, it still fails.

OK, thanks. Never mind; there just is no Pinecone Docker image for ARM, and no progress.

The txt2kg playbook is not compatible with DGX; a shame it is in the DGX playbooks repo. Maybe it can work with another vector database; I will look into it.

Yes, and a week ago NVIDIA put out a YouTube video showing it off. I wonder how they did it?

Hi! Creator of txt2kg here. I was waiting to get the latest code pushed and it went out last week. Please try it again.

@nvidia3869 is correct that Pinecone is not supported on ARM so I switched to Qdrant for the vector database. Please feel free to create a github issue and I’ll follow up on it.
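For anyone wiring this up by hand before the repo settles, a Qdrant service in Compose is small. A hypothetical sketch only; the service name, volume name, and port mappings below are assumptions based on Qdrant's documented defaults, not the playbook's actual file:

```yaml
# Hypothetical qdrant service for docker-compose.yml (ports are Qdrant defaults).
services:
  qdrant:
    image: qdrant/qdrant:latest   # multi-arch image, includes ARM64 builds
    ports:
      - "6333:6333"   # REST API
      - "6334:6334"   # gRPC
    volumes:
      - qdrant_data:/qdrant/storage

volumes:
  qdrant_data: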

I can create draft PRs for the fixes and then we will periodically update the official repo.

Thank you for the update. I am getting the following error:

I do not see an option to configure the Qdrant server.

The default docker-compose.yml sets it up for you. Is the Qdrant container running? If not, ./start.sh should start it for you.

Here is what docker ps shows:

It worked after restarting a second time with ./start.sh instead of ./start.sh --complete.

It seems ./start.sh --complete is needed.

./start.sh does not start the database.

./start.sh

./start.sh --complete

Hi team

back to the same problem:

Embeddings Generation Failed

Failed to generate embeddings: Failed to generate embeddings: {"error":"Qdrant server is not available. Please make sure it is running."}

I do not see steps for how to run the Qdrant server correctly.

Thank you
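For what it's worth, the "Qdrant server is not available" error can be probed directly. A minimal sketch, assuming Qdrant's default REST port 6333; adjust `QDRANT_URL` if your compose file maps the port elsewhere:

```shell
#!/bin/sh
# Probe the Qdrant REST endpoint; /healthz is Qdrant's liveness route.
QDRANT_URL="${QDRANT_URL:-http://localhost:6333}"

if command -v curl >/dev/null && curl -sf "$QDRANT_URL/healthz" >/dev/null; then
  status=up
else
  status=down
fi
echo "Qdrant at $QDRANT_URL is $status"
```

If it reports down, `docker ps --filter name=qdrant` tells you whether the container exists at all.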

I will check functionality today. Thanks for modifying docker-compose.yml.

Please also correct:

  1. docker-compose.complete.yml
  2. docker-compose.optional.yml
  3. README instances mentioning Pinecone

It does not make sense to modify only docker-compose.yml and not the descriptions in the accompanying README.md files.

Reusing existing playbooks is fine with me; I just wish you would test them against DGX Spark before publishing them in the DGX Spark playbooks.

Please do remember that the DGX Spark playbooks are a selling point for DGX Spark and are referred to as such by NVIDIA marketing. Those playbooks should therefore be accurate and complete, unless you want to discourage prospective and actual owners of DGX Spark.

Alas… the ollama-custom:latest Docker image is not available (updated 18 hrs ago), but it should work with a previously built version.

In case anyone would like to try the bleeding edge:

  • modify the compose file to say ollama/ollama:latest instead of ollama-custom:latest
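That swap is a one-line sed. Demonstrated here on a scratch file so it is safe to try; to apply it for real, point `compose_file` at your actual compose file (the path varies by checkout):

```shell
#!/bin/sh
# Demo on a temporary file; set compose_file to your real compose file to apply.
compose_file=$(mktemp)
printf '    image: ollama-custom:latest\n' > "$compose_file"

# Replace the custom image reference with the upstream Ollama image.
sed -i 's|ollama-custom:latest|ollama/ollama:latest|g' "$compose_file"

result=$(cat "$compose_file")
echo "$result"
rm -f "$compose_file"
```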

Sorry to say, but the txt2kg playbook has become a bit of a mess:

  1. pinecone and qdrant need not both be in docker-compose.yml
    1. qdrant is neither pulled nor built by start.sh
  2. qdrant is not in docker-compose.complete.yml
    1. qdrant is neither pulled nor built by start.sh --complete
  3. start.sh does not start qdrant, as it is optional
  4. start.sh --complete does not start qdrant, as it was not built
  5. the Dockerfile includes pinecone, not qdrant

So setup fails. This needs a bit more work; I hope someone feels responsible enough to put it in.

I'd suggest taking the playbook offline, correcting the errors, and testing functionality before putting it online again. I've been through one too many "fixed it, please try and let me know" loops in my life, thanks.
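A small sketch of the kind of sanity check the issue list above implies: confirm that a `qdrant` service actually appears in each compose file before trusting `start.sh`. The file paths are taken from the thread; run it from the `assets` directory (missing files are reported rather than skipped):

```shell
#!/bin/sh
# Check each compose file mentioned in the thread for a qdrant service.
missing=0
for f in deploy/compose/docker-compose.yml \
         deploy/compose/docker-compose.complete.yml \
         deploy/compose/docker-compose.optional.yml; do
  if [ -f "$f" ] && grep -q 'qdrant' "$f"; then
    echo "$f: defines qdrant"
  else
    echo "$f: qdrant missing (or file not found)"
    missing=$((missing + 1))
  fi
done
echo "files lacking qdrant: $missing"
```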

Confirming these playbooks still do not function.