Hello! This is the first announcement about new DGX Spark playbooks. We are working on many more, so stay tuned!
Today we launched a new playbook for DGX Spark:
- SGLang Inference Server: Installing and using SGLang to serve the DeepSeek-V2-Lite model on DGX Spark
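If you want a quick taste before opening the playbook, here is a minimal sketch of what serving looks like. The model path, port, and prompt below are only illustrative assumptions (SGLang's default port is assumed); the playbook has the authoritative DGX Spark steps.

```python
# Rough sketch only; see the SGLang playbook for the DGX Spark-specific setup.
# Launch the server first (illustrative command, default port assumed):
#   python -m sglang.launch_server --model-path deepseek-ai/DeepSeek-V2-Lite \
#       --trust-remote-code --port 30000
from openai import OpenAI

# SGLang exposes an OpenAI-compatible API; point the client at the local server.
client = OpenAI(base_url="http://localhost:30000/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="deepseek-ai/DeepSeek-V2-Lite",
    messages=[{"role": "user", "content": "Say hello from DGX Spark."}],
)
print(response.choices[0].message.content)
```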
In case you haven’t checked the playbook site recently, we launched two new playbooks in late October:
- CUDA-X Data Science: Install and use cuML and cuDF to accelerate UMAP, HDBSCAN, pandas, and more (quick sketch after this list)
- Vibe Coding in VSCode: Use DGX Spark as a local or remote Vibe Coding assistant with Ollama and Continue
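As a rough illustration of the CUDA-X Data Science workflow, here is a small GPU-only sketch. The synthetic data, column names, and parameter values are placeholder assumptions; the playbook covers installation and realistic datasets.

```python
# Minimal sketch of GPU-accelerated data science with cuDF + cuML.
# Synthetic data and parameters are placeholders; see the playbook for setup.
import cupy as cp
import cudf
from cuml.manifold import UMAP
from cuml.cluster import HDBSCAN

# Build a small random dataset directly on the GPU.
df = cudf.DataFrame({f"feature_{i}": cp.random.random(1000) for i in range(16)})

# Reduce to 2D with GPU UMAP, then cluster the embedding with GPU HDBSCAN.
embedding = UMAP(n_components=2).fit_transform(df)
labels = HDBSCAN(min_cluster_size=25).fit_predict(embedding)

# Show the largest clusters found.
print(labels.to_pandas().value_counts().head())
```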
Also, we recently removed the Vision-Language Model Fine-tuning playbook for some rework. We will relaunch and announce it at a later date.
If you have any feedback about any of these playbooks, please continue posting in the forum or email spark-playbook-feedback@nvidia.com.
Hope you all are having lots of fun developing on DGX Spark!