Sage Attention in ComfyUI

I installed ComfyUI using the Playbook and it has worked well so far, but I am having difficulty getting sage-attention to install. I’m using the command (in the ComfyUI venv):

pip install git+https://github.com/thu-ml/SageAttention

but this fails complaining it can’t find torch. It’s there, version is 2.8.0+cu129. CUDA is 13.0.

Anyone have the magic incantation to get sage-attention to install and work on DGX Spark?

Can you share more details on the error you get? A log would be appreciated

Here’s the output:

pip install git+https://github.com/thu-ml/SageAttention
Collecting git+https://github.com/thu-ml/SageAttention
Cloning https://github.com/thu-ml/SageAttention to /tmp/pip-req-build-a6pechf0
Running command git clone --filter=blob:none --quiet https://github.com/thu-ml/SageAttention /tmp/pip-req-build-a6pechf0
Resolved https://github.com/thu-ml/SageAttention to commit 0f9da83e6038f8330c195cc4bda7f9008a42f679
Installing build dependencies … done
Getting requirements to build wheel … error
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> [18 lines of output]
Traceback (most recent call last):
File "/home/jgrasty/ai/comfyui-env/lib/python3.12/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
main()
File "/home/jgrasty/ai/comfyui-env/lib/python3.12/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/jgrasty/ai/comfyui-env/lib/python3.12/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 118, in get_requires_for_build_wheel
return hook(config_settings)
^^^^^^^^^^^^^^^^^^^^^
File "/tmp/pip-build-env-6fw9guvs/overlay/lib/python3.12/site-packages/setuptools/build_meta.py", line 332, in get_requires_for_build_wheel
return self._get_build_requires(config_settings, requirements=[])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/tmp/pip-build-env-6fw9guvs/overlay/lib/python3.12/site-packages/setuptools/build_meta.py", line 302, in _get_build_requires
self.run_setup()
File "/tmp/pip-build-env-6fw9guvs/overlay/lib/python3.12/site-packages/setuptools/build_meta.py", line 318, in run_setup
exec(code, locals())
File "<string>", line 36, in <module>
ModuleNotFoundError: No module named 'torch'
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.
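The `ModuleNotFoundError: No module named 'torch'` in this log comes from pip's build isolation: SageAttention's `setup.py` imports torch at build time, but pip runs the build in a clean temporary environment that does not see the venv's packages. A common workaround (a sketch, not a guarantee that the later compile steps will succeed on DGX Spark) is to disable build isolation so the build uses the torch already installed in the venv:

```shell
# Run inside the ComfyUI venv, where torch 2.8.0+cu129 is already installed.
# --no-build-isolation makes pip build against the venv's packages instead of
# a clean temporary environment, so setup.py can import torch.
pip install --no-build-isolation git+https://github.com/thu-ml/SageAttention
```

With isolation disabled, the venv must also provide the build tools that the isolated environment would normally supply (e.g. `setuptools`, `wheel`, and possibly `ninja`), so those may need to be installed first.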

You can confirm torch is installed by running these commands inside the virtualenv:
pip show torch
python -c "import torch; print(torch.__version__)"

pip show torch
python -c "import torch; print(torch.__version__)"
Name: torch
Version: 2.8.0+cu129
Summary: Tensors and Dynamic neural networks in Python with strong GPU acceleration
Home-page: https://pytorch.org/
Author: PyTorch Team
Author-email: packages@pytorch.org
License: BSD-3-Clause
Location: /home/jgrasty/ai/comfyui-env/lib/python3.12/site-packages
Requires: filelock, fsspec, jinja2, networkx, setuptools, sympy, triton, typing-extensions
Required-by: accelerate, clip-interrogator, facexlib, fairscale, kornia, open_clip_torch, peft, pixeloe, pytorch-lightning, SAM-2, spandrel, timm, torchaudio, torchmetrics, torchscale, torchsde, torchvision, transparent-background, ultralytics, ultralytics-thop
2.8.0+cu129

This looks like an issue on the Sage Attention side, so I recommend getting help from that team.

Thanks, I’ll do that.

It appears sage-attention is not yet compatible with CUDA 13.0, so I'll check back with the project in a week or so.

Would using CUDA 12.9 or 12.8 work? I am also stuck here. I have tried both 12.9 and 12.8 and cannot get Sage Attention to work on DGX Spark, but I probably lack the technical skills to know how to get it working.

I, too, lack the skills to figure out the magic incantation to get all the sage-attention pieces to fit.
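If anyone does get a build to complete, a quick sanity check (assuming the ComfyUI venv is active; `--use-sage-attention` is ComfyUI's switch for it, but confirm the flag name against your ComfyUI version):

```shell
# Check whether the sageattention package is importable in this venv:
python -c "import importlib.util; print('installed' if importlib.util.find_spec('sageattention') else 'not installed')"
# If it reports installed, launch ComfyUI with SageAttention enabled:
# python main.py --use-sage-attention
```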