I keep failing to install Flash Attention 3 in the LTX-2 UV environment

copying /tmp/tmpk4223xzz.build-lib/flash_attn_3/_C.abi3.so → flash_attn_3

  [stderr]
  /home/xiaohong/LTX-2/.venv/lib/python3.12/site-packages/setuptools/dist.py:332:
  InformationOnly: Normalizing '3.0.0.b1' to '3.0.0b1'
    self.metadata.version = self._normalize_version(self.metadata.version)
  /home/xiaohong/LTX-2/.venv/lib/python3.12/site-packages/setuptools/dist.py:759:
  SetuptoolsDeprecationWarning: License classifiers are deprecated.
  !!

  
  ********************************************************************************
          Please consider removing the following classifiers in favor of
  a SPDX license expression:

          License :: OSI Approved :: Apache Software License

          See
  https://packaging.python.org/en/latest/guides/writing-pyproject-toml/#license
  for details.
  
  ********************************************************************************

  !!
    self._finalize_license_expression()
  W0114 07:08:01.022000 1591534 torch/utils/cpp_extension.py:531] There
  are no /usr/bin/g++-11 version bounds defined for CUDA version 13.0
  error: could not create 'flash_attn_3/_C.abi3.so': No such file or
  directory

  hint: This usually indicates a problem with the package or the build
  environment.

DEBUG Released lock at /home/xiaohong/LTX-2/.venv/.lock
DEBUG Released lock at /home/xiaohong/.cache/uv/.lock

Can you provide more details so we can attempt to reproduce this issue? What scripts or commands did you run?


export CUDA_HOME=/usr/local/cuda
export PATH=$CUDA_HOME/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/lib:$CUDA_HOME/lib:$CUDA_HOME/lib64:$LD_LIBRARY_PATH
export CC=/usr/bin/gcc-11
export CXX=/usr/bin/g++-11
export TORCH_CUDA_ARCH_LIST="12.1"
MAX_JOBS=12 uv pip install -e . --verbose --no-build-isolation
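Before kicking off a long build with those exports, it can save time to confirm the toolchain paths actually resolve. A quick sanity-check sketch, assuming the same paths as the exports above (gcc-11, g++-11, and CUDA under /usr/local/cuda); adjust them for your system:

```shell
# Sketch: verify each toolchain path from the exports above exists and is
# executable before starting the build (paths are assumptions from this thread).
for tool in /usr/bin/gcc-11 /usr/bin/g++-11 /usr/local/cuda/bin/nvcc; do
  if [ -x "$tool" ]; then
    echo "ok: $tool"
  else
    echo "missing: $tool"
  fi
done
```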

Hello Hongde,

What version of PyTorch are you using/are you using the PyTorch NGC container?

Hello again,

I was also able to reproduce your error, and the fix was to create the missing output directory before running your commands.

For me, I was working out of an NGC PyTorch container with /workspace/flash-attention/hopper (from the flash-attention repo) as my working directory. I ran mkdir -p flash_attn_3 and it worked for me!

Please try that fix and let me know if you have any other questions or concerns!
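For anyone wondering why the mkdir helps: the build fails while copying the compiled _C.abi3.so into a flash_attn_3/ directory that doesn't exist yet, so pre-creating it lets the copy succeed. A generic illustration of the failure and fix (not the actual flash-attention build, just a sketch of the copy step):

```shell
# Generic illustration (not the real build): copying into a missing
# directory fails with "No such file or directory"; mkdir -p fixes it.
tmp=$(mktemp -d)
cd "$tmp"
touch _C.abi3.so                    # stand-in for the built extension
cp _C.abi3.so flash_attn_3/ 2>/dev/null \
  && echo "copy succeeded" \
  || echo "copy failed: flash_attn_3/ does not exist"
mkdir -p flash_attn_3               # the workaround from this thread
cp _C.abi3.so flash_attn_3/ && echo "copy succeeded after mkdir"
```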

Thank you!
(LTX-2) root@c0e2a06cb538:/workspace/flash-attention/hopper# pip show torch
Name: torch
Version: 2.10.0a0+b4e4ee81d3.nv25.12
Summary: Tensors and Dynamic neural networks in Python with strong GPU acceleration
Home-page: https://pytorch.org
Author:
Author-email: PyTorch Team packages@pytorch.org
License: BSD-3-Clause
Location: /usr/local/lib/python3.12/dist-packages
Requires: filelock, fsspec, jinja2, networkx, setuptools, sympy, typing-extensions
Required-by: flash_attn, flash_attn_3, lightning-thunder, nvidia-modelopt, nvidia-resiliency-ext, torchdata, torchprofile, torchvision, transformer_engine

Great, that PyTorch version should be sufficient. Were you able to get Flash Attention 3 installed properly?