
FlashAttention installation error: "CUDA 11.6 and above" requirement issue #1282

21X5122 opened this issue Oct 17, 2024 · 0 comments
21X5122 commented Oct 17, 2024

Hi,

I hit an error while installing flash-attn (version 2.6.3) via pip. The error message says FlashAttention requires CUDA 11.6 or above, even though my PyTorch build uses CUDA 11.8 (torch.__version__ = 2.5.0+cu118). The installation fails during metadata generation, and I'm unsure how to proceed. Here is the full error output:

Collecting flash-attn
  Using cached flash_attn-2.6.3.tar.gz (2.6 MB)
  Preparing metadata (setup.py) ... error
  error: subprocess-exited-with-error
  
  × python setup.py egg_info did not run successfully.
  │ exit code: 1
  ╰─> [12 lines of output]
      fatal: not a git repository (or any of the parent directories): .git
      Traceback (most recent call last):
        File "<string>", line 2, in <module>
        File "<pip-setuptools-caller>", line 34, in <module>
        File "/tmp/pip-install-ur6gdyxq/flash-attn_8aaa3421ed454954b43cbcebfe9bb28e/setup.py", line 160, in <module>
          raise RuntimeError(
      RuntimeError: FlashAttention is only supported on CUDA 11.6 and above.  Note: make sure nvcc has a supported version by running nvcc -V.
      
      
      torch.__version__  = 2.5.0+cu118
      
      
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.

I have already checked my toolkit version with nvcc -V, and it reports CUDA 11.8 installed.
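For reference, the check the traceback points to compares the nvcc release against the 11.6 minimum ("make sure nvcc has a supported version by running nvcc -V"). Below is a minimal sketch of that kind of comparison, so I could sanity-check what a given nvcc -V string parses to; the function name and sample string are my own, not flash-attn's actual code:

```python
import re

def parse_cuda_release(nvcc_output: str) -> tuple[int, int]:
    """Extract the (major, minor) CUDA release from `nvcc -V` output."""
    match = re.search(r"release (\d+)\.(\d+)", nvcc_output)
    if match is None:
        raise ValueError("could not find a CUDA release in nvcc output")
    return int(match.group(1)), int(match.group(2))

# Sample `nvcc -V` output for CUDA 11.8 (abridged).
sample = (
    "nvcc: NVIDIA (R) Cuda compiler driver\n"
    "Cuda compilation tools, release 11.8, V11.8.89\n"
)

version = parse_cuda_release(sample)
print(version)             # (11, 8)
print(version >= (11, 6))  # True: meets the FlashAttention minimum
```

Running this against my local nvcc -V output yields (11, 8), which satisfies the stated 11.6 minimum, so the version number itself does not appear to be the problem.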

Could you provide any guidance on how to resolve this? Is there anything else I should configure or check to ensure compatibility with flash-attn?

Thanks in advance for your help!
