
Conversation

minminsun

No description provided.

@LucasWilkinson
Collaborator

@minminsun Sorry for the confusion, but we will be making the lwilkinson/fa3-squashed branch the new main branch shortly (we just wanted to hold off until after the release to avoid anything that could potentially block it). Could you please open this PR against that branch instead of main?

(We had to do this because vllm-flash-attn had diverged too far from upstream; apologies for the inconvenience.)

@minminsun
Author

> @minminsun Sorry for the confusion, but we will be making the lwilkinson/fa3-squashed branch the new main branch shortly (we just wanted to hold off until after the release to avoid anything that could potentially block it). Could you please open this PR against that branch instead of main?
>
> (We had to do this because vllm-flash-attn had diverged too far from upstream; apologies for the inconvenience.)

Hi @LucasWilkinson, this PR is a bug fix for sparse_attn. I'm fine with making the PR against the lwilkinson/fa3-squashed branch, but I found that sparse_attn is not in that branch yet. Can we merge this PR into the main branch first and, after that, cherry-pick sparse_attn along with this change onto that branch?

@LucasWilkinson
Collaborator

> but I found that sparse_attn is not in that branch yet.

hmmm it should be: https://github.com/vllm-project/flash-attention/blob/lwilkinson/fa3-squashed/csrc/flash_attn/flash_api_sparse.cpp

Sorry, it's just in its own file now to reduce the diff with upstream.

@minminsun
Author

minminsun commented Feb 6, 2025

> > but I found that sparse_attn is not in that branch yet.
>
> hmmm it should be: https://github.com/vllm-project/flash-attention/blob/lwilkinson/fa3-squashed/csrc/flash_attn/flash_api_sparse.cpp
>
> Sorry, it's just in its own file now to reduce the diff with upstream.

OK, I see. I will file another PR against the new branch. And I think the current main also needs this fix before it gets replaced.

@LucasWilkinson
Collaborator

> And I think the current main also needs this fix before it gets replaced.

Main is deprecated now; I'll try to make sure the renaming happens next week 👍 We were just hesitant since we've been busy with the V1 and DeepSeek pushes and didn't want to break anything during such a crazy time. Apologies.

@LucasWilkinson
Collaborator

#43 (comment)

Apologies for the inconvenience; please update this PR to point to the new main (see the comment above).

mikelasby-cohere added commits to mikelasby-cohere/flash-attention that referenced this pull request on Mar 24 and Apr 3, 2025
mklasby pushed commits to mklasby/flash-attention that referenced this pull request on Apr 3, 2025
tlrmchlsmth pushed a commit that referenced this pull request on Apr 12, 2025