Commit 91ca929

[V1] Fix wrong import path of get_flash_attn_version (#15280)
Signed-off-by: Lehua Ding <[email protected]>
Parent: 84e00ad

1 file changed


vllm/v1/attention/backends/mla/common.py (1 addition, 1 deletion)
@@ -195,8 +195,8 @@
 from vllm.attention.backends.abstract import (AttentionBackend, AttentionLayer,
                                               AttentionMetadata,
                                               MLAAttentionImpl)
-from vllm.attention.backends.utils import get_flash_attn_version
 from vllm.attention.ops.triton_merge_attn_states import merge_attn_states
+from vllm.fa_utils import get_flash_attn_version
 from vllm.logger import init_logger
 from vllm.model_executor.layers.linear import (ColumnParallelLinear,
                                                LinearBase, RowParallelLinear,
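For context, here is a minimal sketch of the corrected import in use. The import path is taken verbatim from the diff above; the return convention described in the comments (an integer FlashAttention version, or None when FlashAttention is unavailable) is an assumption, not something this commit documents.

# Usage sketch for the corrected import path from this commit.
# Assumption: get_flash_attn_version() reports the FlashAttention major
# version vLLM selected (e.g. 2 or 3), or None if FlashAttention is not
# usable in this environment; that convention is not stated in the diff.
from vllm.fa_utils import get_flash_attn_version

fa_version = get_flash_attn_version()
if fa_version is None:
    print("FlashAttention is not available in this environment.")
else:
    print(f"vLLM selected FlashAttention v{fa_version}")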
