Commit 67948b8

vanbasten23 and 1994 authored and committed
Fix lora tests failure in TPU CI due to the removal of LoRA bias (vllm-project#26723)
Signed-off-by: Xiongfei Wei <[email protected]>
Signed-off-by: 1994 <[email protected]>
1 parent f21b3e4 commit 67948b8

File tree

1 file changed: +1 −2 lines changed

vllm/v1/worker/tpu_model_runner.py

Lines changed: 1 addition & 2 deletions
@@ -2128,12 +2128,11 @@ def _tpu_set_lora(
     lora_a: torch.Tensor,
     lora_b: torch.Tensor,
     embeddings_tensor: torch.Tensor | None,
-    bias: torch.Tensor | None = None,
 ):
     # TODO: The integer index leads to a recompilation, but converting it
     # to a tensor doesn't seem to work anymore. This might be fixed with a
     # later release of torch_xla.
-    self._original_set_lora(index, lora_a, lora_b, embeddings_tensor, bias)
+    self._original_set_lora(index, lora_a, lora_b, embeddings_tensor)
     torch_xla.sync(wait=False)

 def _tpu_reset_lora(self, index: int):
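The diff above adjusts a wrapper pattern: `_tpu_set_lora` delegates to a saved `_original_set_lora` and then forces an XLA device sync, so when upstream dropped the `bias` parameter, the wrapper's signature and delegation had to drop it too. Below is a minimal, self-contained sketch of that wrap-and-delegate pattern. `DummyWrapper`, `patch_for_tpu`, and the `sync` callable are stand-ins invented for illustration, not the real vLLM or torch_xla API; in the actual code the sync step is `torch_xla.sync(wait=False)`.

```python
class DummyWrapper:
    """Stand-in for the object whose set_lora gets patched for TPU."""

    def __init__(self):
        self.calls = []

    def set_lora(self, index, lora_a, lora_b, embeddings_tensor):
        # Post-#26723 signature: no `bias` parameter.
        self.calls.append((index, lora_a, lora_b, embeddings_tensor))


def patch_for_tpu(wrapper, sync):
    """Replace wrapper.set_lora with a version that syncs the device.

    Mirrors the diff's shape: keep a reference to the original method,
    delegate to it with the (bias-free) argument list, then sync.
    """
    original = wrapper.set_lora  # plays the role of _original_set_lora

    def _tpu_set_lora(index, lora_a, lora_b, embeddings_tensor):
        original(index, lora_a, lora_b, embeddings_tensor)
        sync()  # stands in for torch_xla.sync(wait=False)

    wrapper.set_lora = _tpu_set_lora


# Usage: the patched method records the call and triggers the sync hook.
synced = []
w = DummyWrapper()
patch_for_tpu(w, lambda: synced.append(True))
w.set_lora(0, "lora_a", "lora_b", None)
```

The test failure this commit fixes is exactly the mismatch this pattern is prone to: if the wrapper keeps an extra positional argument (`bias`) that the wrapped callable no longer accepts, the delegation raises a `TypeError` at call time.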
