Commit a7ea35a
[Bugfix] Remove num_tokens_across_dp (#14302)
Signed-off-by: Tyler Michael Smith <[email protected]>
1 parent 1e3e76b commit a7ea35a

File tree

1 file changed: +1 −2 lines changed


vllm/forward_context.py

Lines changed: 1 addition & 2 deletions
@@ -27,7 +27,6 @@
 
 @dataclass
 class DPMetadata:
-    num_tokens_across_dp: list[int]
     cu_tokens_across_dp_cpu: torch.Tensor
 
 
@@ -89,7 +88,7 @@ def set_forward_context(attn_metadata: Any,
         from vllm.distributed.parallel_state import get_dp_group
         dist.all_reduce(num_tokens_tensor, group=get_dp_group().cpu_group)
         cu_tokens_across_dp_cpu = torch.cumsum(num_tokens_tensor, dim=0)
-        dp_metadata = DPMetadata(num_tokens_across_dp, cu_tokens_across_dp_cpu)
+        dp_metadata = DPMetadata(cu_tokens_across_dp_cpu)
 
     global _forward_context
     prev_context = _forward_context
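
The removed field was redundant: once the per-rank token counts have been all-reduced into num_tokens_tensor, the cumulative sum kept in cu_tokens_across_dp_cpu already encodes the per-rank counts. Below is a minimal standalone sketch of that relationship (illustrative only, not the vLLM implementation; the example values and the recovered variable are assumptions):

import torch

# Hypothetical per-rank token counts across a data-parallel group of 4 ranks,
# standing in for the all-reduced num_tokens_tensor in set_forward_context.
num_tokens_tensor = torch.tensor([5, 3, 7, 2])

# This is the value the commit keeps in DPMetadata: the cumulative token
# counts across DP ranks, computed on CPU.
cu_tokens_across_dp_cpu = torch.cumsum(num_tokens_tensor, dim=0)
print(cu_tokens_across_dp_cpu)  # tensor([ 5,  8, 15, 17])

# The dropped num_tokens_across_dp list can be reconstructed on demand from
# the cumulative tensor, so storing both in DPMetadata duplicated state.
recovered = torch.diff(cu_tokens_across_dp_cpu, prepend=torch.tensor([0]))
assert recovered.tolist() == num_tokens_tensor.tolist()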

0 commit comments