
Commit 3ea7ecd

TP + EP + MoE disclaimer (#42519)
1 parent cfb43ae commit 3ea7ecd


MIGRATION_GUIDE_V5.md

Lines changed: 10 additions & 0 deletions
@@ -317,8 +317,18 @@ labels = tokenizer(text_target=tgt_texts, ...)
## Disclaimers for the RC0

### PEFT + MoE:

Because we are switching away from the naive MoE implementation (an `nn.ModuleList` of experts), MoE models that carry adapters currently run into an issue. For more details, see https://github.com/huggingface/transformers/issues/42491#issuecomment-3591485649.

_We aim to fix this and ship it in a follow-up release candidate in the week after RC0._
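
To make the adapter issue concrete, here is a minimal, hypothetical sketch (not the actual `transformers` implementation; the class names and the packed-weight layout below are assumptions for illustration). The naive layout exposes one `nn.Linear` per expert that PEFT can wrap by module name, while a fused layout packs all expert weights into a single parameter, so those per-expert targets no longer exist:

```python
import torch
import torch.nn as nn


class NaiveExperts(nn.Module):
    # Old-style layout: one nn.Linear per expert inside an nn.ModuleList,
    # so module names like "experts.3" exist and an adapter can wrap them.
    def __init__(self, num_experts: int, hidden: int, intermediate: int):
        super().__init__()
        self.experts = nn.ModuleList(
            nn.Linear(hidden, intermediate, bias=False) for _ in range(num_experts)
        )


class FusedExperts(nn.Module):
    # Sketch of a fused layout: all expert weights live in one packed tensor,
    # so there is no per-expert nn.Linear left for an adapter to attach to.
    def __init__(self, num_experts: int, hidden: int, intermediate: int):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(num_experts, intermediate, hidden))


naive, fused = NaiveExperts(8, 64, 128), FusedExperts(8, 64, 128)
print(any(name == "experts.3" for name, _ in naive.named_modules()))  # True
print(any(name == "experts.3" for name, _ in fused.named_modules()))  # False
```
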
### Tensor parallel and Expert parallel + MoE

We are streamlining MoE support with vLLM; while this is being implemented, tensor parallelism and expert parallelism do not work as expected. This is known and actively being worked on.

_We aim to fix this and ship it in a follow-up release candidate in the week after RC0._
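
As a point of reference, here is a minimal sketch of the affected loading path, assuming the `tp_plan="auto"` argument to `from_pretrained` and a placeholder MoE checkpoint id; it would typically be launched with `torchrun` so each process holds one shard. Until the fix lands, MoE checkpoints loaded this way may not shard or run as expected.

```python
# Launch with e.g.: torchrun --nproc-per-node 4 load_moe_tp.py
from transformers import AutoModelForCausalLM

# The checkpoint id below is a placeholder, not a specific supported model.
model = AutoModelForCausalLM.from_pretrained(
    "your-org/some-moe-checkpoint",
    tp_plan="auto",  # request the built-in tensor-parallel sharding plan
)
```
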
### Custom pretrained models:
For anyone inheriting from a `transformers` `PreTrainedModel`, the weights are automatically initialized with the common scheme:
```python
