add shared experts for upcoming Granite 4.0 language models #35894
Conversation
@ArthurZucker can you merge this?
Hey! As always, for us this will be a new model. With modular it should be super easy to add, however! https://huggingface.co/docs/transformers/en/modular_transformers 🤗
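For reference, a modular file expresses the new model as subclasses of the existing GraniteMoe classes and writes out only the differences; the flat modeling file is then auto-generated from it. Below is a rough sketch of the shape such a file could take — the shared-MLP details, including the `shared_intermediate_size` config field, are assumptions for illustration, not the merged code:

```python
import torch.nn as nn

# These GraniteMoe classes exist in transformers; the Shared subclasses
# below sketch the modular pattern, not the merged implementation.
from transformers.models.granitemoe.modeling_granitemoe import (
    GraniteMoeDecoderLayer,
    GraniteMoeForCausalLM,
)


class GraniteMoeSharedDecoderLayer(GraniteMoeDecoderLayer):
    def __init__(self, config, layer_idx: int):
        super().__init__(config, layer_idx)
        # Assumed addition: a dense MLP applied to every token, whose output
        # would be summed with the routed-experts output in forward().
        self.shared_mlp = nn.Sequential(
            nn.Linear(config.hidden_size, config.shared_intermediate_size, bias=False),
            nn.SiLU(),
            nn.Linear(config.shared_intermediate_size, config.hidden_size, bias=False),
        )


class GraniteMoeSharedForCausalLM(GraniteMoeForCausalLM):
    # Behaviour inherited unchanged; the flat modeling file is generated
    # from this modular definition by the transformers codegen tooling.
    pass
```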
We're adding an additional feature (shared experts) that doesn't break past checkpoints and is an extension of our own model class. Would every extension entail a new model class?
Yes 🤗 I am sorry, but that is the way we have been handling every single model so far!
Force-pushed from a0fd52a to 9b2652c
Force-pushed from c09ee2f to d81c969
Not sure how to get the tests to pass; some failures are not due to the changes I've made.
Signed-off-by: Sukriti-Sharma4 <[email protected]>
@ArthurZucker the PR is ready, please review.
Super clean! Super nice!
```python
class GraniteMoeSharedForCausalLM(GraniteMoeForCausalLM):
    _tied_weights_keys = ["lm_head.weight"]
```
If the MLP is shared, should it appear here?
No, it shouldn't. "Shared" means shared in the sense of experts (a shared expert alongside the routed ones within each layer), not shared across layers.
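To make the distinction concrete, here is an illustrative sketch of the shared-experts pattern — not the merged GraniteMoeShared code, and all names are assumptions. A dense "shared expert" MLP runs on every token, and its output is added to the routed-experts output inside the same layer; since no parameters are reused across layers, nothing new belongs in `_tied_weights_keys`:

```python
import torch
import torch.nn as nn


class SharedExpertsBlockSketch(nn.Module):
    """Illustrative only: a shared expert that runs densely on every token,
    summed with the sparse routed-experts output within one layer."""

    def __init__(self, hidden_size: int, shared_size: int, routed_moe: nn.Module):
        super().__init__()
        self.routed_moe = routed_moe  # stands in for the existing MoE block
        # Gated MLP for the shared expert (SwiGLU-style, a common choice)
        self.gate_proj = nn.Linear(hidden_size, shared_size, bias=False)
        self.up_proj = nn.Linear(hidden_size, shared_size, bias=False)
        self.down_proj = nn.Linear(shared_size, hidden_size, bias=False)
        self.act = nn.SiLU()

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        routed_out = self.routed_moe(hidden_states)  # sparse: top-k experts per token
        shared_out = self.down_proj(                 # dense: every token
            self.act(self.gate_proj(hidden_states)) * self.up_proj(hidden_states)
        )
        return routed_out + shared_out
```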
Thanks for approving, please merge as soon as possible :)
Signed-off-by: Sukriti-Sharma4 <[email protected]>
I have updated with the main branch, made the corresponding changes, and all checks have passed :)
* Modular GraniteMoE with shared Experts.
* Modified
* Import order.
* Modified for style
* Fix space.
* Test
* Remove extra granitemoe file.
* New converted file and tests
* Modified __init__ files.
* Formatting.
* Dummy PT objects
* register granitemoe shared model
* fix linting of a file
* fix import in modeling file
* update generated modeling file
* add documentation
* update docstrings
* update generated modeling file
* fix docstrings in config class
* merge main

Signed-off-by: Shawn Tan <[email protected]>
Signed-off-by: Sukriti-Sharma4 <[email protected]>
Co-authored-by: Shawn Tan <[email protected]>
Co-authored-by: Shawn Tan <[email protected]>
Co-authored-by: Sukriti-Sharma4 <[email protected]>
Co-authored-by: Sukriti Sharma <[email protected]>
Is the public release of Granite-4.0-Tiny-Preview in any way relevant to this PR? (Does it warrant any follow-up work, additional validation/testing/CI, etc.?)
This PR adds support for shared experts in the GraniteMoE model class for the upcoming Granite 4.0 language models.
@ArthurZucker
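Since the commit history mentions registering the new model, loading should presumably work through the Auto classes once a checkpoint ships. A hypothetical usage sketch — the model id below is a placeholder, not a real Hub repository:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder id; substitute a real GraniteMoeShared checkpoint when available.
model_id = "ibm-granite/granite-4.0-shared-moe-PLACEHOLDER"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Shared experts run densely on every token.", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```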