Fix monkey-patch for text_encoder LoRA #3490
Changes from all commits: fb708fb, 6e8f3ab, 8511755, 81915f4, 88db546, 1da772b, 8c0926c
```diff
@@ -946,11 +946,15 @@ def _modify_text_encoder(self, attn_processors: Dict[str, LoRAAttnProcessor]):
                 lora_layer = getattr(attn_processors[name], self._get_lora_layer_attribute(name))
                 old_forward = module.forward
 
-                def new_forward(x):
-                    return old_forward(x) + lora_layer(x)
+                # create a new scope that locks in the old_forward, lora_layer value for each new_forward function
+                def make_new_forward(old_forward, lora_layer):
+                    def new_forward(x):
+                        return old_forward(x) + lora_layer(x)
+
+                    return new_forward
 
                 # Monkey-patch.
-                module.forward = new_forward
+                module.forward = make_new_forward(old_forward, lora_layer)
 
     def _get_lora_layer_attribute(self, name: str) -> str:
         if "q_proj" in name:
```
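The factory function is needed because Python closures resolve `old_forward` and `lora_layer` when the wrapper is called, not when it is defined; inside the loop over text-encoder submodules, every `new_forward` would otherwise end up using the last iteration's values. A minimal, self-contained sketch of that behavior follows (the `Dummy` class, scale values, and lambdas are made up for illustration and are not diffusers code):

```python
def patch_buggy(modules, lora_layers):
    # Pre-PR pattern: every new_forward closes over the same loop variables,
    # so after the loop all patched modules call the *last* iteration's
    # old_forward and lora_layer.
    for module, lora_layer in zip(modules, lora_layers):
        old_forward = module.forward

        def new_forward(x):
            return old_forward(x) + lora_layer(x)

        module.forward = new_forward


def patch_fixed(modules, lora_layers):
    # Post-PR pattern: the factory creates a fresh scope per iteration,
    # pinning the current old_forward and lora_layer as arguments.
    for module, lora_layer in zip(modules, lora_layers):
        old_forward = module.forward

        def make_new_forward(old_forward, lora_layer):
            def new_forward(x):
                return old_forward(x) + lora_layer(x)

            return new_forward

        module.forward = make_new_forward(old_forward, lora_layer)


class Dummy:
    """Hypothetical stand-in for a text-encoder submodule."""

    def __init__(self, scale):
        self.scale = scale

    def forward(self, x):
        return self.scale * x


lora_layers = [lambda x: x + 1, lambda x: x + 2, lambda x: x + 3]

modules = [Dummy(1), Dummy(10), Dummy(100)]
patch_buggy(modules, lora_layers)
print([m.forward(1) for m in modules])   # [104, 104, 104] -- all use the last pair

modules = [Dummy(1), Dummy(10), Dummy(100)]
patch_fixed(modules, lora_layers)
print([m.forward(1) for m in modules])   # [3, 13, 104] -- each keeps its own pair
```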
Review discussion on this change:

- Small comment: if you load lora with e.g. …
- Thanks for your suggestion! Indeed, it seems that might be the case. I wonder if it might be better to create a mechanism to remove the monkey-patch. @sayakpaul WDYT?
- Best is indeed to do something else than monkey-patching, but a flag like …
- We're currently discussing this internally, and will keep y'all posted.
- Meanwhile, @rvorias could you elaborate what you mean here?
- Have … This is useful in contexts where you want to load arbitrary lora weights on the fly in a long-running SD inference engine. If you add the flag+condition you can still have the new lora weights to load, but you don't override the forward again and again.
- Would you be willing to open a PR for this? We're more than happy to help you with that :-)
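One possible shape for the flag/condition idea raised above, as a sketch only: keep a handle to the original forward so that re-loading LoRA weights re-wraps the original instead of stacking wrapper upon wrapper, which also gives a way to remove the monkey-patch. The names `apply_lora_patch`, `remove_lora_patch`, and `_original_forward` are hypothetical and not part of diffusers.

```python
def apply_lora_patch(module, lora_layer):
    # Remember the unpatched forward the first time this module is patched;
    # on later loads, wrap the original again instead of wrapping the wrapper.
    if not hasattr(module, "_original_forward"):
        module._original_forward = module.forward
    old_forward = module._original_forward

    def new_forward(x):
        return old_forward(x) + lora_layer(x)

    module.forward = new_forward


def remove_lora_patch(module):
    # Undo the monkey-patch by restoring the original forward, if present.
    if hasattr(module, "_original_forward"):
        module.forward = module._original_forward
        del module._original_forward
```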
Additional review comment: Here I'd not actually mind referring users to read this issue comment you made: #3490 (comment)