Is your feature request related to a problem? Please describe.
Say you want to use Custom Diffusion attention processors for some layers and LoRA processors for others, and you pass `cross_attention_kwargs={'scale': 0.5}` so the LoRA layers pick up the scale. The call then fails with `TypeError: __call__() got an unexpected keyword argument 'scale'`, because the Custom Diffusion processors have no idea what to do with that parameter.
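A minimal sketch that reproduces this, assuming the processor constructor signatures from the diffusers release this issue targets (`LoRAAttnProcessor` and `CustomDiffusionAttnProcessor` have been refactored in later versions), with the hidden-size derivation borrowed from the training scripts:

```python
from diffusers import StableDiffusionPipeline
from diffusers.models.attention_processor import (
    CustomDiffusionAttnProcessor,
    LoRAAttnProcessor,
)

pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")
unet = pipe.unet

procs = {}
for name in unet.attn_processors:
    # attn1 is self-attention, attn2 is cross-attention.
    cross_attention_dim = (
        None if name.endswith("attn1.processor") else unet.config.cross_attention_dim
    )
    if name.startswith("mid_block"):
        hidden_size = unet.config.block_out_channels[-1]
    elif name.startswith("up_blocks"):
        block_id = int(name[len("up_blocks.")])
        hidden_size = list(reversed(unet.config.block_out_channels))[block_id]
    else:  # down_blocks
        block_id = int(name[len("down_blocks.")])
        hidden_size = unet.config.block_out_channels[block_id]

    if name.startswith("mid_block"):
        # Custom Diffusion on the mid block, LoRA everywhere else.
        procs[name] = CustomDiffusionAttnProcessor(
            train_kv=True,
            train_q_out=False,
            hidden_size=hidden_size,
            cross_attention_dim=cross_attention_dim,
        )
    else:
        procs[name] = LoRAAttnProcessor(
            hidden_size=hidden_size, cross_attention_dim=cross_attention_dim
        )
unet.set_attn_processor(procs)

# 'scale' is meant only for the LoRA layers, but the UNet forwards
# cross_attention_kwargs to *every* processor, so the Custom Diffusion ones
# raise: TypeError: __call__() got an unexpected keyword argument 'scale'
pipe("a photo of a cat", cross_attention_kwargs={"scale": 0.5})
```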
Describe the solution you'd like
- The easiest solution is to drop excess kwargs inside the implemented attention processors. The downside is that silent bugs may come up, since a misspelled kwarg would simply be ignored (see the sketch after this list).
- Alternatively, implement a flag on each processor indicating whether excess kwargs are expected. The downside of this fix is that it looks a bit too ad hoc.
- Add support for attention kwargs that also specify the layers or processor types they apply to. The downside is a somewhat complicated design and a lot of work, since every pipeline may need to be modified.
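As a concrete illustration of the first option, here is a hypothetical wrapper (`KwargFilteringProcessor` is not an existing diffusers class) that drops kwargs the wrapped processor's `__call__` does not accept, rather than changing every processor. It also shows why the silent-bug concern is real: a typo such as `{'scael': 0.5}` would be swallowed without any error.

```python
import inspect


class KwargFilteringProcessor:
    """Hypothetical wrapper: forwards only the kwargs that the wrapped
    processor's __call__ actually accepts."""

    def __init__(self, processor):
        self.processor = processor
        params = inspect.signature(processor.__call__).parameters.values()
        # Processors that already take **kwargs need no filtering.
        self._takes_var_kw = any(
            p.kind is inspect.Parameter.VAR_KEYWORD for p in params
        )
        self._accepted = {p.name for p in params}

    def __call__(self, attn, hidden_states, *args, **kwargs):
        if not self._takes_var_kw:
            # Drop anything the processor would reject. Note that a
            # misspelled kwarg is silently dropped too, which is exactly
            # the "silent bugs" downside mentioned above.
            kwargs = {k: v for k, v in kwargs.items() if k in self._accepted}
        return self.processor(attn, hidden_states, *args, **kwargs)


# Usage with the mixed processors from the example above:
# unet.set_attn_processor(
#     {name: KwargFilteringProcessor(p) for name, p in procs.items()}
# )
```

The third option would instead change the kwargs schema, e.g. something along the lines of `cross_attention_kwargs={'lora': {'scale': 0.5}}`, keyed by processor type or layer name, which is why every pipeline would likely need touching.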
Additional context
A similar issue was raised in this comment: #1639 (comment), but it did not get enough attention.