
Commit 072a5dd
AttentionProcessor.group_norm num_channels should be query_dim
The group_norm on the attention processor should norm the number of channels in the query, _not_ the inner dim. This wasn't caught before because the group_norm is only used by the added-KV attention processors, and those processors are only used by the Karlo models, which are configured such that the inner dim is the same as the query dim.
1 parent 67c3518

File tree: 1 file changed (+1, -1)

src/diffusers/models/attention_processor.py (1 addition, 1 deletion)

@@ -81,7 +81,7 @@ def __init__(
         self.added_kv_proj_dim = added_kv_proj_dim

         if norm_num_groups is not None:
-            self.group_norm = nn.GroupNorm(num_channels=inner_dim, num_groups=norm_num_groups, eps=1e-5, affine=True)
+            self.group_norm = nn.GroupNorm(num_channels=query_dim, num_groups=norm_num_groups, eps=1e-5, affine=True)
         else:
             self.group_norm = None
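For intuition, here is a minimal sketch of why the channel count matters (the dims are hypothetical, not from the diffusers source): the processor applies group_norm to the incoming hidden states, whose channel dimension is query_dim; the to_q/to_k/to_v projections only produce inner_dim afterwards. When inner_dim != query_dim, a GroupNorm built with num_channels=inner_dim rejects the input at runtime.

import torch
import torch.nn as nn

# Hypothetical dims where the bug would surface: inner_dim != query_dim.
query_dim = 320            # channels of the hidden states fed to the processor
inner_dim = 8 * 80         # e.g. heads * dim_head = 640

# GroupNorm normalizes over (batch, channels, *), so channels come second here.
hidden_states = torch.randn(2, query_dim, 64)

# Old construction: only valid when inner_dim happens to equal query_dim.
bad_norm = nn.GroupNorm(num_channels=inner_dim, num_groups=32, eps=1e-5, affine=True)
try:
    bad_norm(hidden_states)
except RuntimeError as e:
    print(f"mismatch: {e}")  # weight sized for 640 channels, input has 320

# Fixed construction: norm over the channels the input actually has.
good_norm = nn.GroupNorm(num_channels=query_dim, num_groups=32, eps=1e-5, affine=True)
print(good_norm(hidden_states).shape)  # torch.Size([2, 320, 64])

The Karlo configurations masked this because inner dim equals query dim there, so the two constructions were identical in practice.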