Commit c8fdfe4

Correct Transformer2DModel.forward docstring (#3074)
⚙️chore(transformer_2d) update function signature for encoder_hidden_states
1 parent bba1c1d

1 file changed: +1 −1

src/diffusers/models/transformer_2d.py

@@ -225,7 +225,7 @@ def forward(
             hidden_states ( When discrete, `torch.LongTensor` of shape `(batch size, num latent pixels)`.
                 When continuous, `torch.FloatTensor` of shape `(batch size, channel, height, width)`): Input
                 hidden_states
-            encoder_hidden_states ( `torch.LongTensor` of shape `(batch size, encoder_hidden_states dim)`, *optional*):
+            encoder_hidden_states ( `torch.FloatTensor` of shape `(batch size, sequence len, embed dims)`, *optional*):
                 Conditional embeddings for cross attention layer. If not given, cross-attention defaults to
                 self-attention.
             timestep ( `torch.long`, *optional*):
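The corrected type makes sense from how cross-attention consumes the argument: `encoder_hidden_states` is projected into keys and values, so it must be a float tensor of shape `(batch size, sequence len, embed dims)`, not a `LongTensor` of token ids. The following minimal sketch (an illustrative stand-in, not diffusers' actual attention implementation; all dimension values are made up for the example) shows the shape flow:

```python
import torch

# Toy dimensions for illustration only.
batch, seq_len, embed_dim, inner_dim = 2, 5, 32, 16
num_image_tokens = 8 * 8  # a flattened 8x8 latent

to_q = torch.nn.Linear(inner_dim, inner_dim, bias=False)
to_k = torch.nn.Linear(embed_dim, inner_dim, bias=False)
to_v = torch.nn.Linear(embed_dim, inner_dim, bias=False)

# Image tokens attend to the conditioning sequence.
hidden_states = torch.randn(batch, num_image_tokens, inner_dim)
# Conditional embeddings: float-valued, (batch size, sequence len, embed dims).
encoder_hidden_states = torch.randn(batch, seq_len, embed_dim)

q = to_q(hidden_states)                 # (batch, num_image_tokens, inner_dim)
k = to_k(encoder_hidden_states)         # (batch, seq_len, inner_dim)
v = to_v(encoder_hidden_states)         # (batch, seq_len, inner_dim)

# Scaled dot-product cross-attention over the conditioning tokens.
attn = torch.softmax(q @ k.transpose(1, 2) / inner_dim ** 0.5, dim=-1)
out = attn @ v                          # (batch, num_image_tokens, inner_dim)
print(out.shape)
```

A `LongTensor` here would fail at the `to_k`/`to_v` linear projections, which is exactly why the docstring's previous type annotation was wrong.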
