[Attention processor] Create SDPA versions of attention processors #3464

@sayakpaul

Description

diffusers currently supports the following PT 2.0 (SDPA) variants of attention processors (the core SDPA call is sketched after the list):

  • AttnProcessor => AttnProcessor2_0
  • AttnAddedKVProcessor => AttnAddedKVProcessor2_0
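
These 2_0 variants route attention through `torch.nn.functional.scaled_dot_product_attention` instead of materializing the full attention matrix by hand. A minimal, self-contained comparison of the two paths (just the core call, not the processors themselves):

```python
import torch
import torch.nn.functional as F


def manual_attention(query, key, value):
    # What the non-2_0 processors compute: softmax(Q K^T / sqrt(d)) V,
    # which materializes the full (seq_len x seq_len) attention matrix.
    scale = query.shape[-1] ** -0.5
    attn_weights = (query @ key.transpose(-2, -1) * scale).softmax(dim=-1)
    return attn_weights @ value


def sdpa_attention(query, key, value):
    # PT 2.0 path: a single fused call that can dispatch to Flash or
    # memory-efficient attention kernels when they are available.
    return F.scaled_dot_product_attention(query, key, value)


q = torch.randn(2, 8, 64, 40)  # (batch, heads, seq_len, head_dim)
k, v = torch.randn_like(q), torch.randn_like(q)
assert torch.allclose(manual_attention(q, k, v), sdpa_attention(q, k, v), atol=1e-4)
```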

The following are not supported:

  • SlicedAttnProcessor
  • SlicedAttnAddedKVProcessor
  • LoRAAttnProcessor
  • CustomDiffusionAttnProcessor

We should add SDPA versions of the above processors as well. This would essentially eliminate the need to use xformers.
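
For illustration, here is a minimal sketch of what an SDPA-based LoRA processor could look like. It assumes the `Attention` module exposes `to_q` / `to_k` / `to_v` / `to_out` and `heads` the way the existing processors use them; `LoRALinearLayer` below is a simplified stand-in rather than the library's own implementation, and a full version would also prepare the attention mask the way `AttnProcessor2_0` does.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LoRALinearLayer(nn.Module):
    # Simplified stand-in for a low-rank adapter: x -> up(down(x)).
    def __init__(self, in_features, out_features, rank=4):
        super().__init__()
        self.down = nn.Linear(in_features, rank, bias=False)
        self.up = nn.Linear(rank, out_features, bias=False)
        nn.init.normal_(self.down.weight, std=1 / rank)
        nn.init.zeros_(self.up.weight)

    def forward(self, x):
        return self.up(self.down(x))


class LoRAAttnProcessor2_0(nn.Module):
    # Sketch of a LoRA processor that routes through F.scaled_dot_product_attention.
    def __init__(self, hidden_size, cross_attention_dim=None, rank=4):
        super().__init__()
        cross_attention_dim = cross_attention_dim or hidden_size
        self.to_q_lora = LoRALinearLayer(hidden_size, hidden_size, rank)
        self.to_k_lora = LoRALinearLayer(cross_attention_dim, hidden_size, rank)
        self.to_v_lora = LoRALinearLayer(cross_attention_dim, hidden_size, rank)
        self.to_out_lora = LoRALinearLayer(hidden_size, hidden_size, rank)

    def __call__(self, attn, hidden_states, encoder_hidden_states=None,
                 attention_mask=None, scale=1.0):
        if encoder_hidden_states is None:
            encoder_hidden_states = hidden_states
        batch_size = hidden_states.shape[0]

        query = attn.to_q(hidden_states) + scale * self.to_q_lora(hidden_states)
        key = attn.to_k(encoder_hidden_states) + scale * self.to_k_lora(encoder_hidden_states)
        value = attn.to_v(encoder_hidden_states) + scale * self.to_v_lora(encoder_hidden_states)

        # (batch, seq, dim) -> (batch, heads, seq, head_dim), the layout SDPA expects.
        head_dim = query.shape[-1] // attn.heads
        query = query.view(batch_size, -1, attn.heads, head_dim).transpose(1, 2)
        key = key.view(batch_size, -1, attn.heads, head_dim).transpose(1, 2)
        value = value.view(batch_size, -1, attn.heads, head_dim).transpose(1, 2)

        # Fused attention kernel; a complete version would first reshape the mask
        # via attn.prepare_attention_mask as the existing processors do.
        hidden_states = F.scaled_dot_product_attention(
            query, key, value, attn_mask=attention_mask
        )
        hidden_states = hidden_states.transpose(1, 2).reshape(
            batch_size, -1, attn.heads * head_dim
        )

        # Output projection (plus its LoRA branch), followed by dropout.
        hidden_states = attn.to_out[0](hidden_states) + scale * self.to_out_lora(hidden_states)
        return attn.to_out[1](hidden_states)
```

Such a processor could then be registered on the UNet with `set_attn_processor`, the same way the existing 2_0 processors are.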

Metadata

Labels: stale (issues that haven't received updates)