Conversation

@DN6 (Collaborator) commented Aug 28, 2024

What does this PR do?

Add a check to the set_attention_processor_for_determinism test that allows us to skip it for models that don't use AttnProcessor or AttnProcessor2_0.

Skip xformers tests for models/pipelines without an xformers processor.
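As a rough sketch of the first change (all names below are illustrative, not the actual diffusers test code): diffusers models expose a dict-valued `attn_processors` property, so the determinism test can inspect it and skip when any processor is non-default.

```python
# Illustrative sketch only; FakeModel and uses_custom_attn_processor are
# hypothetical names, and the processor classes are stand-ins for diffusers'.

class AttnProcessor:  # stand-in for the default processor
    pass

class AttnProcessor2_0:  # stand-in for the PyTorch 2.0 default processor
    pass

class CustomAttnProcessor:  # stand-in for a model-specific processor
    pass

class FakeModel:
    def __init__(self, processors):
        self._processors = processors

    @property
    def attn_processors(self):
        # diffusers models return a {module_name: processor_instance} mapping
        return self._processors

def uses_custom_attn_processor(model):
    # True when any attention processor is not one of the two defaults,
    # in which case the determinism test can call self.skipTest(...)
    return any(
        not isinstance(proc, (AttnProcessor, AttnProcessor2_0))
        for proc in model.attn_processors.values()
    )
```

A test method would then skip itself when `uses_custom_attn_processor(model)` returns True.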

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@DN6 DN6 requested a review from sayakpaul August 28, 2024 12:21
Comment on lines +35 to +36
# Skip setting testing with default: AttnProcessor
uses_custom_attn_processor = True

Could be added to Fal as well no?

batch_params = frozenset(["prompt"])

# there is no xformers processor for Flux
test_xformers_attention = False
@sayakpaul commented Aug 28, 2024

@sayakpaul left a comment

LGTM. Let's make sure to not test for these cases wherever applicable. AuraFlow, for example, could be added.
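A minimal sketch of how a class-level `test_xformers_attention` flag, like the one in the Flux diff above, can gate a test through unittest's skip machinery (the mixin and test names here are illustrative, not the actual diffusers test classes):

```python
import unittest

class XFormersTestMixin:
    # Subclasses for models without an xformers processor override this to False
    test_xformers_attention = True

    def test_xformers_attention_forward_pass(self):
        if not self.test_xformers_attention:
            self.skipTest("no xformers attention processor for this model")
        # the real test would compare outputs with xformers enabled vs. disabled

class FluxLikePipelineTests(XFormersTestMixin, unittest.TestCase):
    # mirrors `test_xformers_attention = False` from the diff above
    test_xformers_attention = False
```

Running `FluxLikePipelineTests` reports the test as skipped rather than failed, which is the behavior the PR is after.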

@DN6 DN6 merged commit 007ad0e into main Sep 2, 2024
sayakpaul pushed a commit that referenced this pull request Dec 23, 2024