Conversation

@DN6 (Collaborator) commented on Sep 2, 2024

What does this PR do?

  • Add a uses_custom_attn_processor attribute to model test classes so that test_set_attn_processor_for_determinism can be skipped for models with custom attention processors (see the sketch after this list).
  • Add a slow tag to a LoRA test that requires downloading a checkpoint.
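
Roughly, the mechanism behind both items looks like the following. This is a minimal illustrative sketch, not the PR's actual diff: the skip logic and the example class/test names are assumptions, and slow is assumed to be the usual decorator from diffusers.utils.testing_utils.

import unittest

# Assumed import: diffusers' marker for tests excluded from the fast CI job.
from diffusers.utils.testing_utils import slow


class ModelTesterMixin:
    def test_set_attn_processor_for_determinism(self):
        # Attribute introduced by this PR: model test classes whose model uses a
        # custom attention processor set it to True to opt out of this check.
        if getattr(self, "uses_custom_attn_processor", False):
            raise unittest.SkipTest("model uses a custom attention processor")
        # ... determinism checks around set_attn_processor would run here ...


class SomeCustomAttnModelTests(ModelTesterMixin, unittest.TestCase):
    # Hypothetical model test class that opts out of the determinism test.
    uses_custom_attn_processor = True


class SomeLoRATests(unittest.TestCase):
    @slow  # requires a real checkpoint, so it only runs in the slow/GPU CI job
    def test_lora_with_checkpoint(self):
        ...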

Fixes # (issue)

Before submitting

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

DN6 changed the title from "[CI] More Fast Test Fixes" to "[CI] More Fast GPU Test Fixes" on Sep 2, 2024
DN6 requested a review from sayakpaul on September 3, 2024 at 05:40

@sayakpaul (Member) left a comment

Just a single comment. Not merge blocking.

class DiTTransformer2DModelTests(ModelTesterMixin, unittest.TestCase):
    model_class = DiTTransformer2DModel
    main_input_name = "hidden_states"
    uses_custom_attn_processor = False

@sayakpaul (Member) commented on the snippet above:

We could just default it to False in ModelTesterMixin, no?

@DN6 (Collaborator, Author) replied:

Good point. I'll update.
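
The follow-up change being agreed to would look roughly like this; a minimal sketch, assuming only the default moves into the mixin and individual test classes override it when needed.

import unittest


class ModelTesterMixin:
    # Default the flag once here instead of repeating it in every model test
    # class; only tests for models with custom attention processors override it.
    uses_custom_attn_processor = False

    def test_set_attn_processor_for_determinism(self):
        if self.uses_custom_attn_processor:
            raise unittest.SkipTest("model uses a custom attention processor")
        # ... determinism checks around set_attn_processor would run here ...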

DN6 merged commit f6f16a0 into main on Sep 3, 2024
sayakpaul pushed a commit that referenced this pull request on Dec 23, 2024:
* update

* update

* update

* update
