Fix path generation in IP Adapter #6564
Conversation
yiyixuxu
left a comment
thanks!
```diff
 image_encoder = CLIPVisionModelWithProjection.from_pretrained(
     pretrained_model_name_or_path_or_dict,
-    subfolder=os.path.join(subfolder, "image_encoder"),
+    subfolder=Path(subfolder, "image_encoder").as_posix(),
```
It seems like a good idea to follow this throughout the library to better support Windows.
What say @patrickvonplaten @yiyixuxu?
Opened a tracker here: #6585.
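To illustrate why the fix works: on Windows, `os.path.join` builds backslash-separated paths, while Hub repo subfolders must use forward slashes. The sketch below reproduces the Windows behavior on any OS via `PureWindowsPath`; the `"models"` subfolder name is just an illustrative stand-in.

```python
from pathlib import PureWindowsPath

# What os.path.join would produce on Windows for the image encoder subfolder:
windows_style = str(PureWindowsPath("models", "image_encoder"))
print(windows_style)  # models\image_encoder  (rejected as a Hub subfolder)

# The fix: as_posix() always yields forward slashes, which the Hub expects.
posix_style = PureWindowsPath("models", "image_encoder").as_posix()
print(posix_style)  # models/image_encoder
```

Using `pathlib` with `.as_posix()` keeps the code portable without OS-specific branches.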
sayakpaul
left a comment
Very nice PR!
```diff
 if all(proc.__class__ in ADDED_KV_ATTENTION_PROCESSORS for proc in self.attn_processors.values()):
-    processor = AttnAddedKVProcessor()
+    processor = (
+        AttnAddedKVProcessor2_0() if hasattr(F, "scaled_dot_product_attention") else AttnAddedKVProcessor()
+    )
```
Hmm, I still understand AttnAddedKVProcessor and AttnProcessor as the default attention processors. This is a bit of a breaking change. Could we maybe move this change out of the PR for now? cc @yiyixuxu @sayakpaul and @DN6 as well
That should be the case indeed. No optimized things as the default.
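For context, the debated change is a feature-detection dispatch: pick the torch 2.0 fused processor only when the runtime exposes `scaled_dot_product_attention`. Here is a minimal, torch-free sketch of that pattern; the class bodies and the `runtime` namespace are illustrative stand-ins, not the actual diffusers implementations.

```python
from types import SimpleNamespace


class AttnAddedKVProcessor:
    """Stand-in for the default (unfused) processor."""


class AttnAddedKVProcessor2_0:
    """Stand-in for the processor using torch 2.0's fused SDPA kernel."""


def pick_processor(runtime):
    # Mirrors hasattr(F, "scaled_dot_product_attention") on torch.nn.functional
    return (
        AttnAddedKVProcessor2_0()
        if hasattr(runtime, "scaled_dot_product_attention")
        else AttnAddedKVProcessor()
    )


old_runtime = SimpleNamespace()  # emulates torch < 2.0
new_runtime = SimpleNamespace(scaled_dot_product_attention=None)  # torch >= 2.0

print(type(pick_processor(old_runtime)).__name__)  # AttnAddedKVProcessor
print(type(pick_processor(new_runtime)).__name__)  # AttnAddedKVProcessor2_0
```

The objection in the thread is that `set_default_attn_processor` should return the plain processor unconditionally, so the dispatch was ultimately reverted in this PR.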
patrickvonplaten
left a comment
Hey @fabiorigano,
Super nice fix using posix()! For now, could we maybe focus only on this change and not also mix a change to set_default_attn_processor into this PR?
Removed changes on `set_default_attn_processor`.
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
sayakpaul
left a comment
Super work here.
@sayakpaul just to be sure about the purpose of this fix: os.path.join seems to work in all the other instances in the library, but it doesn't work when used as .from_pretrained(os.path.join()). I don't know if it is worth updating all the instances in the library.
That is weird. But it's good to follow a convention across the board, I think.
* Fix path generation on Windows
* Update set_default_attn_processors
* Use pathlib
* Fix quality
* Fix copy
* Revert changes in set_default_attn_processors

Co-authored-by: Sayak Paul <[email protected]>
What does this PR do?
Fixes #6561
and #6192 (comment). cc @patrickvonplaten @sayakpaul