Conversation

@timh (Contributor) commented Dec 6, 2022

No description provided.

@HuggingFaceDocBuilderDev commented Dec 6, 2022

The documentation is not available anymore as the PR was closed or merged.

@patil-suraj (Contributor) left a comment

Thanks a lot for fixing this, just left one comment!

Comment on lines 679 to 680
unet=accelerator.unwrap_model(unet, True),
text_encoder=accelerator.unwrap_model(text_encoder, True),

Great catch!

The keep_fp32_wrapper argument was added only recently to accelerate.unwrap_model, so here we should first check whether the method accepts that argument, to keep compatibility with previous versions of accelerate.

It could be done with:

accepts_keep_fp32_wrapper = "keep_fp32_wrapper" in set(inspect.signature(accelerator.unwrap_model).parameters.keys())
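For context, a minimal self-contained sketch of this signature-inspection pattern (the local `unwrap_model` function and its `keep_fp32_wrapper` parameter are stand-ins for the accelerate method, used here only for illustration):

```python
import inspect

# Stand-in for accelerate's Accelerator.unwrap_model; hypothetical local
# function mirroring the keep_fp32_wrapper parameter added in newer releases.
def unwrap_model(model, keep_fp32_wrapper=True):
    return model

# Inspect the callable's signature and only pass the argument if this
# version accepts it, keeping older versions working.
accepts_keep_fp32_wrapper = "keep_fp32_wrapper" in set(
    inspect.signature(unwrap_model).parameters.keys()
)

extra_kwargs = {"keep_fp32_wrapper": True} if accepts_keep_fp32_wrapper else {}
unwrapped = unwrap_model("my-model", **extra_kwargs)
```

Passing the callable itself (not its return value) to `inspect.signature` is the key detail: the signature belongs to the function object, so no call is made while probing.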

@timh (Contributor, Author) commented Dec 8, 2022

Changes made as you suggested. ICYMI, there's also a different PR with an alternative approach for fixing this issue: #1575

… checkpoint to avoid crash when running fp16
@timh (Contributor, Author) commented Dec 8, 2022

Closing this one in favor of #1618, to make the merge commit nicer.
