Fix type hints of tuner/batch_size_scaling.py #13518

Merged

Conversation

ar90n
Contributor

@ar90n ar90n commented Jul 3, 2022

What does this PR do?

If wrong parameters are given as arguments, _adjust_batch_size cannot adjust the batch size correctly, so this PR adds checks that raise a ValueError in those cases (a sketch of the pattern appears below).

Fixes part of #13445
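Below is a minimal sketch of the kind of guard described above. The signature, parameter names, and defaults are illustrative assumptions, not the exact _adjust_batch_size API:

```python
from typing import Optional


def _adjust_batch_size(current_size: int, factor: float = 2.0, value: Optional[int] = None) -> int:
    """Return the next batch size to try (illustrative sketch only)."""
    # Guard against parameters that would make the scaling loop misbehave,
    # raising ValueError instead of silently producing a wrong batch size.
    if factor <= 0:
        raise ValueError(f"`factor` should be positive, got {factor}.")
    if value is not None and value < 1:
        raise ValueError(f"`value` should be at least 1, got {value}.")
    # An explicit `value` wins; otherwise scale the current size by `factor`.
    return value if value is not None else int(current_size * factor)
```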

Does your PR introduce any breaking changes? If yes, please list them.

Before submitting

  • Was this discussed/approved via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you list all the breaking changes introduced by this pull request?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or minor internal changes/refactors)

PR review

Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the review guidelines. In short, see the following bullet-list:

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

Did you have fun?

Make sure you had fun coding 🙃

@ar90n ar90n changed the title from Fix type hints to Fix type hints of tuner/batch_size_scaling.py Jul 3, 2022
@ar90n ar90n force-pushed the fix-type-hints-tuner-batch_size_scaling.py branch from aac9127 to 3f1d70c Compare July 3, 2022 13:04
@ar90n ar90n force-pushed the fix-type-hints-tuner-batch_size_scaling.py branch from ec01aea to ab9bcba Compare July 3, 2022 13:22
@awaelchli awaelchli added the tuner label Jul 4, 2022
@awaelchli awaelchli added this to the pl:1.7 milestone Jul 4, 2022
ar90n and others added 5 commits July 4, 2022 23:44
To fix a type check issue, add an explicit None check.
This early return doesn't change the behavior of _is_valid_batch_size,
because has_len_all_ranks always returns True if the dataloader is None.
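
Roughly, the early-return pattern described in these commit messages looks like the sketch below. The simplified signature is an assumption; the real function takes additional trainer-related arguments:

```python
from typing import Optional, Sized


def _is_valid_batch_size(batch_size: int, dataloader: Optional[Sized]) -> bool:
    """Illustrative sketch, not the exact Lightning implementation."""
    # The explicit None check is for the type checker: past this line,
    # `dataloader` is narrowed from Optional[Sized] to Sized, so len()
    # below type-checks. Per the commit message above, the early return
    # preserves behavior because has_len_all_ranks reports True for a
    # None dataloader anyway.
    if dataloader is None:
        return True
    return batch_size <= len(dataloader)
```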
@mergify mergify bot added the ready (PRs ready to be merged) label and removed the has conflicts label Sep 22, 2022
@Borda Borda enabled auto-merge (squash) September 22, 2022 10:26
@mergify mergify bot added the has conflicts label and removed the ready (PRs ready to be merged) label Sep 23, 2022
@otaj
Contributor

otaj commented Sep 27, 2022

Hi, @ar90n, #11089 just got merged 🚀

auto-merge was automatically disabled September 28, 2022 15:11

Head branch was pushed to by a user without write access

@mergify mergify bot added the ready (PRs ready to be merged) label and removed the has conflicts label Sep 28, 2022
@otaj otaj enabled auto-merge (squash) September 29, 2022 09:11
@otaj otaj merged commit d377d0e into Lightning-AI:master Sep 29, 2022
@@ -31,6 +31,8 @@
class BatchSizeFinder(Callback):
SUPPORTED_MODES = ("power", "binsearch")

optimal_batch_size: Optional[int]
Contributor

Why is this annotated in the class definition instead of in the __init__?

Contributor Author

@carmocca
This is my mistake, and my fix wasn't pushed in time.
At the head of my fork branch, this annotation has already been moved into __init__.
May I create a new PR to move this annotation into __init__?

ar90n@7076c1b
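
For context, here is a minimal sketch of the two annotation placements under discussion. The class bodies are illustrative stand-ins, not the full BatchSizeFinder:

```python
from typing import Optional


# As merged: the attribute is annotated at class level, separate from
# the runtime assignment in __init__.
class BatchSizeFinderClassLevel:
    optimal_batch_size: Optional[int]

    def __init__(self) -> None:
        self.optimal_batch_size = None


# As proposed in the follow-up PR: the annotation sits on the assignment
# itself, keeping declaration and initialization together.
class BatchSizeFinderInitLevel:
    def __init__(self) -> None:
        self.optimal_batch_size: Optional[int] = None
```

Both placements are equivalent for type checkers; the difference is purely where the declaration lives.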

Contributor

Yes, please. Thank you 💜

@ar90n ar90n mentioned this pull request Sep 29, 2022
Labels
code quality · community (This PR is from the community) · pl (Generic label for PyTorch Lightning package) · ready (PRs ready to be merged) · tuner