Proposed refactoring or deprecation
Motivation
We have added the `strategy` argument to `Trainer` in this PR (a usage sketch follows the list below).
The next immediate refactors required are:
- Tests (1/n) tests: Use strategy flag instead of accelerator for training strategies #9931
- Code/warnings updates Update accelerator connector messages after the addition of strategy #9937
- Docs: general usage Update strategy flag in docs #10000
- Docs: Trainer flag documentation (accelerator, strategy) Update Trainer flag docs for strategy #10042
- Tutorials
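For reference, a minimal sketch of the flag change this work supports, assuming PyTorch Lightning >= 1.5 (exact argument names and defaults may vary by version):

```python
from pytorch_lightning import Trainer

# Deprecated: the training strategy used to be passed via `accelerator`.
# trainer = Trainer(accelerator="ddp", gpus=2)

# New: `strategy` selects the training strategy, while `accelerator`
# selects the hardware backend.
trainer = Trainer(strategy="ddp", accelerator="gpu", devices=2)
```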
After the above refactors, we will work through the following internal updates:
- Remove `DistributedType.DDP_CPU`. This should not be a valid `strategy` value.
- Remove `DistributedType.TPU_SPAWN`.
- Rename `DistributedType` to `StrategyType` (see the sketch after this list). Deprecate `DistributedType` in favor of `StrategyType` #10505
- Rename `DeviceType` to `AcceleratorType`. Deprecate `DeviceType` in favor of `_AcceleratorType` #10503
- Remove user-facing `distributed_backend` references in code and error messages. Remove deprecated `distributed_backend` from `Trainer` #10017
- Rename the internal `self.distributed_backend` references.
- Rename `AcceleratorConnector._device_type` to `_accelerator_type`.
- Rename `AcceleratorConnector._distrib_type` to `_strategy_type`.
- Define all non-supported accelerator-strategy combinations. Is this already done?
- Rename `TrainingTypePlugin` and its references to `StrategyPlugin`. Introduce `Strategy` in favour of `TrainingTypePlugin` #10548
- Remove the "test_accelerator_x" tests in `tests/accelerators/test_accelerator_connector.py` that test for the deprecated usage of `accelerator="strategy_name"`.
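As one illustration of the rename items above, here is a hypothetical deprecation shim, not Lightning's actual implementation, that keeps the old enum name importable while warning callers; the module name and enum members shown are assumptions:

```python
# strategy_type.py - hypothetical module sketching the rename with a shim.
import warnings
from enum import Enum


class StrategyType(Enum):
    """Replacement for the old DistributedType enum (members assumed)."""
    DDP = "ddp"
    DDP_SPAWN = "ddp_spawn"
    DEEPSPEED = "deepspeed"


def __getattr__(name: str):
    # PEP 562 module-level __getattr__: accessing the old name still works,
    # including `from strategy_type import DistributedType`, but it emits a
    # DeprecationWarning pointing at the new name.
    if name == "DistributedType":
        warnings.warn(
            "`DistributedType` is deprecated; use `StrategyType` instead.",
            DeprecationWarning,
            stacklevel=2,
        )
        return StrategyType
    raise AttributeError(f"module {__name__!r} has no attribute {name!r}")
```

With a shim like this, existing imports keep working during the deprecation window while surfacing a warning, so call sites can migrate gradually before the old name is removed.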
If you enjoy Lightning, check out our other projects! ⚡
- Metrics: Machine learning metrics for distributed, scalable PyTorch applications.
- Flash: The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, fine-tuning, and solving problems with deep learning.
- Bolts: Pretrained SOTA deep learning models, callbacks, and more for research and production with PyTorch Lightning and PyTorch.
- Lightning Transformers: Flexible interface for high-performance research using SOTA Transformers, leveraging PyTorch Lightning, Transformers, and Hydra.