
[Question] What is the minimum number of GPUs required to train DeepSeek 671B with GRPO? How about using LoRA? #6219


Open
LiuShixing opened this issue Feb 25, 2025 · 1 comment

Comments

@LiuShixing

https://company.hpc-ai.com/blog/shocking-release-deepseek-671b-fine-tuning-guide-revealed-unlock-the-upgraded-deepseek-suite-with-one-click-ai-players-ecstatic

The article above only gives the GPU requirements for SFT with LoRA. What about GRPO?
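For anyone who wants to prototype the setup at small scale first, here is a minimal sketch of GRPO combined with LoRA using Hugging Face's trl and peft libraries. This is not the ColossalAI script from the linked guide; the model name, dataset, reward function, and hyperparameters are placeholders for illustration only.

```python
# Minimal GRPO + LoRA sketch with Hugging Face trl/peft (illustrative only;
# NOT the ColossalAI script from the linked guide — model, dataset, reward
# function, and hyperparameters below are placeholders).
from datasets import load_dataset
from peft import LoraConfig
from trl import GRPOConfig, GRPOTrainer

# Placeholder dataset with a "prompt" column, as GRPOTrainer expects.
dataset = load_dataset("trl-lib/tldr", split="train")

# Toy reward: prefer completions close to 200 characters.
def reward_len(completions, **kwargs):
    return [-abs(200 - len(c)) for c in completions]

training_args = GRPOConfig(
    output_dir="grpo-lora-sketch",
    per_device_train_batch_size=4,
    num_generations=4,        # completions sampled per prompt (the GRPO "group")
    max_completion_length=128,
)

trainer = GRPOTrainer(
    model="Qwen/Qwen2.5-0.5B-Instruct",  # placeholder small model, not 671B
    reward_funcs=reward_len,
    args=training_args,
    train_dataset=dataset,
    # LoRA freezes the base weights and trains only small adapter matrices,
    # which removes most gradient/optimizer-state memory but not the memory
    # for the frozen weights themselves.
    peft_config=LoraConfig(r=16, lora_alpha=32, task_type="CAUSAL_LM"),
)
trainer.train()
```

Note that even with LoRA, the frozen 671B base weights still have to live somewhere (on GPU or offloaded), so LoRA mainly cuts the gradient and optimizer-state cost rather than the base-model footprint.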

@mahaocong90

Same question. Also, if I reduce --max_length from 256 to 128 and --batch_size from 24 to 12, will that reduce fine-tuning memory consumption?
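In general, yes: both flags shrink activation memory roughly linearly, so halving both should cut activations to about a quarter, while the memory for weights, gradients, and optimizer states is unchanged. A back-of-envelope sketch (the scaling assumption is illustrative, not measured from this project):

```python
# Back-of-envelope activation scaling (an illustrative assumption, not a
# measurement from this project). Per-layer activation memory grows roughly
# with batch_size * seq_len * hidden_dim; attention can add a seq_len**2
# term unless a memory-efficient kernel (e.g. FlashAttention) is used.

def activation_ratio(batch_before, seq_before, batch_after, seq_after):
    """Approximate ratio of activation memory after/before, assuming
    linear scaling in batch size and sequence length."""
    return (batch_after * seq_after) / (batch_before * seq_before)

# --batch_size 24 -> 12 and --max_length 256 -> 128:
print(activation_ratio(24, 256, 12, 128))  # 0.25, i.e. ~4x less activation memory
```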
