@Qubitium Qubitium commented Feb 27, 2025

  • Load LoraConfig.r
  • HF Load and use LoraConfig.target_modules
  • GPTQModel Load and use LoraConfig.rank_patterns
  • HF Load and use LoraConfig.rank_patterns
  • dynamic compat (generate target_modules and rank_patterns based on dynamic)
  • Save LoraConfig (r, target_modules, rank_patterns)
  • CI tests for dynamic and non-dynamic
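As a rough illustration of the save/load round-trip this checklist describes, here is a minimal sketch. It does not use peft or GPTQModel; `MiniLoraConfig`, `save_lora_config`, and `load_lora_config` are hypothetical stand-ins modeling only the three fields the checklist mentions (`r`, `target_modules`, `rank_patterns`), serialized in the spirit of peft's `adapter_config.json`:

```python
import json
import os
import tempfile
from dataclasses import asdict, dataclass, field

# Hypothetical minimal stand-in for peft's LoraConfig; only the fields
# named in the checklist are modeled here.
@dataclass
class MiniLoraConfig:
    r: int = 8                                       # default LoRA rank
    target_modules: list = field(default_factory=list)
    rank_patterns: dict = field(default_factory=dict)  # per-module rank overrides

def save_lora_config(cfg: MiniLoraConfig, path: str) -> None:
    # Write the config as JSON, analogous to an adapter_config.json.
    with open(path, "w") as f:
        json.dump(asdict(cfg), f, indent=2)

def load_lora_config(path: str) -> MiniLoraConfig:
    # Read the JSON back and rebuild the config object.
    with open(path) as f:
        return MiniLoraConfig(**json.load(f))

# Round-trip: save, reload, and check the fields survive intact.
cfg = MiniLoraConfig(
    r=16,
    target_modules=["q_proj", "v_proj"],
    rank_patterns={"model.layers.0.q_proj": 32},
)
path = os.path.join(tempfile.mkdtemp(), "adapter_config.json")
save_lora_config(cfg, path)
loaded = load_lora_config(path)
assert loaded == cfg
```

In the real PR the save path would emit a peft-compatible adapter config so HF/peft tooling can consume it directly; this sketch only shows the shape of the round-trip being tested.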

Signed-off-by: Qubitium <[email protected]>
@Qubitium Qubitium changed the title save eora to hf format [WIP] save eora to hf format Feb 27, 2025
@Qubitium Qubitium changed the title [WIP] save eora to hf format [WIP] save/load peft lora Feb 27, 2025
@Qubitium Qubitium changed the title [WIP] save/load peft lora save/load peft lora Feb 28, 2025
@Qubitium Qubitium marked this pull request as ready for review February 28, 2025 05:42
ZX-ModelCloud and others added 2 commits February 28, 2025 05:46
Signed-off-by: ZX-ModelCloud <[email protected]>
Signed-off-by: Qubitium <[email protected]>
@Qubitium Qubitium merged commit f7b86a5 into main Mar 1, 2025
4 checks passed
@Qubitium Qubitium deleted the lora-format branch March 4, 2025 12:17