
Conversation

Contributor

@charlotte12l charlotte12l commented Sep 4, 2025

Purpose

In RLHF use cases, an internal customized trainer and checkpoint format may be used. To support this kind of customization, we also need a way to inject a custom vLLM config parser. This PR makes config parsing more extensible by allowing plugin registration of config parsers.

See details in #23009. This PR aims to support the "custom configuration plugin system" mentioned in the proposal.

Note: this is the first step in splitting out the configuration logic; we will continue working on decoupling the customized config parser from HF.

Example:

from pathlib import Path
from typing import Optional, Union

from transformers import PretrainedConfig

from vllm.transformers_utils.config import (get_config_parser,
                                            register_config_parser)
from vllm.transformers_utils.config_parser_base import ConfigParserBase


@register_config_parser("custom_config_parser")
class CustomConfigParser(ConfigParserBase):

    def parse(self,
              model: Union[str, Path],
              trust_remote_code: bool,
              revision: Optional[str] = None,
              code_revision: Optional[str] = None,
              **kwargs) -> tuple[dict, PretrainedConfig]:
        raise NotImplementedError


type(get_config_parser("custom_config_parser"))
# <class 'CustomConfigParser'>
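Under the hood, a registration API like this can be backed by a simple name-to-class registry. The sketch below is a hypothetical, stdlib-only illustration of that pattern; the names `register_config_parser`/`get_config_parser` mirror the PR's API, but the bodies are not vLLM's actual implementation:

```python
# Hypothetical sketch of a name -> parser-class registry;
# not vLLM's actual implementation.
from abc import ABC, abstractmethod

_CONFIG_PARSERS: dict[str, type] = {}


class ConfigParserBase(ABC):
    """Minimal stand-in for vLLM's ConfigParserBase."""

    @abstractmethod
    def parse(self, model: str, trust_remote_code: bool, **kwargs):
        ...


def register_config_parser(name: str):
    """Class decorator that registers a ConfigParserBase subclass under `name`."""
    def wrapper(cls: type) -> type:
        if not issubclass(cls, ConfigParserBase):
            raise TypeError(f"{cls.__name__} must subclass ConfigParserBase")
        if name in _CONFIG_PARSERS:
            raise ValueError(f"config parser {name!r} is already registered")
        _CONFIG_PARSERS[name] = cls
        return cls
    return wrapper


def get_config_parser(name: str) -> ConfigParserBase:
    """Instantiate the parser class registered under `name`."""
    try:
        return _CONFIG_PARSERS[name]()
    except KeyError:
        raise ValueError(f"unknown config parser {name!r}") from None


@register_config_parser("demo_parser")
class DemoParser(ConfigParserBase):
    def parse(self, model: str, trust_remote_code: bool, **kwargs):
        return {}, None


print(type(get_config_parser("demo_parser")).__name__)  # -> DemoParser
```

Raising on duplicate names (rather than silently overwriting) makes registration conflicts between plugins fail loudly at import time.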

Test Plan

python -m pytest tests/transformers_utils/test_config_parser_registry.py

Test Result

passed


Essential Elements of an Effective PR Description Checklist
  • The purpose of the PR, such as "Fix some issue (link existing issues this PR will resolve)".
  • The test plan, such as providing test command.
  • The test results, such as pasting the results comparison before and after, or e2e results
  • (Optional) The necessary documentation update, such as updating supported_models.md and examples for a new model.
  • (Optional) Release notes update. If your change is user facing, please update the release notes draft in the Google Doc.

Signed-off-by: Xingyu Liu <[email protected]>
Contributor

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request introduces an excellent plugin system for configuration parsers, enhancing vLLM's extensibility. The refactoring of existing logic into distinct parser classes is clean and well-executed. My feedback primarily addresses inconsistencies in terminology within the new API, particularly in logging and error messages. Correcting these instances, where 'model loader' appears to be a remnant from copy-pasting and should be 'config parser', will significantly improve the developer experience for this new feature.

charlotte12l and others added 4 commits September 4, 2025 13:43
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Signed-off-by: Xingyu Liu <[email protected]>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Signed-off-by: Xingyu Liu <[email protected]>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Signed-off-by: Xingyu Liu <[email protected]>
@22quinn 22quinn added ready ONLY add when PR is ready to merge/full CI is needed rl Related to RL workflows labels Sep 5, 2025
Collaborator

@22quinn 22quinn left a comment


Looks good as a first step

@22quinn 22quinn enabled auto-merge (squash) September 7, 2025 04:11
@22quinn 22quinn disabled auto-merge September 7, 2025 20:13
Signed-off-by: Xingyu Liu <[email protected]>
@LysandreJik

Thanks for the PR! Let us know if there's anything we can do on HF's side to make this work simpler; as it turns out to be an issue, we could simplify the configuration serialization on our side so that a parser isn't required.

Let us know if useful and we'd be happy to help.

@charlotte12l
Contributor Author

> Thanks for the PR! Let us know if there's anything we can do on HF's side to make this work simpler; as it turns out to be an issue, we could simplify the configuration serialization on our side so that a parser isn't required.
>
> Let us know if useful and we'd be happy to help.

@LysandreJik Thank you! I think a parser can't be avoided if a model is not using the HF format?
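To illustrate why a non-HF checkpoint needs a parser: an internal trainer might write a config.json with its own field names, and the parser's job is to translate those onto the HF-style keys vLLM expects. A stdlib-only sketch with invented field names and mapping (nothing here is from the PR's code):

```python
# Hypothetical sketch: translating an internal (non-HF) checkpoint config
# into HF-style keys. All field names below are invented for illustration.
import json
from pathlib import Path

# Internal key -> HF-style key (invented mapping).
_KEY_MAP = {
    "n_layers": "num_hidden_layers",
    "d_model": "hidden_size",
    "n_heads": "num_attention_heads",
}


def parse_internal_config(model_dir: str) -> dict:
    """Read an internal config.json and rename its keys to HF-style ones.

    Keys without a mapping entry are passed through unchanged.
    """
    raw = json.loads((Path(model_dir) / "config.json").read_text())
    return {_KEY_MAP.get(k, k): v for k, v in raw.items()}


if __name__ == "__main__":
    import tempfile
    with tempfile.TemporaryDirectory() as d:
        (Path(d) / "config.json").write_text(
            json.dumps({"n_layers": 24, "d_model": 2048, "vocab_size": 50000}))
        print(parse_internal_config(d))
        # {'num_hidden_layers': 24, 'hidden_size': 2048, 'vocab_size': 50000}
```

A real config parser would additionally wrap the resulting dict in a `PretrainedConfig` subclass, which is exactly the step HF serialization cannot do for a format it has never seen.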

@facebook-github-bot

@charlotte12l has imported this pull request. If you are a Meta employee, you can view this in D82033854.

@@ -0,0 +1,37 @@
# SPDX-License-Identifier: Apache-2.0
Member


Please add this directory to "Async Engine, Inputs, Utils, Worker Test" so it's actually being run in CI

Contributor Author


Thanks! Added in #24615

@vllm-bot vllm-bot merged commit 9fb74c2 into vllm-project:main Sep 10, 2025
35 of 38 checks passed
skyloevil pushed a commit to skyloevil/vllm that referenced this pull request Sep 13, 2025
Signed-off-by: Xingyu Liu <[email protected]>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
FeiDaLI pushed a commit to FeiDaLI/vllm that referenced this pull request Sep 25, 2025
Signed-off-by: Xingyu Liu <[email protected]>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
xuebwang-amd pushed a commit to xuebwang-amd/vllm that referenced this pull request Oct 10, 2025
Signed-off-by: Xingyu Liu <[email protected]>
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Signed-off-by: xuebwang-amd <[email protected]>

7 participants