
Conversation

@DarkLight1337 (Member) commented Sep 3, 2024

This PR follows up on #5049 by adding support for multi-modal inputs in LLM.chat. To do so, I implemented a synchronous version of the chat parser.

cc @petersalas since I further refactored your parsing logic

Also fixes #8117
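
For reference, a minimal sketch of the kind of call this enables (the model name and image URL below are placeholders, and the message schema is assumed to be the OpenAI-style chat format with mixed text/image content parts):

```python
from vllm import LLM

# Placeholder multi-modal model for illustration; any vision-language
# model supported by vLLM could be used instead.
llm = LLM(model="llava-hf/llava-1.5-7b-hf")

# OpenAI-style chat message whose content mixes text and an image URL.
messages = [
    {
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe this image."},
            {
                "type": "image_url",
                "image_url": {"url": "https://example.com/some_image.jpg"},
            },
        ],
    }
]

# LLM.chat applies the model's chat template, parses the multi-modal
# content, and runs offline generation synchronously.
outputs = llm.chat(messages)
print(outputs[0].outputs[0].text)
```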

github-actions bot commented Sep 3, 2024

👋 Hi! Thank you for contributing to the vLLM project.
Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs, which consists of a small, essential subset of CI tests to quickly catch errors. You can run additional CI tests on top of the default ones by unblocking the steps in your fast-check build in the Buildkite UI.

Once the PR is approved and ready to go, please make sure to run full CI as it is required to merge (or just use auto-merge).

To run full CI, you can do one of these:

  • Comment /ready on the PR
  • Add ready label to the PR
  • Enable auto-merge.

🚀

@petersalas (Contributor) left a comment

LGTM

@simon-mo (Collaborator) left a comment


Stamp on behalf of Roger

@DarkLight1337 DarkLight1337 enabled auto-merge (squash) September 4, 2024 03:41
@github-actions github-actions bot added the ready label Sep 4, 2024
@DarkLight1337 DarkLight1337 merged commit 855c262 into vllm-project:main Sep 4, 2024
54 of 55 checks passed
@DarkLight1337 DarkLight1337 deleted the multimodal-offine-chat branch September 4, 2024 05:48
Alvant pushed a commit to compressa-ai/vllm that referenced this pull request Oct 26, 2024
LeiWang1999 pushed a commit to LeiWang1999/vllm-bitblas that referenced this pull request Mar 26, 2025
Labels
ready
Projects
None yet
Development

Successfully merging this pull request may close these issues.

[Doc]: LLM.chat() docstring incorrectly suggests multiple chats can be generated in one call
3 participants