
Conversation

@timzsu (Contributor) commented Apr 22, 2025

This PR fixes the FlashInfer backend warning "Current VLLM config is not set" during CUDA graph capture and closes #13207. The fix follows the method proposed in that issue.
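For context, here is a minimal, hypothetical sketch of the kind of fix described (assuming vLLM's `set_current_vllm_config` context manager from `vllm.config`; the `capture_graphs` function and `model_runner` object are illustrative placeholders, not the PR's actual diff):

```python
# Hypothetical sketch: wrap CUDA graph capture in vLLM's
# set_current_vllm_config context manager so that code running during
# capture (here, the FlashInfer backend) sees the active config instead
# of warning that it is unset.
from vllm.config import VllmConfig, set_current_vllm_config


def capture_graphs(model_runner, vllm_config: VllmConfig):
    # model_runner and capture_model() are illustrative placeholders.
    with set_current_vllm_config(vllm_config):
        # Inside this context, get_current_vllm_config() returns
        # vllm_config, so the backend no longer logs
        # "Current VLLM config is not set".
        model_runner.capture_model()
```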


👋 Hi! Thank you for contributing to the vLLM project.

💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels.

Just a reminder: PRs do not trigger a full CI run by default. Instead, they only run the fastcheck CI, which runs a small but essential subset of CI tests to quickly catch errors. You can run other CI tests on top of those by going to your fastcheck build on the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run full CI, PR reviewers can either add the ready label to the PR or enable auto-merge.

🚀

@timzsu changed the title from "Pass in correct VLLM config" to "[BugFix] Pass in correct VLLM config in FlashInfer backend (#13507)" Apr 22, 2025
@timzsu changed the title to "[BugFix] Pass in correct VLLM config in FlashInfer backend (#13207)" Apr 22, 2025
@timzsu force-pushed the zsu/fix-flashinfer-config-not-set branch from 949763a to d5593e2 April 22, 2025 09:35
@timzsu marked this pull request as ready for review April 22, 2025 09:41
@DarkLight1337 (Member) left a comment

Thanks for fixing!

@DarkLight1337 enabled auto-merge (squash) April 22, 2025 09:47
@github-actions bot added the "ready" label (ONLY add when PR is ready to merge/full CI is needed) Apr 22, 2025
@timzsu (Contributor, Author) commented Apr 22, 2025

Hi @DarkLight1337, I checked the failed tests, and the failures don't seem related to my changes. Do you know what I should do to resolve them?

@DarkLight1337 (Member) commented

I'll force-merge it.

@vllm-bot merged commit f961d7f into vllm-project:main Apr 22, 2025
46 of 51 checks passed
frieda-huang pushed a commit to frieda-huang/vllm that referenced this pull request Apr 23, 2025
jikunshang pushed a commit to jikunshang/vllm that referenced this pull request Apr 29, 2025
lk-chen pushed a commit to lk-chen/vllm that referenced this pull request Apr 29, 2025
adobrzyn pushed a commit to HabanaAI/vllm-fork that referenced this pull request Apr 30, 2025
RichardoMrMu pushed a commit to RichardoMrMu/vllm that referenced this pull request May 12, 2025

Labels

ready (ONLY add when PR is ready to merge/full CI is needed)

Development

Successfully merging this pull request may close these issues.

[Bug]: VLLM config not set when using Flash Infer backend (#13207)
