
Conversation

noemotiovon
Contributor

What this PR does / why we need it?

This PR resolves the issue with inference on the Ray backend. For more details, see here.

Does this PR introduce any user-facing change?

No.

How was this patch tested?

Validation was performed based on v0.7.3, and the specific validation script can be found here.

@@ -0,0 +1,20 @@
import vllm
Collaborator

move this file to patch folder

Signed-off-by: Chenguang Li <[email protected]>
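Moving the file into the patch folder implies it monkey patches the upstream class at import time. A generic sketch of that technique, using stand-in names (a fake `upstream` module and wrapper classes, not the actual vllm / vllm-ascend code):

```python
import types

# Stand-in for the upstream module and class (hypothetical names,
# used only to illustrate the monkey-patch technique).
upstream = types.ModuleType("upstream")

class RayWorkerWrapper:
    """Stand-in for the upstream Ray worker wrapper."""
    def device(self):
        return "cuda"

upstream.RayWorkerWrapper = RayWorkerWrapper

class NPURayWorkerWrapper(RayWorkerWrapper):
    """Patched subclass overriding backend-specific behavior."""
    def device(self):
        return "npu"

# Rebind the module attribute so any code that looks up
# upstream.RayWorkerWrapper afterwards gets the patched class.
upstream.RayWorkerWrapper = NPURayWorkerWrapper

print(upstream.RayWorkerWrapper().device())  # prints "npu"
```

The key point is that the rebinding must happen before consumers import the name, which is why such files live in a patch folder that is imported early.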
@noemotiovon
Contributor Author

@wangxiyuan,
Thank you for your suggestion! Changes have been made based on your review comments. Could you help merge it? ☺️

@wangxiyuan
Collaborator

CI failed

@noemotiovon
Contributor Author

The CI has now passed. The case where Ray is not installed had previously been overlooked.
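Handling the missing-Ray case typically means checking whether the module is importable before applying any Ray-specific patch. A minimal sketch, assuming the fix simply skips the patch when Ray is absent (`apply_ray_patch` is a hypothetical placeholder, not the PR's actual function):

```python
import importlib.util

# Detect whether the optional "ray" dependency is installed without
# importing it (find_spec returns None when the package is absent).
HAS_RAY = importlib.util.find_spec("ray") is not None

def apply_ray_patch():
    # Placeholder for the RayWorkerWrapper patch logic.
    return "patched"

# Only apply the Ray-specific patch when Ray is available, so the
# package still imports cleanly on systems without Ray.
result = apply_ray_patch() if HAS_RAY else "skipped: ray not installed"
print(result)
```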

@wangxiyuan wangxiyuan merged commit 8a62c1f into vllm-project:v0.7.3-dev Feb 27, 2025
10 checks passed
@wangxiyuan
Collaborator

I'm fine with this change on 0.7.3. For the main branch, let's consider a more generic approach.

@Yikun Yikun changed the title [Build] Make CI work on 0.7.3-dev (#140) [Build] Monkey patch RayWorkerWrapper to make CI work on 0.7.3-dev (#140) Mar 5, 2025