
Conversation

@jjyao (Collaborator) commented Oct 27, 2025

Description

This test uses 65 nodes but runs for less than 10 minutes, so the cost is small.

Related issues

Link related issues: "Fixes #1234", "Closes #1234", or "Related to #1234".

Additional information

Optional: Add implementation details, API changes, usage examples, screenshots, etc.

@jjyao jjyao added the `go` label (add ONLY when ready to merge, run all tests) Oct 27, 2025
@gemini-code-assist (bot, Contributor) left a comment

Code Review

This pull request changes the many_actors release test frequency to nightly. This is a simple configuration change. My main feedback is regarding a potential inefficiency: the test's timeout is set to 60 minutes, while the PR description claims a runtime of less than 10 minutes. I recommend adjusting this timeout in a follow-up to better reflect the actual runtime, which would save resources and provide faster feedback on failures.

working_dir: benchmarks

-frequency: nightly-3x
+frequency: nightly
@gemini-code-assist (bot) left a review comment (severity: medium)

The change to a nightly frequency is fine. However, there's a significant mismatch between the test's configured timeout and its reported runtime. The PR description states the test runs in under 10 minutes, but the timeout is set to 3600 seconds (60 minutes). If the test is indeed this fast, please consider lowering the timeout in a follow-up PR (e.g., to 1200s). A tighter timeout would prevent the 65-node cluster from being held unnecessarily in case of a hang, saving significant resources and providing quicker failure alerts.
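For reference, the reviewer's suggestion could be sketched roughly as follows in the release test YAML. This is a hypothetical fragment, not a merged change: the field layout follows the general shape of Ray's release test configuration, and the 1200-second value is the reviewer's proposal.

```yaml
# Hypothetical sketch of the many_actors release test entry after this PR,
# with the reviewer's suggested tighter timeout applied in a follow-up.
- name: many_actors
  working_dir: benchmarks

  frequency: nightly     # changed from nightly-3x in this PR

  run:
    # Reviewer's suggestion: ~1200s instead of the current 3600s (60 min),
    # so a hang releases the 65-node cluster and alerts sooner.
    timeout: 1200
```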

@ray-gardener (bot) added the `core` (Issues that should be addressed in Ray Core) and `release-test` (release test) labels Oct 27, 2025
@edoakes (Collaborator) commented Oct 27, 2025

What's the motivation? Do you think it's a particularly high-information test?

@jjyao (Collaborator, Author) commented Oct 27, 2025

This is the only test we have now for actor creation throughput.

@edoakes (Collaborator) commented Oct 27, 2025

> This is the only test we have now for actor creation throughput.

Have we seen frequent regressions or failures in the test? If not, I'd suggest just leaving it at 3x per week.

@github-actions (bot) commented
This pull request has been automatically marked as stale because it has not had
any activity for 14 days. It will be closed in another 14 days if no further activity occurs.
Thank you for your contributions.

You can always ask for help on our discussion forum or Ray's public slack channel.

If you'd like to keep this open, just leave any comment, and the stale label will be removed.

@github-actions github-actions bot added the stale The issue is stale. It will be closed within 7 days unless there are further conversation label Nov 11, 2025
@github-actions (bot) commented
This pull request has been automatically closed because there has been no more activity in the 14 days
since being marked stale.

Please feel free to reopen or open a new pull request if you'd still like this to be addressed.

Again, you can always ask for help on our discussion forum or Ray's public slack channel.

Thanks again for your contribution!

@github-actions github-actions bot closed this Nov 25, 2025



Linked issue (merging may close it): Ray fails to serialize self-reference objects
