thread safety: context vars for batching context #3705
Merged: fselmo merged 10 commits into ethereum:main from fselmo:context-vars-for-batching-context on May 22, 2025
Conversation
- Give some thread safety to the batch request context by using ``ContextVar`` instead of setting a global state. This actually simplifies the logic quite a bit since we don't need to manage the batch context state at all.
- Removes ``_is_batching`` from provider classes.
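A minimal sketch of the idea, assuming a hypothetical ``_batching`` variable and ``batching_context`` helper (names are illustrative, not web3.py's actual internals): a ``ContextVar`` gives each thread and each asyncio task its own view of the batching flag, and resetting the token on exit removes the need for any global state management.

```python
from contextlib import contextmanager
from contextvars import ContextVar

# Hypothetical names for illustration only.
_batching: ContextVar[bool] = ContextVar("_batching", default=False)


@contextmanager
def batching_context():
    """Enable batching for the current thread / asyncio task only."""
    token = _batching.set(True)
    try:
        yield
    finally:
        # Resetting the token restores the previous value, so there is no
        # shared global flag to clean up afterwards.
        _batching.reset(token)


def is_batching() -> bool:
    return _batching.get()
```

Because ``ContextVar`` values are independent per thread and are copied into each new asyncio task, concurrent batch sessions no longer see each other's flag.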
Force-pushed from cde06fb to 7e20530
fselmo added a commit to fselmo/web3.py that referenced this pull request on May 21, 2025
fselmo added a commit to fselmo/web3.py that referenced this pull request on May 21, 2025
Force-pushed from c3cf179 to 69ee30c
fselmo added a commit to fselmo/web3.py that referenced this pull request on May 21, 2025
Force-pushed from 69ee30c to bd7e3c7
fselmo added a commit to fselmo/web3.py that referenced this pull request on May 21, 2025
Force-pushed from bd7e3c7 to d5c9ec7
fselmo added a commit to fselmo/web3.py that referenced this pull request on May 21, 2025
Force-pushed from d5c9ec7 to 179a210
pacrob reviewed on May 22, 2025
- Remove sets from parametrization since they can be in random order. Either use sorted or change to tuples / lists.
- Run with ``-n auto`` and set ``--maxprocesses=15`` for core test runs.
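For illustration, a sketch of why the ``set`` parametrizations had to go (test name and values are hypothetical): ``pytest-xdist`` requires every worker to collect the same test IDs in the same order, and iterating a ``set`` is not guaranteed to be stable across interpreter processes.

```python
import pytest

ADDRESSES = {"0xaaa", "0xbbb", "0xccc"}


# Parametrizing directly over the set can produce a different ordering in each
# xdist worker process; sorting it (or using a tuple/list) keeps collection
# deterministic across workers.
@pytest.mark.parametrize("address", sorted(ADDRESSES))
def test_address_has_hex_prefix(address):
    assert address.startswith("0x")
```

The core test runs then use the flags mentioned above, e.g. ``pytest -n auto --maxprocesses=15``.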
- Separate the batch send and batch receive functions for persistent connection providers in order to deterministically cache the request information and be able to retrieve it without any request id guesswork.
- Remove the idea of a batching request counter / id 🎉
- Correct typing expectations for ``BatchRequestInformation``
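A rough, generic sketch of the send/receive split (not the actual ``PersistentConnectionProvider`` code; the class and method names are invented): request information is cached keyed by the JSON-RPC ``id`` at send time, so the receive step can match each response deterministically without a separate batching counter.

```python
import json
from typing import Any, Dict, List, Tuple

RequestInfo = Tuple[str, Any]  # (method, params)


class BatchSessionSketch:
    """Illustrative only: split 'send' and 'receive' for a batch of requests."""

    def __init__(self) -> None:
        # Cached at send time, keyed by the JSON-RPC request id.
        self._request_info: Dict[int, RequestInfo] = {}

    def send_batch(self, requests: List[Tuple[int, str, Any]]) -> str:
        payload = []
        for request_id, method, params in requests:
            self._request_info[request_id] = (method, params)
            payload.append(
                {"jsonrpc": "2.0", "id": request_id, "method": method, "params": params}
            )
        return json.dumps(payload)

    def receive_batch(self, raw: str) -> List[Tuple[RequestInfo, Any]]:
        # Match each response to its cached request info by id -- no guesswork
        # about response ordering.
        return [
            (self._request_info.pop(item["id"]), item.get("result"))
            for item in json.loads(raw)
        ]
```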
- Turn off the concurrent batch requests test for LegacyWebsocketProvider.
- Turn off the batching test for EthereumTesterProvider.
- Asynchronous code making use of a threading ``id`` for the cache key doesn't make as much sense as using a loop ``id``. Running the core tests with ``pytest-xdist`` runners actually seems to have identified a bug in this code that was previously well hidden. Replace the threading id with loop id when generating unique cache keys in the http session manager.
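A sketch of the cache-key change, with assumed function names (the real http session manager differs): synchronous sessions stay keyed per thread, while async sessions are keyed by the running event loop, since several loops can be created on the same thread under test runners like ``pytest-xdist``.

```python
import asyncio
import threading


def sync_session_cache_key(endpoint_uri: str) -> str:
    # One session per thread is fine for synchronous providers.
    return f"{threading.get_ident()}:{endpoint_uri}"


def async_session_cache_key(endpoint_uri: str) -> str:
    # One session per event loop for asynchronous providers; a thread id would
    # wrongly share a session between different loops running on one thread.
    return f"{id(asyncio.get_running_loop())}:{endpoint_uri}"
```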
Force-pushed from 1542d9d to 3a05c6a
fselmo commented on May 22, 2025
kclowes (Collaborator) approved these changes on May 22, 2025 and left a comment:
LGTM! I left one nit, feel free to take or leave!
reedsa (Contributor) approved these changes on May 22, 2025 and left a comment:
lgtm
- Now that we separate send and receive for batching, we no longer have to create a workaround for response formatters for contract calls. We pass all response formatters together, as we do with regular contract calls, to be applied to the response as it comes back.
- From comment on PR ethereum#3705: move skip marker to decorator.
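A hypothetical illustration of the formatter idea (``apply_formatters`` here is not a web3.py API): bundle all response formatters with the request and apply them in order when the matching batch response arrives, rather than special-casing contract calls.

```python
from typing import Any, Callable, Sequence

Formatter = Callable[[Any], Any]


def apply_formatters(response: Any, formatters: Sequence[Formatter]) -> Any:
    # Apply each formatter in order to the raw response as it comes back.
    for formatter in formatters:
        response = formatter(response)
    return response


# e.g. a contract call's raw hex return data run through its decoder:
assert apply_formatters("0x2a", (lambda raw: int(raw, 16),)) == 42
```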
Force-pushed from 29e517c to 1289343
fselmo added a commit that referenced this pull request on May 22, 2025
simone1999 added a commit to IceCreamSwapCom/Web3Python that referenced this pull request on May 24, 2025
…reum/web3.py#3705 Let's hope it works :D
What was wrong?
Closes #3613
How was it fixed?
Use ``ContextVar`` rather than a global flag to toggle the batching context. This should better isolate the batching context to its own thread-safe context.
This also surfaced an issue with ``PersistentConnectionProvider`` batching requests, where we hadn't yet separated the send and receive functions to deterministically match responses with their requests by ``id``. Fixed that in this PR as well.
🎉 🎉 Bonus 🎉 🎉
I was able to switch our core tests to run via ``pytest-xdist``, insanely fast, by sorting any ``set``-parametrized params or using ``tuple``s / ``list``s instead, where appropriate. I am kind of shocked we hadn't gone this route before, but this is a pretty significant update.
Todo:
Cute Animal Picture