Fix for bug #4038 to the teardown_exact method #4055

Closed
wants to merge 22 commits into from

Conversation

arobbins805

Changes the teardown behavior so that teardown code is called from within the pytest_runtest_teardown hook rather than from within the pytest_runtest_setup hook when a fixture of scope higher than function has been parameterized.
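As a minimal illustration of the affected scenario (a sketch for context, not code from this PR), consider a session-scoped, parametrized fixture. Per the description above, with the current behavior the finalizer for the old parameter runs while the next test is being set up (inside `pytest_runtest_setup`); this change runs it during the teardown phase of the last test that used that parameter instead.

```python
# Illustrative only: run with `pytest -s` to observe the ordering. With the
# current behavior (as described above), "TEARDOWN backend[a]" is printed while
# setting up the first test that requests backend[b]; with this PR it would be
# printed during the teardown phase of the last test that used backend[a].
import pytest

@pytest.fixture(scope="session", params=["a", "b"])
def backend(request):
    print(f"\nSETUP backend[{request.param}]")
    yield request.param
    print(f"\nTEARDOWN backend[{request.param}]")

def test_one(backend):
    assert backend in ("a", "b")

def test_two(backend):
    assert backend in ("a", "b")
```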

  • Create a new changelog file in the changelog folder, with a name like <ISSUE NUMBER>.<TYPE>.rst. See changelog/README.rst for details.
  • Target the master branch for bug fixes, documentation updates and trivial changes.
  • Target the features branch for new features and removals/deprecations.
  • Include documentation when adding new features.
  • Include new tests or update existing tests when applicable.
  • Add yourself to AUTHORS in alphabetical order.

@codecov

codecov bot commented Sep 29, 2018

Codecov Report

Merging #4055 into master will decrease coverage by 0.02%.
The diff coverage is 78.26%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master    #4055      +/-   ##
==========================================
- Coverage   95.69%   95.67%   -0.03%     
==========================================
  Files         113      113              
  Lines       25178    25222      +44     
  Branches     2498     2516      +18     
==========================================
+ Hits        24095    24131      +36     
- Misses        765      771       +6     
- Partials      318      320       +2
Impacted Files             Coverage           Δ
src/_pytest/runner.py      94.33% <78.26%>    (-2.75%) ⬇️
src/_pytest/fixtures.py    97.49% <0%>        (-0.42%) ⬇️
src/_pytest/terminal.py    91.74% <0%>        (+0.82%) ⬆️

Continue to review full report at Codecov.

Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update aae0286...2f31df1. Read the comment docs.

@coveralls

Coverage Status

Coverage decreased (-0.001%) to 93.801% when pulling 5f32786 on arobbins805:master into d8d7f73 on pytest-dev:master.

@arobbins805
Author

Hi, could someone please look at my pull request when you have some time? This change causes the teardown code to run during the pytest_runtest_teardown hook, rather than during pytest_runtest_setup, when fixtures are parameterized. Thanks, Andrew

@nicoddemus
Member

Hi @arobbins805,

Thanks for the PR!

This is a quite sensitive part of pytest's internals, so it will take a while for us to review it properly; please bear with us!

@arobbins805
Author

Hi @nicoddemus,

Thanks for your response. I look forward to hearing your analysis of my change.

@arobbins805
Author

Hi @nicoddemus,

Is there any update on this PR's status or anything that I can help with?

Thanks, Andrew

@nicoddemus
Member

Hi @arobbins805,

> Is there any update on this PR's status or anything that I can help with?

Not at the moment, thanks. Sorry about the delay; this is a very sensitive area of the code and we need to set aside a good block of time to review it thoroughly, as this kind of change tends to introduce a ton of regressions if we are not careful.

@nicoddemus
Member

Hi @arobbins805,

Sorry for the delay on this, as I said, it is a very sensitive area of the code so it needs careful consideration.

Currently, fixtures are created/destroyed in a lazy fashion: fixtures are requested during setup. At this time, the cache mechanism is checked to see if the currently cached value already satisfies the fixture request. If it does, that cached value is returned. If not, the fixture function is called, the returned value is cached, and the value is used as the fixture value. At this point, if the cache contained a previously cached value for that fixture, it is torn down (still during setup).
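A small, self-contained toy model of the scheme described above may help (the names are illustrative and are not pytest's real API; the actual logic lives around FixtureDef in src/_pytest/fixtures.py and also deals with scopes, finalizer chains and exception handling):

```python
# Toy model of the lazy cache described above; illustrative, not pytest's API.
class FixtureDefModel:
    def __init__(self, func):
        self.func = func            # fixture function: param -> (value, finalizer)
        self.cached_result = None   # (value, key) of the last computed value
        self.finalizer = None       # teardown callable for the cached value

    def resolve(self, key):
        if self.cached_result is not None:
            value, cached_key = self.cached_result
            if cached_key == key:
                return value        # cache hit: reuse the existing value
            # cache miss: the *old* value is torn down here, i.e. during the
            # setup of the test that requested the new key
            self.finish()
        value, finalizer = self.func(key)
        self.cached_result = (value, key)
        self.finalizer = finalizer
        return value

    def finish(self):
        if self.finalizer is not None:
            self.finalizer()
            self.finalizer = None
        self.cached_result = None

# The second resolve("a") reuses the cached value; resolve("b") first tears
# down the "a" value (printing "teardown a") and only then builds "b".
fd = FixtureDefModel(lambda p: (f"db[{p}]", lambda: print(f"teardown {p}")))
fd.resolve("a"); fd.resolve("a"); fd.resolve("b")
```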

The reasoning behind this PR is solid: instead of being reactive to fixture creation/destruction (only destroying a fixture when we need to replace it with a new value), destroy fixtures proactively by looking ahead during pytest_runtest_teardown and tearing down fixtures that we know will need to be recomputed in the next setup phase.
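In terms of the toy model above, the look-ahead idea could be expressed roughly like this (again purely illustrative; the actual PR operates on pytest's real setup state and fixture definitions, not on these names):

```python
# Illustrative look-ahead teardown: during the teardown phase of the current
# test, finalize any cached fixture value that the next test will not reuse,
# instead of leaving that to the next setup phase. In real pytest terms, this
# is the work the PR moves into pytest_runtest_teardown(item, nextitem).
def lookahead_teardown(fixture_models, next_requested_keys):
    # fixture_models: dict of fixture name -> FixtureDefModel (see sketch above)
    # next_requested_keys: dict of fixture name -> key the next test will
    # request, or None when there is no next test
    for name, fd in fixture_models.items():
        if fd.cached_result is None:
            continue
        _, cached_key = fd.cached_result
        wanted = None if next_requested_keys is None else next_requested_keys.get(name)
        if wanted != cached_key:
            fd.finish()   # torn down now, attributed to the last test that used it
```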

Unfortunately, our current cache mechanism is brittle, and there are probably plugins and test suites that rely on the current behavior. While all tests pass, it is hard to predict what sort of consequences and breakage might come out of this change. As an example, in the past we tried to introduce a new "any" scope for fixtures, and while the entire test suite passed with the new scope, it broke horribly when we tested it in the wild against existing test suites.

For those reasons, I'm afraid of merging in your changes, as well written and solid as they are, because I fear the aftermath might be very problematic. To be clear, my fear does not come from your code per se, but from meddling with the fixture mechanism internals, which we feel are not solid enough.

Having said that, we have ongoing plans to improve the internals, all led by @RonnyPfannschmidt, starting with a cleanup of the mark internals, followed by improvements to the collection tree/config setup. In the future we certainly want to move the fixture mechanism to a more explicit approach, rather than the sort of implicit/lazy mechanism we have in place today, but this takes time and needs to be done slowly and carefully.

In summary, as much as I appreciate you taking the time to work on this, and I'm guessing it took you a while to come up with this patch given the complicated fixture internals, my gut feeling is to reject this patch, at least for the time being.

I would hate it if this discouraged you from further contributing to pytest, as there are much greener areas that can be worked on and improved significantly if you are still interested.

@arobbins805
Author

Hi @nicoddemus ,

Thanks for your thoughtful reply. In response to your message, I would like to take a step back and describe four behaviors that I would like to see improved and find out what can be done going forward.

First, the four behaviors that I think may need to be changed are the following:

  1. Fixture teardown failures (an exception in the finalizer) should be associated with a test that uses the given fixture (a minimal reproducer of the current misattribution follows this list).
  2. Teardown calls should be made from the pytest_runtest_teardown hook.
  3. There should be one consistent scheme, regardless of parametrization and scope, for when teardowns occur. I suggest that they should be called by the last item that uses a given fixture OR whenever they are unused in the upcoming item.
  4. Fixtures of equal scope to a fixture that needs to be torn down should not be torn down by default, but rather based on context. I agree with the current implementation that fixtures of lower scope need to be torn down when a higher-scoped fixture is torn down.
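As a concrete illustration of behavior (1), here is a minimal, hypothetical reproducer: with the behavior described earlier in this thread, the finalizer error for the first parameter surfaces while the first test requesting the second parameter is being set up, so the failure gets reported against an item that never used the broken fixture value.

```python
# Illustrative reproducer: the teardown error for resource[1] is raised during
# the setup of the first test parametrized with resource[2], rather than being
# attributed to the last test that actually used resource[1].
import pytest

@pytest.fixture(scope="session", params=[1, 2])
def resource(request):
    yield request.param
    if request.param == 1:
        raise RuntimeError("teardown failed for param 1")

def test_first(resource):
    assert resource in (1, 2)

def test_second(resource):
    assert resource in (1, 2)
```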

Currently, depending on the scope of the fixture and the use of parametrization, the first three behaviors are implemented inconsistently. Although this PR implements the behaviors that I listed above, I think it will break systems that rely on non-parametrized session-scoped fixtures never being torn down mid-run. For example, expensive session-scoped fixtures that are used intermittently could make large test suite runs significantly longer, because they would be torn down and rebuilt many times. I agree that this PR is premature because it doesn't account for the conditions I just described.

With the current implementation of pytest, my test suite, which uses session-scoped fixture parametrization, is incorrectly reporting teardown failures, so I have a vested interest in seeing this changed sooner rather than later. I am also interested in helping out with pytest and/or other pytest plugin packages. So, two questions that I have are:

  1. How can I help in other aspects of pytest?
  2. Is there anything that I can do to move this up the priority list or help design a better fix?

Thanks, Andrew

@arobbins805 reopened this Oct 22, 2018
@nicoddemus
Member

Hi @arobbins805,

Sorry for the late reply!

In order to clear up our queue and let us focus on the active PRs, I'm closing this PR for now. I think the discussion about how this should be handled should happen in #4871.

Thanks for your work, the team definitely appreciates it!
