Pytest #1722


Merged
merged 84 commits into from
Dec 10, 2016

Conversation

djarecka
Collaborator

@djarecka djarecka commented Nov 26, 2016

I removed the nose and unittest libraries and rewrote the tests for pytest. I still kept the mock library in a few tests. I tried not to change the test coverage; the tests should cover exactly the same functionality. Note, however, that py.test stops executing a test after the first assert statement that fails.

This is still not completely finished:

  • there are some notes (search for "NOTE_dj") that should be removed; I kept them for now so we can think about and discuss them.
  • there are 2 places that I marked as "FAIL"; I had to comment them out to avoid failures. I'm not sure whether I should change the tests or the tested function.
  • a few tests are marked as xfail; they were skipif(True) before.
  • there are tests that are always skipped (I used the test docker containers).
  • circle.yml runs run_pytests.py instead of run_nosetests.py; run_pytests.py needs to be improved.

I also slightly adjusted docstrings so doctests with pytest can be run.
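For reference, a minimal sketch of the nose-to-pytest rewrite pattern used throughout this PR; the helper function and trait values here are invented for illustration and are not actual nipype code:

```python
def lookup(key):
    # hypothetical stand-in for reading a trait's metadata
    return {"argstr": "-v", "mandatory": True}[key]

# nose style (removed): a generator test yielding assert functions, e.g.
#     yield assert_equal, lookup(key), value
# pytest style: a plain assert inside the loop; note that pytest stops the
# test at the first failing assert, unlike the yielded per-item checks
def test_lookup():
    for key, value in {"argstr": "-v", "mandatory": True}.items():
        assert lookup(key) == value
```

The skipif(True) placeholders mentioned above were turned into explicit @pytest.mark.xfail(reason=...) markers, which report expected failures instead of silently skipping.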

@codecov-io

codecov-io commented Nov 26, 2016

Current coverage is 71.11% (diff: 62.06%)

Merging #1722 into master will increase coverage by 0.01%

@@             master      #1722   diff @@
==========================================
  Files          1030       1028      -2   
  Lines         51927      50608   -1319   
  Methods           0          0           
  Messages          0          0           
  Branches       7346       7331     -15   
==========================================
- Hits          36919      35988    -931   
+ Misses        13883      13495    -388   
  Partials       1125       1125           

Powered by Codecov. Last update 18468d1...7b00617

Contributor

@oesteban oesteban left a comment

This is a great effort! Thanks a lot!

@@ -11,7 +10,7 @@ def test_NormalizeProbabilityMapSet_inputs():

for key, metadata in list(input_map.items()):
for metakey, value in list(metadata.items()):
yield assert_equal, getattr(inputs.traits()[key], metakey), value
assert getattr(inputs.traits()[key], metakey) == value
Contributor

are these changes done in the tools/checkspecs.py file? If not, they will go away with make specs.

Collaborator Author

yes, tools/checkspecs.py has changed; all auto tests are generated by running python2 tools/checkspecs.py

I know this PR is almost impossible to read. I have another branch, pytest_oldAutoTests (which I try to keep updated), without the new auto tests, so the other changes can be compared:
https://github.com/djarecka/nipype/compare/master...djarecka:pytest_oldAutoTests?expand=1#diff-e661fcf16301968039679532e45d0224

timeout: 2600
- docker run -v /etc/localtime:/etc/localtime:ro -e FSL_COURSE_DATA="/root/examples/nipype-fsl_course_data" -v ~/examples:/root/examples:ro -v ~/scratch:/scratch -w /root/src/nipype nipype/nipype_test:py27 /usr/bin/run_nosetests.sh py27 :
timeout: 5200
# - docker run -v /etc/localtime:/etc/localtime:ro -e FSL_COURSE_DATA="/root/examples/nipype-fsl_course_data" -v ~/examples:/root/examples:ro -v ~/scratch:/scratch -w /root/src/nipype nipype/nipype_test:py35 /usr/bin/run_nosetests.sh py35 :
Contributor

This is actually disabling tests in CircleCI. Could you modify run_nosetests.sh (https://github.com/nipy/nipype/blob/master/docker/circleci/run_nosetests.sh) so that it calls py.test inside?

If you rename that file, make sure it starts with the prefix run_ so it is automatically added in the docker image.
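A minimal sketch of what such a run_pytests.sh wrapper might look like, following the calling convention of the existing run_nosetests.sh; the flags, paths, and output file names are assumptions, not the script that was eventually merged:

```bash
#!/bin/bash
# hypothetical sketch: run the test suite with py.test inside the container
set -e
PYTHON_VERSION=$1        # e.g. py27 or py35, as passed by circle.yml
cd /root/src/nipype
py.test --cov=nipype --cov-report=xml:coverage_${PYTHON_VERSION}.xml nipype
```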

Collaborator Author

yes, I know :) I was still modifying and testing on another branch without running all the examples etc. (so it doesn't run for 4h). I've just pushed a new version that runs run_pytests.sh (plus all the examples), so let's see if it's ok.

Collaborator Author

@djarecka djarecka Nov 28, 2016

my run_pytest.sh is far from perfect; I'd need some help (I left some notes within the file). Mostly I'm not sure whether I should install pytest and the other libraries using this file, or whether the docker containers might be changed instead. I also don't know how to recreate the codecov functionality: for now I removed the xml files and am testing pytest-cov.

@satra
Member

satra commented Nov 28, 2016

@djarecka
Collaborator Author

djarecka commented Nov 28, 2016

@satra : Thanks, will read. In general, I'm using pytest-cov: https://github.com/djarecka/nipype/blob/pytest/docker/circleci/run_pytests.sh#L33
I'm just not sure about these xml files etc. For now I've removed codecov, but pytest should give a coverage report.

@djarecka djarecka changed the title [WIP] Pytest Pytest Dec 2, 2016
conda update --all -y python=$TRAVIS_PYTHON_VERSION pytest &&
pip install pytest-raisesregexp &&
pip install pytest-cov &&
pip install click &&
Member

this is unnecessary - it should happen with the requirements.txt file.

pip install -U pytest
pip install pytest-raisesregexp
pip install pytest-cov
pip install click
Member

click is a nipype requirement. and so should pytest now. these should be updated in nipype/info.py and *requirements.txt
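Presumably the corresponding requirements.txt additions would be along these lines (the exact entries and the absence of version pins are assumptions):

```text
pytest
pytest-cov
pytest-raisesregexp
click
```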

Collaborator Author

I had to add the click installation once I started testing the doctests; I had errors from nipype/scripts/cli.py and nipype/scripts/utils.py.
When I run the containers on my laptop I get an import error when I try to import click.


class TestTSNR(unittest.TestCase):
#NOTE_dj: haven't removed mock (have to understand better)
Member

can remove this note - mock is used to simulate some interaction that may not exist on the system.

Member

however if it's not used in a particular file, it should be removed.

Collaborator Author

I had this note since I was thinking about trying pytest's mock support and monkeypatching instead of the mock library. I will remove the note and keep the library for now. I can try to change it in a different PR, once I understand pytest better.
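For comparison, a hedged sketch of what pytest's built-in monkeypatch fixture offers in place of mock.patch; the function under test and the environment variable name are invented for illustration:

```python
import os

def get_data_dir():
    # hypothetical code under test that reads configuration from the environment
    return os.environ.get("NIPYPE_DATA_DIR", "/default")

def test_data_dir(monkeypatch):
    # monkeypatch.setenv patches the variable and undoes the change
    # automatically when the test finishes
    monkeypatch.setenv("NIPYPE_DATA_DIR", "/tmp/data")
    assert get_data_dir() == "/tmp/data"
```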

Contributor

I haven't been super happy with the mock library, so if you do check out pytest mock, tell me what you think!

Collaborator Author

sure, I'll give it a second try. Since I've never used any mocking or patching, I had problems understanding how important the differences between the libraries might be for the project.
I will open a new PR for it, so it will be easier to review.

Contributor

Cool--it's not a priority or anything :)

@satra
Member

satra commented Dec 3, 2016

@djarecka - this looks great and is almost there.

if you can:

  • merge master and
  • address any of the comments that require addressing, and
  • the tests pass,

we can merge it.

@djarecka djarecka force-pushed the pytest branch 2 times, most recently from 6f2f9cb to 112ffc3 Compare December 7, 2016 22:21
@djarecka
Collaborator Author

djarecka commented Dec 7, 2016

@satra:
after your comments, I've changed:

  • updated various requirements files, including requirements.txt: removed nosetest and doctest-ignore-unicode, added pytest; deleted run_nosetest.
  • updated the Makefile, removed test-doc
  • changed the description of the testing process within testing_nipype.rst
  • removed the installation of pytest from travis and run_pytest.sh (travis and circle use requirements.txt)
  • removed my comments that described my changes and some small concerns (might come back to them later)
  • removed import nipype.testing where numpy.testing methods are being used
  • changed skipif to use sys.version_info instead of TRAVIS_PYTHON_VERSION, so it also works for make tests on a local machine
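The version-based skip in the last bullet can be sketched roughly like this; the test body is invented for illustration:

```python
import sys
import pytest

# inspecting the interpreter directly works both on CI and for `make tests`
# on a local machine, unlike reading the TRAVIS_PYTHON_VERSION variable
@pytest.mark.skipif(sys.version_info < (3,), reason="requires Python 3")
def test_py3_behaviour():
    # str literals are unicode on Python 3
    assert "text" == u"text"
```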

some issues that I still don't know how to solve (but I believe the PR can be merged regardless):

  • the only comments I left are related to test failures (NOTE_dj, FAIL); two asserts are disabled, since they don't work, and either the tests or the code has to be changed.
  • there is one doctest that in some situations might not work: https://github.com/djarecka/nipype/blob/pytest/nipype/interfaces/bru2nii.py#L45
    Travis and Circle are ok, but the output might vary.
  • there is a .coveragerc file that is only used in the Makefile. Should it also be used in travis and circle?
  • I didn't change install nosetest to install pytest in the Dockerfiles, since I'm not sure whether this should stay there (some Dockerfiles only install requirements.txt)
  • I don't know what to do with nipype/fixes/numpy/testing/nosetester.py and nipype/fixes/numpy/testing/noseclasses.py. Can they just be removed?
  • I'm not sure about nipype/testing: most of it is not used anymore. I can see only example_data, anatfile and funcfile being used.
  • tools/report_coverage.py should be changed or removed (I had problems running even the original version, so I'm not sure anybody cares about this file anymore)

@satra
Member

satra commented Dec 8, 2016

we should change the docker files as well.

@djarecka
Collaborator Author

djarecka commented Dec 8, 2016

@satra: I can remove pip install nose-cov doctest-ignore-unicode configparser from Dockerfile_base and Dockerfile_py27. Changing to py.test is not needed IMO, since they run pip install -r /root/src/nipype/requirements.txt (and Dockerfile_py35 doesn't have this part anyway).

Let me know, if you had something else/more in mind.

@satra
Member

satra commented Dec 8, 2016

i think configparser may need to stay for py27

…arametrize, fixture, skipif, shortening a few functions
…ng asserts, fixture for creating files, some problems with exceptions
… but adding pdb.set_trace() or running py.test -s could change the output of the test
…st with and without output capturing should work
…ert, not yield asser_equal; all test_auto* were changed (running python2 tools/checkspec
…ype, have some errors/fails that could be related to yield)
@djarecka
Collaborator Author

djarecka commented Dec 9, 2016

@satra all tests passed after I changed the Dockerfiles yesterday, but today I discovered that my PR can't be merged automatically, since one new test was merged. I rebased again and am running the tests, but this PR will have conflicts whenever any test is changed on master.

@satra
Member

satra commented Dec 9, 2016

@djarecka - we will merge this as soon as circle passes.

@satra satra merged commit 8ff1851 into nipy:master Dec 10, 2016
@chrisgorgo
Member

chrisgorgo commented Dec 10, 2016 via email

@djarecka
Collaborator Author

@chrisfilo: :-)

import nipype.interfaces.freesurfer as freesurfer


def create_files_in_directory():
@pytest.fixture()
def create_files_in_directory(request):
Member

can this fixture be created in nipype.testing and imported from there? it seems like this gets repeated in a couple of files.

Collaborator Author

it can be; let me do it in a separate PR, where I will review some other fixtures. I believe this is not the only one that is often repeated.

@@ -1,3 +1,6 @@
# AUTO-GENERATED by tools/checkspecs.py - DO NOT EDIT
from nipype.testing import assert_equal
from nipype.interfaces.fsl.dti import BEDPOSTX

#NOTE_dj: this test has only import statements
#NOTE_dj: this is not a AUTO test!
Member

this is ok. it is used only to ensure that the autotest is not written. the BEDPOSTX interface that's instantiated depends on the version of FSL installed in the system.

Collaborator Author

I've restored the file but changed the comment. However, I did not notice any new test_auto_BEDPOSTX.

Collaborator Author

while checking this issue, I've discovered that I have one auto test that is untracked - nipype/algorithms/tests/test_auto_ACompCor.py. Don't you have it?

@@ -46,11 +45,13 @@ def test_filmgls():
use_pava=dict(argstr='--pava',),
)
instance = FILMGLS()
#NOTE_dj: don't understand this test:
#NOTE_dj: IMO, it should go either to IF or ELSE, instance doesn't depend on any parameters
Member

similar to BEDPOSTX, the specific FILMGLS class depends on the instance of FSL.

@@ -1,3 +1,5 @@
# -*- coding: utf-8 -*-
from nipype.testing import assert_equal
from nipype.interfaces.fsl.dti import XFibres

#NOTE_dj: this test contains import statements only...
Member

same as BEDPOSTX

@skipif(no_fsl) # skip if fsl not installed)

#NOTE_dj: a function, e.g. "no_fsl", always gives True; should always change to no_fsl in pytest skipif
#NOTE_dj: removed the IF statement, since skipif is used?
Member

looks good.



# Test AWS creds read from env vars
@skipif(noboto3 or not fakes3)
#NOTE_dj: noboto3 and fakes3 are not used in this test, is skipif needed for other reason?
@pytest.mark.skipif(noboto3 or not fakes3, reason="boto3 or fakes3 library is not available")
Member

the test should not be run if boto or fakes3 are not available. they don't have to be used in the test.

def test_jsonsink():
import simplejson
import os
#NOTE_dj: I split test_jsonsink; I didn't find a connection between the two parts, and parametrize could more easily be used for the second part
Member

ok
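A hedged sketch of the parametrize pattern mentioned in the note above; the data is invented for illustration, not the actual JSONSink cases:

```python
import json
import pytest

@pytest.mark.parametrize("indata", [
    {"a": 1},
    {"a": 1, "b": 2},
])
def test_json_roundtrip(indata):
    # each dict becomes an independent test case with its own pass/fail report
    assert json.loads(json.dumps(indata)) == indata
```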

#NOTE_dj: can we change the imports, so it's clearer where the functions come from
#NOTE_dj: in ...testing there is simply from numpy.testing import *
from ...testing import utils
from numpy.testing import assert_almost_equal
Member

better.

from .. import nilearn as iface
from ...pipeline import engine as pe

import pytest, pdb
Member

is pdb being used?

from ... import engine as pe
from ....interfaces import base as nib

#NOTE_dj: I combined some tests that didn't have any extra description
#NOTE_dj: some other test can be combined but could be harder to read as examples of usage
Member

ok.

@satra
Member

satra commented Dec 15, 2016

@djarecka - i can't find the comments you are posting for some reason - but i receive emails. auto-tests are only generated when a corresponding named test is not available. test_auto_xyz.py will only be created if test_xyz.py does not exist. such custom tests often, but not always, start from an auto test and then modify it.

@djarecka
Collaborator Author

@satra : my point was that after removing test_XFibres.py and test_BEDPOSTX.py I could not see any differences in the auto tests. If you still expect some new auto tests to be created when these 2 files are missing, I can explore it later.

An independent question was about nipype/algorithms/tests/test_auto_ACompCor.py: it looks like checkspecs.py generates this file for me (it is independent of the test_XFibres.py and test_BEDPOSTX.py files). I was wondering whether you have this file. It is not tracked by git.

@djarecka
Collaborator Author

@satra : the reason why I haven't seen any difference after removing test_XFibres.py and test_BEDPOSTX.py (i.e. no new auto tests were created) is here: https://github.com/nipy/nipype/blob/master/nipype/interfaces/fsl/dti.py#L428
checkspecs.py considers XFibres5 and BEDPOSTX5 the valid classes, not XFibres and BEDPOSTX.
The auto tests for XFibres5 and BEDPOSTX5 were created a long time ago. They work, but if you want to remove them, I can create test_BEDPOSTX5.py and test_XFibres5.py and remove the auto tests.
