This repository was archived by the owner on Sep 9, 2020. It is now read-only.
Starting this issue to track definition and execution of an abstract test harness for the integration tests. Came from discussion in #265. My initial thoughts are below, all of which are open to discussion/debate:
Goal: Create an integration test system that moves the individual test case definitions into the file system instead of keeping them in code. The test code would simply search the file system to discover the test cases and run each in turn.
A test case would consist of:

- A test name
- An initial state
  - Local code
  - Possible manifest and/or lock files
  - Possible previous imports
  - Possible vendor directory structure
- A series of dep commands to be run
- A series of expected command responses
- An expected final state
  - Manifest and/or lock files
  - Vendor directory structure
If there are any other factors in the project state that are not included above, please comment.
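As a rough sketch, the state above could map onto a Go struct along these lines (all field names here are illustrative, not dep's actual types):

```go
package main

import "fmt"

// testCase captures the per-test state described above.
// Field names are illustrative, not actual dep harness types.
type testCase struct {
	Name        string     // e.g. the path under testdata
	InitialPath string     // directory holding local code, manifest, lock
	Imports     []string   // previously imported project roots, if any
	VendorPaths []string   // initial vendor directory contents, if any
	Commands    [][]string // dep commands to run, in order
	WantOutput  []string   // expected response per command
	FinalPath   string     // directory holding the expected final state
}

func main() {
	tc := testCase{
		Name:     "ensure/override/case21",
		Commands: [][]string{{"dep", "ensure"}},
	}
	fmt.Printf("%s runs %d command(s)\n", tc.Name, len(tc.Commands))
}
```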
My proposal for the filesystem to store the above would be to walk down the dirtree from `testdata`, looking for test cases, identified by the presence of a `commands.txt` file as described below. Once such a directory is found, the test case would be:
- Test name: the discovered filepath after `testdata`
- Initial state
  - Local code, manifest, and lock files in an `initial` directory
  - Import list in an `initial/imports.txt` file (or other convenient format)
  - Vendor directories in an `initial/vendor.txt` file (or other convenient format)
- Dep commands to be run and expected responses in a `commands.txt` file of some sort
- Expected final state
  - Manifest and lock files in a `final` directory
  - Vendor directories in a `final/vendor.txt` file (only checking for existence or extras)
  - If `final` does not exist, the files from `initial` are used (in the case of an expected command error, e.g.)
So for example, the `ensure/override/case21` test case might look like:
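A plausible on-disk layout under the conventions proposed above (the manifest and lock file names here are placeholders, not a commitment to specific filenames):

```
testdata/
└── ensure/
    └── override/
        └── case21/
            ├── initial/
            │   ├── main.go       (local code)
            │   ├── manifest      (placeholder name)
            │   ├── lock          (placeholder name)
            │   ├── imports.txt
            │   └── vendor.txt
            ├── commands.txt
            └── final/
                ├── manifest
                ├── lock
                └── vendor.txt
```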
The test name and subtest pattern will provide easy test-run granularity when needed. The `initial` directory can simply be copied in its entirety, assuming the imports and vendor files are innocuous to dep. I am still a little unclear on the interplay with the imports (global-level projects). Should there be two kinds of imports, local and remote (i.e. locally generated vs. remotely retrieved)? If local imports are needed, then the above structure might be better served by an `imports` directory containing both code to be migrated into the test-local `src` and an `imports.txt` to fetch any remote repos.
Thoughts welcome. One question I have: will the above result in a sprawling `testdata` directory, and if so, is that a problem?
Thanks. You can see the progress, open for comment, on #287. The latest thing I am working on is a more structured way of populating `vendor/` before the test and checking the effects. Will commit first steps soon, but I am seeing some oddity and want to verify that it is not coming from the tests themselves first.