Readme.md (42 changes: 41 additions & 1 deletion)
@@ -323,6 +323,46 @@ redis-benchmarks-spec-sc-coordinator RUNNING pid 27842, uptime 0:00:00
```


## Development

1. Install [Poetry](https://python-poetry.org/) to manage dependencies and run the development tooling:
```sh
pip install poetry
```

2. Install the dependencies from the lock file:

```sh
poetry install
```
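
With the dependencies installed, you can sanity-check the environment and run any project command through Poetry. A minimal example (the reported paths and Python version will depend on your machine):

```sh
# Show details of the virtual environment Poetry created for this project
poetry env info

# Run an arbitrary command inside that environment
poetry run python --version
```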

### Running formatters

```sh
poetry run black .
```
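
If you only want to verify formatting (for example in CI) without rewriting files, black also supports a check-only mode. A sketch, assuming the same Poetry-managed environment:

```sh
# Exit non-zero if any file would be reformatted, and print the would-be changes
poetry run black --check --diff .
```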


### Running linters

```sh
poetry run flake8
```
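
While iterating you may want to lint only part of the tree; flake8 accepts explicit paths. The directories below are assumptions based on the repository layout referenced on this page:

```sh
# Lint just the main package and the test utilities (adjust paths as needed)
poetry run flake8 redis_benchmarks_specification/ utils/
```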


### Running tests

A test suite is provided, and can be run with:

```sh
tox
```

To run a specific test:
```sh
tox -- utils/tests/test_runner.py
```
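
tox manages its own virtual environments, which keeps runs reproducible. For a faster local loop, the same file can usually be run directly with pytest inside the Poetry environment; this assumes pytest is available as a development dependency and that the test does not depend on setup that only tox performs:

```sh
poetry run pytest utils/tests/test_runner.py
```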

## License

redis-benchmarks-specification is distributed under the Apache 2 license - see [LICENSE](LICENSE)
redisbench-admin is distributed under the BSD3 license - see [LICENSE](LICENSE)
redis_benchmarks_specification/__runner__/runner.py (16 changes: 4 additions & 12 deletions)
@@ -402,11 +402,7 @@ def process_self_contained_coordinator_stream(
benchmark_end_time, benchmark_start_time
)
)
logging.info(
"Printing client tool stdout output".format(
client_container_stdout
)
)
logging.info("Printing client tool stdout output")
print()
if args.flushall_on_every_test_end:
logging.info("Sending FLUSHALL to the DB")
@@ -529,15 +525,13 @@ def process_self_contained_coordinator_stream(
else:
if "redis-benchmark" in benchmark_tool:
os.remove(full_result_path)
logging.info(
"Removing temporary JSON file".format(full_result_path)
)
logging.info("Removing temporary JSON file")
shutil.rmtree(temporary_dir_client, ignore_errors=True)
logging.info(
"Removing temporary client dir {}".format(temporary_dir_client)
)

table_name = "Results for entire test-suite".format(test_name)
table_name = "Results for entire test-suite"
results_matrix_headers = [
"Test Name",
"Metric JSON Path",
@@ -557,9 +551,7 @@ def process_self_contained_coordinator_stream(
"aggregate-results.csv",
)
logging.info(
"Storing an aggregated results CSV into {}".format(
full_result_path, dest_fpath
)
"Storing an aggregated results CSV into {}".format(full_result_path)
)

csv_writer = CsvTableWriter(