Add GH action for running tests against upstream dev #4583

Merged · 12 commits · Nov 22, 2020
21 changes: 21 additions & 0 deletions .github/workflows/parse_logs.py
@@ -0,0 +1,21 @@
# type: ignore
import pathlib

files = pathlib.Path("logs").rglob("**/*-log")
files = sorted(filter(lambda x: x.is_file(), files))

message = "\n"

print("Parsing logs ...")
for file in files:
    with open(file) as fpt:
        print(f"Parsing {file.absolute()}")
        data = fpt.read().split("test summary info")[-1].splitlines()[1:-1]
        data = "\n".join(data)
        py_version = file.name.split("-")[1]
        message = f"{message}\n<details>\n<summary>\nPython {py_version} Test Summary Info\n</summary>\n\n```bash\n{data}\n```\n</details>\n"

output_file = pathlib.Path("pytest-logs.txt")
with open(output_file, "w") as fpt:
    print(f"Writing output file to: {output_file.absolute()} ")
    fpt.write(message)
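The core of the script is the slicing step: everything after pytest's "test summary info" marker is kept, minus the marker line itself and the trailing "N failed, M passed" line. A minimal, self-contained sketch of that step, run against a synthetic pytest log (the log text below is invented for illustration):

```python
# Synthetic pytest output standing in for a real "*-log" file.
fake_log = (
    "collected 3 items\n"
    "=== short test summary info ===\n"
    "FAILED test_a.py::test_one - AssertionError\n"
    "FAILED test_b.py::test_two - TypeError\n"
    "=== 2 failed, 1 passed in 1.23s ===\n"
)

# Take everything after the "test summary info" marker, dropping the
# marker line itself ([1:]) and the trailing totals line ([:-1]) --
# the same slice parse_logs.py wraps in a collapsible <details> block.
lines = fake_log.split("test summary info")[-1].splitlines()[1:-1]
summary = "\n".join(lines)
print(summary)
```

The result contains only the per-test `FAILED` lines, which is what ends up inside each per-Python-version `<details>` section of `pytest-logs.txt`.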
129 changes: 129 additions & 0 deletions .github/workflows/upstream-dev-ci.yaml
@@ -0,0 +1,129 @@
name: CI
on:
  push:
    branches:
      - master
  pull_request:
    branches:
      - master
  schedule:
    - cron: "0 0 * * *" # Daily “At 00:00” UTC
  workflow_dispatch: # allows you to trigger the workflow run manually

jobs:
  upstream-dev:
    name: upstream-dev
    runs-on: ubuntu-latest
    defaults:
      run:
        shell: bash -l {0}
    strategy:
      fail-fast: false
      matrix:
        python-version: ["3.8"]
    steps:
      - name: Cancel previous runs
        uses: styfle/[email protected]
        with:
          access_token: ${{ github.token }}
      - uses: actions/checkout@v2
      - uses: conda-incubator/setup-miniconda@v2
        with:
          channels: conda-forge
          mamba-version: "*"
          activate-environment: xarray-tests
          auto-update-conda: false
          python-version: ${{ matrix.python-version }}
      - name: Set up conda environment
        run: |
          mamba env update -f ci/requirements/py38.yml
          bash ci/install-upstream-wheels.sh
          conda list
      - name: Run Tests
        run: |
          python -m pytest --verbose -rf > output-${{ matrix.python-version }}-log

      - name: Upload artifacts
        if: "failure() && (github.event_name == 'schedule') && (github.repository == 'pydata/xarray')" # Check the exit code of previous step
        uses: actions/upload-artifact@v2
        with:
          name: output-${{ matrix.python-version }}-log
          path: output-${{ matrix.python-version }}-log
          retention-days: 5

  report:
    name: report
    needs: upstream-dev
    if: "always() && (github.event_name == 'schedule') && (github.repository == 'pydata/xarray')"
    runs-on: ubuntu-latest
    defaults:
      run:
        shell: bash
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v2
        with:
          python-version: "3.x"
      - uses: actions/download-artifact@v2
        with:
          path: /tmp/workspace/logs
      - name: Move all log files into a single directory
        run: |
          rsync -a /tmp/workspace/logs/output-*/ ./logs
          ls -R ./logs
      - name: Parse logs
        run: |
          python .github/workflows/parse_logs.py
      - name: Report failures
        uses: actions/github-script@v3
        with:
          github-token: ${{ secrets.GITHUB_TOKEN }}
          script: |
            const fs = require('fs');
            const pytest_logs = fs.readFileSync('pytest-logs.txt', 'utf8');
            const title = "⚠️ Nightly upstream-dev CI failed ⚠️"
            const workflow_url = `https://github.com/${process.env.GITHUB_REPOSITORY}/actions/runs/${process.env.GITHUB_RUN_ID}`
            const issue_body = `[Workflow Run URL](${workflow_url})\n${pytest_logs}`

            // Run a GraphQL query against the GitHub API to find the most
            // recent open issue used for reporting failures
            const query = `query($owner:String!, $name:String!, $creator:String!, $label:String!){
              repository(owner: $owner, name: $name) {
                issues(first: 1, states: OPEN, filterBy: {createdBy: $creator, labels: [$label]}, orderBy: {field: CREATED_AT, direction: DESC}) {
                  edges {
                    node {
                      body
                      id
                      number
                    }
                  }
                }
              }
            }`;

            const variables = {
              owner: context.repo.owner,
              name: context.repo.repo,
              label: 'CI',
              creator: "github-actions[bot]"
            }
            const result = await github.graphql(query, variables)
            const issue_info = result.repository.issues.edges[0].node

            // If no issue is open, create a new issue; otherwise update
            // the body of the existing issue.
            if (typeof issue_info.number === 'undefined') {
              github.issues.create({
                owner: variables.owner,
                repo: variables.name,
                body: issue_body,
                title: title,
                labels: [variables.label]
              })
            } else {
              github.issues.update({
                owner: variables.owner,
                repo: variables.name,
                issue_number: issue_info.number,
                body: issue_body
              })
            }
Comment on lines +88 to +129

@andersy005 (Member, Author) — Nov 21, 2020
I am confident this gives us what we want in terms of reporting failures, i.e.:

  1. Find the most recently created issue (used for reporting failures).
  2. If there is an existing open issue, update its body with the most recent failure summary.
  3. Otherwise, if there is no candidate open issue, create a new issue and post the summary information for the failing tests.
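The update-or-create flow described in the comment can be sketched in Python with the GitHub API calls stubbed out as in-memory functions, so the control flow itself is testable (`find_latest_open_issue`, `create_issue`, and `update_issue` are illustrative stand-ins here, not real API names):

```python
issues = []  # stand-in for the repository's issue tracker

def find_latest_open_issue(creator, label):
    # Mirrors the GraphQL query: newest open issue by this creator/label.
    candidates = [
        i for i in issues
        if i["state"] == "open" and i["creator"] == creator and label in i["labels"]
    ]
    return max(candidates, key=lambda i: i["number"], default=None)

def create_issue(title, body, label, creator):
    issue = {
        "number": len(issues) + 1,
        "title": title,
        "body": body,
        "labels": [label],
        "creator": creator,
        "state": "open",
    }
    issues.append(issue)
    return issue

def update_issue(number, body):
    for issue in issues:
        if issue["number"] == number:
            issue["body"] = body
            return issue

def report_failures(body):
    # 1. Find the most recently created open failure-report issue.
    existing = find_latest_open_issue("github-actions[bot]", "CI")
    if existing is None:
        # 3. No candidate open issue: open a new one.
        return create_issue("Nightly upstream-dev CI failed", body,
                            "CI", "github-actions[bot]")
    # 2. Existing open issue: replace its body with the latest summary.
    return update_issue(existing["number"], body)

first = report_failures("run 1 summary")
second = report_failures("run 2 summary")
```

Two consecutive nightly failures end up in a single issue: the second run updates the body of the issue the first run created, rather than opening a duplicate.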

A Collaborator replied:

Very impressive. I think this is the best of all worlds.

39 changes: 39 additions & 0 deletions ci/install-upstream-wheels.sh
@@ -0,0 +1,39 @@
#!/usr/bin/env bash

conda uninstall -y --force \
    numpy \
    scipy \
    pandas \
    matplotlib \
    dask \
    distributed \
    zarr \
    cftime \
    rasterio \
    pint \
    bottleneck \
    sparse
python -m pip install \
    -i https://pypi.anaconda.org/scipy-wheels-nightly/simple \
    --no-deps \
    --pre \
    --upgrade \
    numpy \
    scipy \
    pandas
python -m pip install \
    -f https://7933911d6844c6c53a7d-47bd50c35cd79bd838daf386af554a83.ssl.cf2.rackcdn.com \
    --no-deps \
    --pre \
    --upgrade \
    matplotlib
python -m pip install \
    --no-deps \
    --upgrade \
    git+https://github.com/dask/dask \
    git+https://github.com/dask/distributed \
    git+https://github.com/zarr-developers/zarr \
    git+https://github.com/Unidata/cftime \
    git+https://github.com/mapbox/rasterio \
    git+https://github.com/hgrecco/pint \
    git+https://github.com/pydata/bottleneck
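After the nightly wheels are installed, a quick sanity check that the environment really picked up development builds can be useful. This helper is not part of the PR — it is a hedged sketch that flags pre-release version strings such as numpy's `1.20.0.dev0+...` nightlies:

```python
import re

def looks_like_dev_build(version: str) -> bool:
    """Return True for versions that look like dev or pre-release builds.

    Matches the ".dev" suffix pip's --pre installs produce, release
    candidates (rc1), alpha/beta tags (a1, b2), and local-version
    segments introduced by "+" (e.g. git-based builds).
    """
    return bool(re.search(r"(\.dev|rc\d|a\d|b\d|\+)", version))

# Example checks against typical nightly vs. stable version strings.
print(looks_like_dev_build("1.20.0.dev0+gabc123"))  # nightly numpy build
print(looks_like_dev_build("1.19.4"))               # stable release
```

In a CI step one could loop over the upgraded packages, compare `importlib.metadata.version(pkg)` against this predicate, and fail fast if a stable release slipped in instead of a nightly.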
2 changes: 2 additions & 0 deletions doc/whats-new.rst
@@ -126,6 +126,8 @@ Internal Changes
- Replace the internal use of ``pd.Index.__or__`` and ``pd.Index.__and__`` with ``pd.Index.union``
and ``pd.Index.intersection`` as they will stop working as set operations in the future
(:issue:`4565`). By `Mathias Hauser <https://github.com/mathause>`_.
- Add GitHub action for running nightly tests against upstream dependencies (:pull:`4583`).
By `Anderson Banihirwe <https://github.com/andersy005>`_.

.. _whats-new.0.16.1:
