Dataset cache purge cronjob #1211
base: develop
Changes from all commits: cbf5b54, c7f438f, 4c7d8ae, 946cca3, 628488c, 97a7d55, 4d609d9, 25d022e, 30ccd81
New file (+24 lines), the Helm CronJob template:

```yaml
apiVersion: batch/v1
kind: CronJob
metadata:
  name: {{ .Release.Name }}-dataset-lifecycle-job
spec:
  schedule: {{ .Values.datasetLifecycle.schedule | default "0 * * * *" | quote }}
  concurrencyPolicy: "Forbid"
  jobTemplate:
    spec:
      template:
        metadata:
          labels:
            app: {{ .Release.Name }}-dataset-lifecycle-job
        spec:
          containers:
            - name: {{ .Release.Name }}-dataset-lifecycle-job
              image: {{ .Values.datasetLifecycle.image }}:{{ .Values.datasetLifecycle.tag }}
              imagePullPolicy: {{ .Values.datasetLifecycle.pullPolicy }}
              env:
                - name: LIFETIME
                  value: {{ .Values.datasetLifecycle.cacheLifetime }}
              args:
                - --request POST "http://{{ .Release.Name }}-servicex-app:8000/servicex/internal/dataset-lifecycle?age=$(LIFETIME)"
          restartPolicy: OnFailure
```
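The `args` line passes a curl-style request to the container, with Kubernetes substituting the `LIFETIME` env var via `$(LIFETIME)`. As a rough illustration (the function name and example values below are mine, not from the chart), the target URL is assembled like this:

```python
def lifecycle_url(release_name: str, lifetime_hours: int) -> str:
    """Build the internal endpoint URL the CronJob's container hits.

    Mirrors the args line in the template above: the in-cluster service name
    is derived from the Helm release name, and the age query parameter comes
    from the LIFETIME environment variable.
    """
    return (
        f"http://{release_name}-servicex-app:8000"
        f"/servicex/internal/dataset-lifecycle?age={lifetime_hours}"
    )


print(lifecycle_url("myrelease", 24))
# → http://myrelease-servicex-app:8000/servicex/internal/dataset-lifecycle?age=24
```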
Modified file, refactoring the delete resource so the soft-delete logic can be reused:

```diff
@@ -1,4 +1,4 @@
-# Copyright (c) 2024, IRIS-HEP
+# Copyright (c) 2024-25, IRIS-HEP
 # All rights reserved.
 #
 # Redistribution and use in source and binary forms, with or without
@@ -30,18 +30,22 @@
 from servicex_app.resources.servicex_resource import ServiceXResource


-class DeleteDataset(ServiceXResource):
-    @auth_required
-    def delete(self, dataset_id):
-        dataset = Dataset.find_by_id(dataset_id)
+def delete_dataset(dataset_id):
+    dataset = Dataset.find_by_id(dataset_id)

-        if not dataset:
-            return {"message": f"Dataset {dataset_id} not found"}, 404
+    if not dataset:
+        return {"message": f"Dataset {dataset_id} not found"}, 404

-        if dataset.stale:
-            return {"message": f"Dataset {dataset_id} has already been deleted"}, 400
+    if dataset.stale:
+        return {"message": f"Dataset {dataset_id} has already been deleted"}, 400

-        dataset.stale = True
-        dataset.save_to_db()
+    dataset.stale = True
+    dataset.save_to_db()

-        return {"dataset-id": dataset_id, "stale": True}, 200
+    return {"dataset-id": dataset_id, "stale": True}
+
+
+class DeleteDataset(ServiceXResource):
+    @auth_required
+    def delete(self, dataset_id):
+        return delete_dataset(dataset_id)
```

> **Contributor:** Move this to `models.py` since this now needs to be reused between two resources. We shouldn't be calling methods in one resource from another.
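The refactor above extracts a soft delete: datasets are marked `stale` rather than removed, and a repeat call is rejected. A self-contained sketch of that behavior (`FakeDataset` and the `registry` dict are hypothetical stand-ins for the SQLAlchemy model and the database, not part of the PR):

```python
# Standalone sketch of the extracted soft-delete logic and its status codes.
class FakeDataset:
    def __init__(self, dataset_id: int):
        self.id = dataset_id
        self.stale = False

    def save_to_db(self):
        pass  # the real model persists the change; a no-op here


def delete_dataset(registry: dict, dataset_id: int):
    dataset = registry.get(dataset_id)
    if not dataset:
        return {"message": f"Dataset {dataset_id} not found"}, 404
    if dataset.stale:
        return {"message": f"Dataset {dataset_id} has already been deleted"}, 400
    dataset.stale = True  # soft delete: mark stale rather than remove the row
    dataset.save_to_db()
    return {"dataset-id": dataset_id, "stale": True}


registry = {1: FakeDataset(1)}
print(delete_dataset(registry, 1))  # marks the dataset stale
print(delete_dataset(registry, 1))  # second call rejected with 400
print(delete_dataset(registry, 2))  # unknown id rejected with 404
```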
New file (+61 lines), the internal lifecycle endpoint:

```python
# Copyright (c) 2025, IRIS-HEP
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
# * Redistributions of source code must retain the above copyright notice, this
#   list of conditions and the following disclaimer.
#
# * Redistributions in binary form must reproduce the above copyright notice,
#   this list of conditions and the following disclaimer in the documentation
#   and/or other materials provided with the distribution.
#
# * Neither the name of the copyright holder nor the names of its
#   contributors may be used to endorse or promote products derived from
#   this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
# DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
# FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
# DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
# SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
# CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
# OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
from datetime import datetime, timedelta, timezone

from flask import request, current_app

from servicex_app.resources.servicex_resource import ServiceXResource
from ..datasets.get_all import get_all_datasets
from ..datasets.delete_dataset import delete_dataset


class DatasetLifecycleOps(ServiceXResource):
    def post(self):
        """
        Obsolete cached datasets older than N hours
        """
        now = datetime.now(timezone.utc)
        try:
            age = float(request.get_json().get("age", 24))
        except Exception:
            return {"message": "Invalid age parameter"}, 422
        delta = timedelta(hours=age)
        datasets = (
            get_all_datasets()
        )  # by default this will only give non-stale datasets
        todelete = [
            _.id for _ in datasets if _.last_updated and (now - _.last_updated) > delta
        ]
        current_app.logger.info(
            f"Obsoletion called for datasets older than {delta}. "
            f"Obsoleting {len(todelete)} datasets."
        )
        for dataset_id in todelete:
            delete_dataset(dataset_id)

        return {"message": "Success"}, 200
```

> **Contributor** (on the `get_all_datasets()` call): Why can't this be …
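The core of the endpoint is the selection step: compare each dataset's `last_updated` timestamp against an age cutoff and collect the ids that have expired. A standalone sketch of that filter (`FakeDataset` and `select_expired` are illustrative names, not from the PR):

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone


@dataclass
class FakeDataset:  # stand-in for the Dataset model; only the fields the filter uses
    id: int
    last_updated: datetime


def select_expired(datasets, age_hours, now=None):
    """Return ids of datasets whose last_updated is older than age_hours.

    Mirrors the endpoint's comprehension: datasets without a last_updated
    timestamp are skipped rather than treated as expired.
    """
    now = now or datetime.now(timezone.utc)
    delta = timedelta(hours=float(age_hours))
    return [d.id for d in datasets if d.last_updated and (now - d.last_updated) > delta]


now = datetime(2025, 6, 1, 12, 0, tzinfo=timezone.utc)
datasets = [
    FakeDataset(id=1, last_updated=now - timedelta(hours=48)),  # stale cache entry
    FakeDataset(id=2, last_updated=now - timedelta(hours=1)),   # still fresh
]
print(select_expired(datasets, 24, now=now))  # → [1]
```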
New test file (+87 lines; the BSD license header, identical to the previous file's, is omitted here):

```python
from datetime import datetime, timezone
from unittest.mock import patch

from pytest import fixture

from servicex_app.models import Dataset

from servicex_app_test.resource_test_base import ResourceTestBase


class TestDatasetLifecycle(ResourceTestBase):
    @fixture
    def fake_dataset_list(self):
        with patch(
            "servicex_app.resources.internal.dataset_lifecycle_ops.get_all_datasets"
        ) as dsfunc:
            dsfunc.return_value = [
                Dataset(
                    last_used=datetime(2022, 1, 1, tzinfo=timezone.utc),
                    last_updated=datetime(2022, 1, 1, tzinfo=timezone.utc),
                    id=1,
                    name="not-orphaned",
                    events=100,
                    size=1000,
                    n_files=1,
                    lookup_status="complete",
                    did_finder="rucio",
                ),
                Dataset(
                    last_used=datetime.now(timezone.utc),
                    last_updated=datetime.now(timezone.utc),
                    id=2,
                    name="orphaned",
                    events=100,
                    size=1000,
                    n_files=1,
                    lookup_status="complete",
                    did_finder="rucio",
                ),
            ]
            yield dsfunc

    def test_fail_on_bad_param(self, client):
        with client.application.app_context():
            response = client.post(
                "/servicex/internal/dataset-lifecycle", json={"age": "string"}
            )
            assert response.status_code == 422

    def test_deletion(self, fake_dataset_list, client):
        with client.application.app_context():
            with patch(
                "servicex_app.resources.internal.dataset_lifecycle_ops.delete_dataset"
            ) as deletion_obj:
                response = client.post(
                    "/servicex/internal/dataset-lifecycle", json={"age": 24}
                )
                fake_dataset_list.assert_called_once()
                deletion_obj.assert_called_once()
                assert response.status_code == 200
```
> **Contributor:** I can't imagine a case where we would want a different image for initiating the dataset cleanup than the one we use for the data lifecycle job. In fact, I had imagined dataset cleanup to be a subset of data lifecycle, so ideally the schedule should just be a subitem of `dataLifecycle`. So, maybe …
> **Reply:** The data lifecycle and dataset cache tasks are conceptually quite separate: one handles ServiceX outputs and the other handles input replicas, and there's no real reason their schedules should be related at all. In fact, you might want to disable the data lifecycle and keep the dataset cache cleaning. (I had considered doing a single task, but unless we make it explicitly a single "general" job that just does all periodic cleanups, I think this adds unhelpful complication.)
> **Contributor:** I agree they both need their own schedules and could be enabled separately. All I'm saying is that they don't need different Docker images, so if we collect these two operations in `values.yaml` nested under a common root-level property, they can share the Docker image specification. My formatting in the above comment got messed up; I'll edit it.
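One way the reviewer's suggestion could look, as a hedged sketch only (every key name below is an assumption for illustration, not taken from the chart's actual `values.yaml`): both cleanup jobs nest under a shared root so they inherit one image specification while keeping independent schedules and enable flags.

```yaml
# Hypothetical values.yaml layout; key and image names are illustrative only.
dataLifecycle:
  image: example/servicex-cleanup   # shared image for both cleanup jobs (assumed name)
  tag: latest
  pullPolicy: IfNotPresent
  retention:                        # output (data lifecycle) cleanup
    enabled: true
    schedule: "0 0 * * *"
  datasetCache:                     # input replica (dataset cache) purge
    enabled: true
    schedule: "0 * * * *"
    cacheLifetime: 24               # hours
```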