ci: add prune job for packagecloud pruning #147

Open
wants to merge 7 commits into main

Conversation

amdprophet
Contributor

Adds a new prune job to GitHub Actions to remove packages from our ci-builds repository in Packagecloud. It is set to run every day at 13:22 UTC (1:22 PM), which is 6:22 AM PDT / 5:22 AM PST.

Signed-off-by: Justin Kolberg <amd.prophet@gmail.com>
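
For context, a minimal sketch of what a scheduled prune workflow like this could look like is below. The job name, step layout, run command, and the PACKAGECLOUD_TOKEN secret name are assumptions for illustration, not taken from the PR; only the setup-go step and the schedule time come from the diff and description.

    name: prune

    on:
      schedule:
        - cron: "22 13 * * *"   # every day at 13:22 UTC

    jobs:
      prune:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v4
          - uses: actions/setup-go@v5
            with:
              go-version: stable
          - name: Run packagecloudpruner
            # hypothetical invocation; the PR's actual command is not shown here
            run: go run ./packagecloudpruner
            env:
              PACKAGECLOUD_TOKEN: ${{ secrets.PACKAGECLOUD_TOKEN }}  # assumed secret name
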
amdprophet requested review from a team as code owners on January 28, 2025 at 21:00
        with:
          go-version: stable

      - name: Run packagecloudpruner
Contributor

Isn't there a package retention policy that we could configure on packagecloud?


log.Info("fetching next page of packages")

if p.perPage == "" {
Contributor

nit: do we need a check to guard against going over the pagesize limit?


was also curious about this; a check is probably not needed

the doc here says:

Any per_page parameter greater than Max-Per-Page will be ignored 
and the maximum set of items allowed per page will be returned instead.

seems like the api won't return more than the max it allows per page 🤔 unless i'm understanding it wrong
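
To make the behavior under discussion concrete, here is a small runnable Go sketch of the guard. The perPage field name comes from the diff snippet above; the type, the default value, and the helper are assumptions for illustration, not the actual pruner code:

    package main

    import "fmt"

    // pruner mirrors the shape suggested by the diff context above; only the
    // perPage field name is taken from the snippet.
    type pruner struct {
    	perPage string
    }

    // defaultPerPage is an assumed fallback; the PR's actual default is not shown.
    const defaultPerPage = "100"

    // pageQuery builds the per_page query parameter. Per the packagecloud docs
    // quoted above, a per_page larger than the server's Max-Per-Page header is
    // ignored and the maximum allowed is returned instead, so no client-side
    // clamp against the page-size limit is strictly required.
    func (p *pruner) pageQuery() string {
    	if p.perPage == "" {
    		return "per_page=" + defaultPerPage
    	}
    	return "per_page=" + p.perPage
    }

    func main() {
    	p := &pruner{}
    	fmt.Println(p.pageQuery()) // prints: per_page=100
    }
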
