We are using close to 7 TiB of data on GitLab for our artifacts; I think we should start thinking about a cleanup strategy.
did you clean the registry? I remember there being a ton of images there
also, we should have a release step that changes the tag name so that we can more easily recognize which images are for release branches
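A minimal sketch of what such a retag step could look like, assuming images are currently pushed as $CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA; the job name, stage, tag scheme, and branch regex are all hypothetical here:

```yaml
# Hypothetical retag job; adjust the source tag, target tag, and branch
# pattern to whatever the real pipeline uses.
retag-release-image:
  stage: release
  image:
    name: gcr.io/go-containerregistry/crane:debug
    entrypoint: [""]
  rules:
    - if: '$CI_COMMIT_BRANCH =~ /^v\d+\.\d+$/'  # assumed release-branch naming
  script:
    - crane auth login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    # push an extra, human-recognizable tag pointing at the same image
    - crane tag "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" "release-$CI_COMMIT_BRANCH"
```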
Yes I did, about 200 GiB less
there is auto-clean now
but for artifacts I'm at a loss
I mean, auto-clean for the registry exists as a feature, but I didn't enable it
just a manual clean
I'm still playing with the Inria GitLab CI, @Gaëtan Gilbert, but I need them to raise the artifact size limit
otherwise it seems much faster
we don't have expire_in on the CI template, which is probably why there's so much stuff
oh, but artifacts auto-expire after 1 month since June 2020: https://docs.gitlab.com/ee/user/gitlab_com/index.html#gitlab-cicd
Yes, it'd be great if we could find the artifact manager and clean up a bit
7 TiB seems a bit too much to me
we have about 600 pipelines in the last 30 days, so that's about 11 GB / pipeline
seems plausible
with 90 jobs / pipeline
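(Sanity check on those numbers: 7 TiB ≈ 7168 GiB, and 7168 GiB / 600 pipelines ≈ 12 GiB per pipeline; spread over ~90 jobs that's roughly 130 MiB of artifacts per job, which does seem plausible.)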
Coq universe took like 2 GiB of artifacts
but we are not uploading the artifacts of the leaves, right?
we do
```yaml
artifacts:
  name: "$CI_JOB_NAME"
  paths:
    - _build_ci
  when: always
```
the dune CI template has a 2-month expiry
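For comparison, a sketch of the block above with an explicit expiry added (the 2-month value just mirrors the dune template; the actual retention period is a choice, not something from this thread):

```yaml
artifacts:
  name: "$CI_JOB_NAME"
  paths:
    - _build_ci
  when: always
  expire_in: 2 months  # explicit expiry instead of relying on instance defaults
```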
Aha
Artifacts of the leaves need to be uploaded so that the bug minimizer works.
Ah, that was the reason, thanks for the clarification @Jason Gross