[WIP] Add Github Actions build job #63
base: main
Conversation
I guess the nix build dir is on tmpfs...
In the past I've had success with a GitHub Actions step that removes unnecessary stuff from the ubuntu runner. IDR what exactly I used back then though...
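For reference, a cleanup step along these lines can reclaim a lot of space. The paths below are an assumption based on what's typically preinstalled on `ubuntu-latest` (.NET, Android SDK, GHC); verify with `du -sh` on the actual runner before relying on it:

```shell
#!/bin/sh
# Sketch of a "free disk space" step for GitHub-hosted ubuntu runners.
# The directories listed are assumptions; check what is actually large
# on the runner with: du -sh /opt/* /usr/share/* /usr/local/lib/*
set -eu
for dir in /usr/share/dotnet /usr/local/lib/android /opt/ghc; do
  # Only delete when actually running inside GitHub Actions.
  if [ -n "${GITHUB_ACTIONS:-}" ] && [ -d "$dir" ]; then
    du -sh "$dir"        # show what we are about to reclaim
    sudo rm -rf "$dir"
  fi
done
df -h /                  # show how much space is left for the nix build
```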
Force-pushed from defc40f to 6b116f1
It worked ❤️ Seems like I can clean up ~11GiB very easily. (Also the build is reproducible, I cross-referenced with my local one.)
Force-pushed from 6b116f1 to f48dff6
Well unfortunately the "cached" rebuild also takes around 3 hours. It seems that the cache accesses get throttled relatively early:
Also it seems that uploading to the cache is skipped after getting throttled, so no hope of future runs getting faster either...
Have you tried making a cachix account and using the free tier? It gives you up to 5 GB, automatically garbage collected, and there's also a GitHub CI integration.
To avoid a time limit (didn't know that was a thing, I'd never run into it before) you can do a local build and push it to the cachix binary cache before running CI; that will speed things up significantly. Having this binary cache also means that users of the flake can add it as a substituter and pull changes that way instead of having to build everything themselves. Edit: If you're able to avoid rebuilding most things, you may have less trouble with throttling. Pushing large changes to the binary cache first will allow the CI build to copy what it can instead of building on the runner.
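Wiring cachix into the workflow is just a couple of steps. A minimal sketch, where `my-cache` is a placeholder cache name and `CACHIX_AUTH_TOKEN` is assumed to be added as a repository secret:

```yaml
steps:
  - uses: actions/checkout@v4
  - uses: cachix/install-nix-action@v22
  - uses: cachix/cachix-action@v12
    with:
      name: my-cache                          # placeholder cache name
      authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
  - run: nix build
```

The action configures the cache as a substituter before the build and pushes the build results afterwards.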
I don't think any time limits are being hit, it's the disk space again. I would prefer to avoid cross-contaminating my local store and the cache. I think among other things, GHA would be useful for checking reproducibility. Reproducibility issues could potentially get masked if I uploaded anything from my local store. It seems that the failed build did manage to upload some stuff to cachix though. I will rerun it, let's see if it gets any better.
Hm, the cachix dashboard shows "0 B / 5 GiB", so not sure what the previous run actually uploaded...
Well the build succeeded in 2h, and according to the logs it's pulling stuff from cachix. The cachix dashboard is still showing "0 B / 5 GiB", so that's borked. Let's run it again, maybe the third time's the charm :D
And yeah, it completed in 3m, that's looking quite reasonable. Next steps:
It does this for me too when I make a new cache. Eventually it does update, not sure why that happens 😅
The kernel compile takes 2h, I was expecting it to be a bit faster; it's 20 minutes on my machine 😞 Anyway, good to know. I wonder if there is a way to specifically blacklist some stuff from cachix? The final ISO is 1GiB+ and can easily be generated if everything else is cached.
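On blacklisting: cachix-action has a `pushFilter` input, a regex of store paths to exclude from pushing, which should let you keep the ISO out of the cache. The regex below is a guess at what the ISO store path looks like and would need adjusting:

```yaml
  - uses: cachix/cachix-action@v12
    with:
      name: my-cache                          # placeholder cache name
      authToken: ${{ secrets.CACHIX_AUTH_TOKEN }}
      # Regex of store paths NOT to push; adjust to match the ISO derivation.
      pushFilter: "(\\.iso$)"
```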
Force-pushed from cfcddba to 0ec7bef
The ISO should be reproducible and have the following SHA256 hash: TODO
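The check itself is just a hash comparison between two independent builds. A minimal sketch, where the file names are stand-ins for the real ISO artifacts produced by CI and a local build (assumptions, since the actual output paths aren't shown here):

```shell
#!/bin/sh
# Reproducibility check sketch: hash the ISO from two independent builds
# and compare. ci.iso / local.iso are placeholders for the real artifacts.
set -eu
printf 'iso-bytes' > ci.iso       # stand-in for the CI-built ISO
printf 'iso-bytes' > local.iso    # stand-in for the locally built ISO
ci_hash=$(sha256sum ci.iso | awk '{print $1}')
local_hash=$(sha256sum local.iso | awk '{print $1}')
if [ "$ci_hash" = "$local_hash" ]; then
  echo "reproducible: $ci_hash"
else
  echo "MISMATCH: $ci_hash vs $local_hash" >&2
  exit 1
fi
```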
Force-pushed from 0ec7bef to 2df8e7e
Well why not 🤷 Let's see if we hit any time or storage limits 🙃