Some things work better without CI

Reject "DevOps", embrace "just running stuff"

Hello! Hi! I've been exhaustingly busy with a paper these last few weeks, but I've handed it in for peer review now, and while I'm not out of the woods with work just yet, I figured I could squeeze in a little blog post, as a treat.

You probably didn't notice, but I've migrated this blog from GitHub to Codeberg, which offers a similar, though slightly more flexible, pages service for static website hosting. It is more flexible in its domain settings in particular: GitHub Pages didn't allow me to use leftfold.tech as a domain while I was already using it for email. That's why I had to route leftfold.tech through my own VPS and set up a permanent redirect to www.leftfold.tech in my nginx config. Codeberg now does this for me.

I haven't said goodbye to GitHub officially yet, but I plan to get there by the end of the year. In any case, all new personal projects of mine will be created on Codeberg. Having joined the association and met some of the Codeberg people in person at GPN22, I'm also planning to get more involved with the project in general in the near future. That's a topic for another post (that I will probably write at the end of the year), though. This one is about a change I made in the setup for this website, one that I think made a lot of sense.

Friendship ended with GitHub Actions

The way I set up my blog on GitHub was fairly standard: I used an action that ran every time I pushed a change. That action would fetch external data for my now page and the RSS feeds from my blogroll, build the website, and finally upload it as an artifact to publish to GitHub Pages. To work with data from Spotify, I built quite the concoction in my action. It was perhaps a little complicated, but it worked: my blog was updated for every change I pushed, and additionally once every night. A lot of statically generated sites on GitHub use a setup like this: you push a change, GitHub builds and publishes the site.

So, what's bad about using actions like this?

The primary pain point for me was that it's slow. You just want to fix a small typo, but now you have to wait for GitHub to assign you a runner, set up the environment, perform any preprocessing steps (like fetching the external data in my case), build, and finally upload the site. And you'd better watch that little yellow circle, because otherwise you won't notice that something went wrong and your build failed. In that case, let's hope you don't waste too much time clicking through the logs in the clunky web interface. Rinse and repeat that whole process for every tiny thing you change.

But CI here is not only slow, it's also expensive. To recycle an old joke about the cloud™: "there is no CI – just someone else's computer". Your GitHub actions workflow runs in some Microsoft data centre in the middle of the desert, and includes inherent overhead by virtue of having to be scheduled, isolated, controlled, and cleaned up. And all that, just to turn Markdown into HTML? Really? Worse yet, only companies like Microsoft can afford free CI like this. Codeberg provides CI (upon request), but obviously an independent, non-profit association will never have the capacity to burn cash (and the planet) like Microsoft. Smaller players will never be able to provide such efficient CI. And it's still slow!

On top of this, there are a bunch of smaller problems with CI: it's difficult to test and reproduce locally, and you fall victim to vendor lock-in.

So, all this is kinda dumb, isn't it? Every time I want to update my website, I tell someone else's non-interoperable computer to run a bunch of commands... that I'm already running on my own computer, all the goddamn time. Why has this become our default way of thinking?

Now bash is my best friend

I know how to build my site (invoke the static site generator), and I know how to put static files on Codeberg pages (push to a branch named pages). I can do both on my computer(s), very fast.

So... why didn't I just write a script that does this?

  1. Build the site from the Markdown (or whatever else) by calling the static site generator
  2. Copy the output to the working tree of the pages branch
  3. Commit and push

To be honest, I don't know. It seemed rather obvious after thinking about it. But in the end, that's what I did: I wrote a bash script for it.
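The three steps above can be sketched roughly like this. This is a minimal sketch, not my actual script: the build command, output directory, and worktree path are placeholders, overridable through environment variables.

```shell
#!/usr/bin/env bash
# Sketch of a deploy-pages.sh. Everything configurable here is an assumption:
# set SSG_BUILD, SITE_OUTPUT and PAGES_WORKTREE to match your own setup.
set -euo pipefail

deploy_pages() {
  local build_cmd="${SSG_BUILD:-lein run}"            # your SSG's build invocation
  local site_output="${SITE_OUTPUT:-public}"          # where the SSG writes the site
  local pages_worktree="${PAGES_WORKTREE:-../pages}"  # checkout of the pages branch

  # 1. Build the site by calling the static site generator
  $build_cmd

  # 2. Copy the output to the working tree of the pages branch
  #    (clear out stale files first, but leave .git alone)
  find "$pages_worktree" -mindepth 1 -maxdepth 1 ! -name .git -exec rm -rf {} +
  cp -R "$site_output"/. "$pages_worktree"/

  # 3. Commit and push (the commit is skipped when nothing changed)
  git -C "$pages_worktree" add --all
  git -C "$pages_worktree" commit -m "deploy $(date -u +%FT%TZ)" || true
  git -C "$pages_worktree" push origin pages
}
```

Calling `deploy_pages` after a change rebuilds and publishes the whole site in one go.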

What about the external data?

Right, one small problem: my site building process isn't quite that simple (though yours might be!). One thing that previously wasted time and resources was the fact that the actions workflow was fetching data from Spotify, Bookwyrm and various RSS feeds every time it was executed. Ideally, I would cache the data somehow, so that it could be incorporated into every site build – and at the same time, it should all be fetched only once a day.

What I implemented is a separation into two steps: 1) fetch external data, and 2) build the site, passing the external data as an argument. These two things are now completely independent. The data-fetching step takes a list of sources to fetch (:webring1, :bookwyrm, :spotify, :forgejo), plus configuration for each of those sources (RSS feeds for the webring, the Bookwyrm and Forgejo instances, the Spotify user and secrets, ...). It fetches the required data from those sources and turns it into a fairly uniform format, which is then merged into a (possibly already existing) EDN file. The Clojure code for this is approximately 100 lines long. So, this task collects and/or updates the external data for my blog, and the data is ingested by the site-building step (you just give that step a file name). This way, I can fetch the data once per day and then simply reuse the resulting file for each build that day, which makes the build consistent, but also much faster.
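As a toy illustration of the split (the real version is about 100 lines of Clojure; the function names and stub data here are made up):

```shell
# Toy sketch of the two-step separation. The real fetch step merges Spotify,
# Bookwyrm and RSS data into an EDN file; here we just stub that out.
set -euo pipefail

fetch_external_data() {
  # step 1: fetch from all sources and merge into one data file
  printf '{:webring [] :bookwyrm [] :spotify [] :forgejo []}\n' > "$1"
}

build_site() {
  # step 2: build the site, taking the data file as an argument;
  # the build only reads the file, so it can reuse a day-old fetch
  echo "building site with data from $1"
}

data_file=$(mktemp)
fetch_external_data "$data_file"   # run once per day
build_site "$data_file"            # run on every change, reusing the same data
```

The point is that the build never fetches anything itself; it only reads whatever data file it is handed.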

To do the periodic data and site update, I use a cron job on my own VPS, and then serve the file containing the most recently fetched external data via HTTP from there. This is where the "no CI" thing falls apart a little bit: running something on a schedule cannot reliably be done on my local machine. Instead of using a VPS, I probably could have implemented this as a CI job that simply runs my bash script on a daily schedule and uploads the data file somewhere. Since I already have a VPS, though, and I don't need tight integration with my Git repository for this task, I decided to set it up the way it's set up now.
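The schedule itself can be a single cron entry on the VPS. The script path and log file below are illustrative, not the actual paths from my setup:

```
# m  h  dom mon dow  command
15   3  *   *   *    /home/blog/fetch-external-data.sh >> /home/blog/fetch.log 2>&1
```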

The magic of local execution

So, in order to build and publish the current state of my blog to Codeberg pages, I can simply run ./deploy-pages.sh now. Even better: I can have this happen automatically on every push by registering this script as a pre-push hook in Git. My pages deployment is therefore updated every time I push a change... and it all runs instantly on my own computer (🤯).
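Registering the script as a pre-push hook boils down to a symlink into .git/hooks. Here's a runnable demonstration in a throwaway repo, with a stand-in script in place of the real deploy-pages.sh:

```shell
set -euo pipefail
repo=$(mktemp -d)
cd "$repo"
git init -q .

# stand-in for the real deploy-pages.sh
printf '#!/usr/bin/env bash\necho "deploying to pages..."\n' > deploy-pages.sh
chmod +x deploy-pages.sh

# the actual registration step: symlink the script into .git/hooks
# (the ../../ is resolved relative to .git/hooks/, so it points at the repo root)
ln -sf ../../deploy-pages.sh .git/hooks/pre-push

# Git now runs the script before every `git push`; invoke it directly to check:
.git/hooks/pre-push   # prints: deploying to pages...
```

In a real repo, only the `ln -sf` line is needed, run once from the repository root.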

Meme: Friendship ended with GitHub Actions, now bash is my best friend

Okay, maybe not best friend

I won't pretend that bash is particularly easy, or safe, or even good. There is an astonishing number of pitfalls. For example, one I ran into myself writing the script: did you know that bash2 disables errexit (fail fast) behaviour in functions and subshells as soon as you call them in an if statement? No? Well, it does, and I still have no idea why.

$ ( set -e; false; echo hello ) # exit 1
$ if ( set -e; false; echo hello ) ; then echo world ; fi # exit 0
hello
world

Here are two things that make bash convenient, though:

  • it's available almost anywhere
  • calling external programs doesn't involve any ceremony, you just write out the command

It's basically just writing out what you would normally type, command by command, in your terminal. Thus, if the point of your script is simply to automate a sequence of commands, what bash offers is good enough. And this was the case for me. If you have an alternative shell scripting language that you think provides a better way to "automate sequences of commands", do let me know; I'd love to learn something that isn't quite as broken as bash. Until then, I will keep watching people use bash as a device for self-torture.

A complete tutorial

To play around with Codeberg pages and static site generation, I wrote a tutorial that is set up using its own instructions. It explains how to get started on Codeberg, how to set up a repo for a static site generator, and the automatic deployment stuff. I think it turned out pretty well. (I also used it as an opportunity to finally try out a "classless" CSS library, and it worked decently well with some adjustments.)

I used eleventy to create the page for this tutorial, and it was a great experience. If I have another one of these "I want to put a single web page for something online"-moments in the future, I'll probably use eleventy again. It's also much faster than cryogen, the SSG powering this blog, so updating the tutorial website is incredibly fast, compared to CI:

GIF of me typing "git push -f" in a terminal in the pages-tutorial repo, then seeing a bunch of output from the pre-push hook that deploys to Codeberg pages. The whole thing is done after about 3 seconds.

Check out my tutorial if you're interested in setting up something like a blog or any other statically generated website on Codeberg pages! If you have a little bit of bash knowledge, you can definitely adapt the script to work with other "pages" services like GitHub, GitLab, SourceHut, ... as well.

That's all folks!


  1. I've switched from openring to frenring (my fork) for my blogroll recommendations, mainly because frenring only returns data instead of rendering HTML, but also because I like its randomisation approach a bit better.

  2. technically POSIX shell, but, you know, whatever.


Comments

Comments for this post are available on chaos.social. If you have an account somewhere on the Fediverse (e.g. on a Mastodon, Misskey, Peertube or Pixelfed instance), you can use it to add a comment yourself.
