Public Digital isn’t usually in the business of making and running software, or of creating our own tools where simple cloud-based alternatives exist. But a few years ago, to support our Signals publications, I decided to exercise those muscles and did just that. This is Public Digital’s URL shortener, a tool which preserves user privacy and is open source. I realised we’d never really explained it, so here’s the background.

Most of PD’s writing is born on the web. It’s full of links to examples, more explanations and other interesting stuff. That’s great when we’re writing for the web, but doesn’t work so well when we’re printing books and people have to type in URLs instead of clicking links.

The answer we came to, as many others have before us, is to use a URL shortener: a service that lets us turn, say, https://en.m.wikipedia.org/wiki/Conway%27s_law into the much easier-to-type https://pdlink.co/conway.

In doing that, we had three things we wanted to achieve beyond simply providing URLs that were easier to type and looked better in print. We wanted to:

  1. use our own domain name, so that we could change the underlying URL shortener service we used without breaking the links we’d published
  2. preserve our users’ privacy as much as possible, giving us some sense of whether people were using it but without helping a third party track people around the web
  3. give ourselves as little admin and operations work to do as possible

Our assumption was that we’d buy a third-party service that let us host our own domain name and all would be good. But that fell apart at goal number 2. Maybe there’s a great URL shortening service out there that provides for privacy, but if so it’s not easy to find. If you find it, please let me know.

So there was a trade-off to be made. Should we compromise on #2 or #3?

Building a custom application

A few years ago, we would have needed to build our own application to handle the redirects. The code for that is pretty simple and easy to get started with. But it would need a server to run on, and ideally some sort of cache or content delivery network to keep it fast and make it a bit more reliable.
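To give a sense of scale, here’s a minimal sketch of the kind of redirect handler such an application would need. It isn’t our code; the mapping, port and choice of a 301 response are purely illustrative:

```python
# A minimal sketch of a redirect application, assuming an in-memory mapping
# of short paths to destination URLs. Purely illustrative, not our code.
from http.server import BaseHTTPRequestHandler, HTTPServer

REDIRECTS = {
    "/conway": "https://en.m.wikipedia.org/wiki/Conway%27s_law",
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = REDIRECTS.get(self.path)
        if target:
            # A 301 tells browsers the short link permanently points at the target
            self.send_response(301)
            self.send_header("Location", target)
        else:
            self.send_response(404)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8000), RedirectHandler).serve_forever()
```

The application itself really is that small; the work is in everything that surrounds running it.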

That would immediately give us some operational work to do. Not only would we need to set up those pieces, but we’d need to make sure we were keeping each of them up to date with security fixes, and so on.

We’d then need some sort of tool for people to update the URLs, which in turn would require some way of checking that they were authorised to do so. And we’d need some sort of monitoring tool, a way to notify people if something broke, and of course people who knew what to do.

In reality, none of that is all that complicated, but doing it well requires a lot more attention and operational work than it might seem. That’s not what we’ve set Public Digital up to do; it’s one more thing to keep an eye on, and it’s not big enough to be worth paying someone else to do for us. But on the other hand, we’d rather not ask our users to sacrifice a little privacy just to follow a link we’d shared.

How we built it

I decided to look into how easy it would be to provide such a service using Amazon Web Services (other cloud providers are available; AWS is just the one I’m most familiar with), and it turned out it’s become incredibly easy.

It can be done without running any custom software at all, just by wiring together Amazon’s content delivery network (CloudFront, a service that caches content and directs web requests) and its file storage service (S3), which can redirect requests for an object to another URL. Providing Amazon doesn’t make any major changes to how they work, there’s no maintenance required, just a tiny bit of custom code to get things set up. Here’s the documentation for the feature of S3 we used.
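To make that concrete (this is a sketch of the idea, not our exact setup; the bucket and key names are made up), creating one short link amounts to putting an empty object into an S3 bucket that’s configured for static website hosting, with a redirect location attached:

```python
# Sketch of the S3 feature we rely on: an empty object whose website-redirect
# metadata points at the real URL. Bucket and key names are illustrative.
import boto3

s3 = boto3.client("s3")

s3.put_object(
    Bucket="example-short-links",  # a bucket configured for static website hosting
    Key="conway",                  # i.e. the "conway" in pdlink.co/conway
    WebsiteRedirectLocation="https://en.m.wikipedia.org/wiki/Conway%27s_law",
)
```

S3 only serves that redirect through the bucket’s website endpoint, so CloudFront is pointed at that endpoint and handles the caching and the custom domain.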

For a while that was fine. I did a bit more work to make sure I wasn’t the only person with access to the infrastructure or code, just in case, and it worked pretty well for a couple of years.

A bit more work to make things simpler

Around the time I was setting up the URL shortener, GitHub Actions emerged as a way to make it simple to run code when certain events happen. GitHub is the company whose tools we use to manage the small bits of code we have produced, and some of our reports.

What did that mean for us? It meant that we could set things up so that when anyone edits the file containing the URLs we want to create, GitHub runs the code for us to put them in place. That means all we need to do is give one of our team access to update that file, and everything else falls into place. Once you have access, it’s about as simple as updating and saving a spreadsheet.
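As a rough illustration (the file name, format and bucket here are assumptions, not necessarily what our repository uses), the workflow just needs to run something like this whenever the links file changes:

```python
# Sketch of the sort of sync script a GitHub Actions workflow could run
# when the links file is edited. File name, format and bucket are assumptions.
import csv
import boto3

s3 = boto3.client("s3")
BUCKET = "example-short-links"

with open("links.csv", newline="") as f:
    # Assumed format: one "short_name,target_url" pair per line
    for short_name, target in csv.reader(f):
        s3.put_object(
            Bucket=BUCKET,
            Key=short_name.strip(),
            WebsiteRedirectLocation=target.strip(),
        )
```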

There are, of course, still some things that can go wrong and require some technical intervention, but we’re using some very simple building blocks from two well-established companies to achieve our goals.

The result? We’ve got a neat little URL shortening service that does what we need, and I don’t need to worry much about keeping it working. If you want to try it for yourself we’ve made the code available.

Tools need teams and skills

No-one at Public Digital is paid to spend their time writing code, but within our team we have people close enough to it that we can keep track of how “what’s easy and what’s hard” is changing.

That’s not just a matter of certain technology becoming commoditised, but of how the constant reshaping of technology that goes with it changes the options available to us. I wrote about this in more detail in Why internet-era CTOs hire developers.

Having enough of those skills in the organisation is essential. It lets us check our assumptions before worrying too much about trade-offs in technical decisions. It also lets us spot how some of the things we were previously reluctant to do might have changed, and save a lot of worry by just doing a quick experiment.