Donald Knuth was onto something when he coined the phrase "premature optimisation". He was talking about software development, but the idea that it's counterproductive to streamline a process before you've understood it applies to a range of other contexts. I've noticed myself doing it even when choosing which tools to use.

A few years ago I learned about ack, an alternative to grep for searching text files. I loved it! But then I read that another tool was faster, so I switched to it. It lacked some of ack's useful syntax and features, but I learned to make do.

I recently started using the ranger console file manager. It's written in Python, so I assumed a couple of alternatives written in Go and Rust would be better. They were certainly faster, but they lacked some of the features I relied on. I learned to make do.

A long-running personal project had me using OPML, an XML-based file format for outlines. RDF schemas and data structures are more feature complete and don't carry OPML's limitations and quirks, so I went down the rabbit hole of switching to RDF. I didn't have fun.

I've done this with Perl modules, text editors, cameras, coffee machines, even apartments. In each case I optimised for a specific quantitative metric, to the detriment of others that would have made my life easier. Did the added performance, features, megapixels, or steam pressure help me? The fact that I'm posing it as a rhetorical question is only slightly less redundant than this sentence pointing it out.

Ack, ranger, OPML, and my AeroPress are slower or more limited than other tools in their class. But what they offered me in return was something greater.