Last month Bruce Schneier summarised Apple’s ill-conceived iCloud image scanning technology thus, and I haven’t been able to get it out of my head:

This was a bad idea from the start, and Apple never seemed to consider the adversarial context of the system as a whole, and not just the cryptography.

Whether or not you agree with Bruce’s assertion, the outcome is the same. And I see the same train of thought (or lack thereof) that he’s describing everywhere in IT. There’s this insular, prevailing attitude that if you build the tech, people will come. Or worse, that you don’t need to consider externalities at all, because the tech can justify itself and stand on its own.

So much of the Internet, from tech journals and news sites to social media and aggregators like Hacker News, Lobste.rs, and Reddit, spends its time talking about the technical merits of a system, to the point where ethical, moral, or business discussions devolve into technical nitpicking and yak shaving. I liken it to not seeing the forest for the trees, and it’s beyond tedious.

(It was the other reason, aside from spam, that I turned off blog comments a decade ago. We’ll have a cure for cancer one day, and a kiasu will complain that the peer-reviewed paper didn’t have its LaTeX fonts exported properly.)

Those of us in this industry don’t have the luxury of theoretical physicists or Scott Morrison’s speechwriters. We have to live in the real world, where our technical decisions have an impact on people’s lives. Burying one’s head in the sand and falling back on a technical detail is no longer tenable.

What concerned me about Apple’s decision, even if since suspended, was that even a layperson could see the incoming ethical trainwreck and the threat to people’s safety it represented. Coming from one of the few large companies talking seriously about privacy, it represented a breach of trust. Few things are as easy to lose, or as hard to earn back.