Accounting for empathy in automated systems


You know the online refrain, “facts over feelings”? Aside from unintentionally betraying a lack of emotional intelligence, it posits that a perfectly logical world without human affordances is possible, expected, or desirable. The first two are false; we’re not robots. The third belongs in the purview of psychopaths and handsome fictional detectives.

These considerations are only becoming more important as we delegate more decisions to machines. I’m relieved to see more discussion surrounding how such systems were programmed, what datasets they were trained on, and the overt and subconscious views of their developers, regardless of intentions or motives.

As the industry is starting to realise (albeit at a glacial pace!), algorithms don’t remove biases; they entrench them. Anyone who still doesn’t recognise that either hasn’t dealt with someone on the receiving end of such automated decisions, or needs to broaden their professional horizons. The same goes for those pushing so-called smart contracts, which are neither.

You can probably tell I’ve been thinking about this a lot lately! A perfect example came up in a newspaper last weekend: an article about a horrific road accident included an inline advertisement for a car company. I’m sure you can think of plenty of other examples where boilerplate and automated systems conflict with visceral human emotions like this.

For those in the back, or who have emailed me obtuse comments in the past: you don’t want to be told about cars if your loved one has just died in one.

An online advertising (“adtech”) insider once told me that such systems include “brand safety” mechanisms… a name which speaks volumes about the industry’s priorities, but that’s for another post! It at least acknowledges these facts:

  • Ads are based on keywords.
  • A negative article will still mention cars.
  • Ads are bid on, and can be considered pseudorandom.
  • People shouldn’t let advertisements affect their emotions.

Which conflict with these feelings:

  • Negative associations between car fatalities and cars.
  • Ill will against the publisher, writer, and advertiser.
  • A backfire effect, where shoppers won’t want to buy from them.
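To make the mechanism concrete, here’s a toy sketch of how keyword-based ad matching with a naive “brand safety” veto might work. Every name here is hypothetical, and real adtech pipelines involve bidding, context models, and far more nuance than a word blocklist:

```python
# Toy sketch of keyword ad targeting with a "brand safety" veto.
# All names and the blocklist are hypothetical illustrations.

BLOCKLIST = {"accident", "fatality", "crash", "died"}

def is_brand_safe(article_text: str, blocklist=BLOCKLIST) -> bool:
    """Return False if the article mentions any blocklisted term."""
    words = {w.strip(".,;:!?").lower() for w in article_text.split()}
    return blocklist.isdisjoint(words)

def should_serve_car_ad(article_text: str) -> bool:
    # Keyword targeting alone would happily serve the ad here;
    # the brand-safety check is the only thing that vetoes it.
    mentions_cars = "car" in article_text.lower()
    return mentions_cars and is_brand_safe(article_text)
```

The irony, of course, is that even this crude veto is there to protect the advertiser’s brand, not the grieving reader: an article mentioning a car but no blocklisted words sails straight through.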

Time was, newspapers had editors who’d spot unfortunate associations and exercise their discretion to ensure they didn’t go to print. That hasn’t been true online for a long time.

I’m not sure how we scale empathy and human oversight into automated systems… part of me fears the horse has long since bolted. But we should spend much more time thinking about how these systems affect people in the real world. Sometimes it’s worth reminding ourselves that computers aren’t an end in themselves; they’re supposed to be serving us!

Like an economy, now that I think about it.

Author bio and support


Ruben Schade is a technical writer and infrastructure architect in Sydney, Australia who refers to himself in the third person. Hi!

The site is powered by Hugo, FreeBSD, and OpenZFS on OrionVM, everyone’s favourite bespoke cloud infrastructure provider.

If you found this post helpful or entertaining, you can shout me a coffee or send a comment. Thanks ☺️.