Last Saturday I quoted a journalist saying social media algorithms aren’t “inherently bad and problematic”. I mentioned that, despite finding them useful myself, it shouldn’t be surprising that others have reservations:
There’s plenty of evidence, from search engine “bubbling” to radicalisation, that they can cause problems. Transparency is the other big issue.
My position has only solidified in light of Zuck’s Papers. I cannot overstate this: absolutely nothing his company has done has ever surprised me, and any journalist who claims to be shocked should resign and go farm turnips. But the leaks reinforce what we’ve long suspected.
Susan Benesch of George Washington University agrees, arguing in The Atlantic (paywall) that the companies themselves are opaque:
The decisions that their employees and their algorithms make about what to amplify and what to suppress end up affecting people’s well-being. Yet the companies are essentially black boxes.
Susan is working with other academic researchers on “initiatives that would guarantee the sharing of key information”. I’m relieved people are thinking about this; everything from our health to our democracies is at stake.
I’m not going to talk about the company’s renaming, or any of the furore or painfully unfunny memes that I wish people would stop sharing. I just want this despicable company to vanish up its own posterior, so we don’t need to waste any more mental CPU cycles on them. I’d rather eat turnips… which says a lot.