It’s not (always?) the algorithm
One of my favourite Techdirt writers, Mike Masnick, thinks we’re barking up the wrong tree when we wholesale dismiss social media algorithms:
But underlying all of this is a general opinion that “algorithms” and “algorithmic recommendations” are inherently bad and problematic. And, frankly, I’m confused by this. At a personal level, the tools I’ve used that do algorithmic recommendations (mainly: Google News, Twitter, and YouTube) have been… really, really useful? And also pretty accurate over time in learning what I want, and thus providing me more useful content in a more efficient manner, which has been pretty good for me, personally.
I recognize that not everyone has that experience, but at the very least, before we unilaterally declare algorithms and recommendation engines as bad, it might help to understand how often they’re recommending stuff that’s useful and helpful, as compared to how often they’re causing problems.
I’ve had the same positive experience; I wouldn’t know half the engineering YouTube channels if the algorithm hadn’t recommended them to me.
I’m willing to entertain the idea that some of this frustration is misplaced, and that it’s the fault of things like the financial model of social media platforms, or even us as willing participants. There are plenty of other issues at play here too. But I don’t think it’s confusing why people also have reservations about algorithms in general. There’s plenty of evidence, from search engine “bubbling” to radicalisation, that they can cause problems.
Transparency is the other big issue. The world’s biggest search engine likes to talk up how open they are, and spruik their standards cred (or at least, they used to in a pre-AMP world), but all of these algorithms are black boxes. We can only speculate on their internal machinations, and judge them on their output. They work well for some of us, but at best they’re a mixed blessing, and I think it’s fair to question their effect at scale.
Mike points to Facebook having made more money since they stopped using specific news algorithms, which I’ll have to take his word on. But social media algorithms in general do seem to favour clickbait and polarising views, which creates perverse incentives for creators.
Ann Reardon of How to Cook That continues to provide an interesting perspective here, precisely because she isn’t technical. You and I can attempt to think about how these systems work, but it’s also good to hear how people in the real world (especially those who create media) live with these things. Algorithms shouldn’t punish people for being honest.