
The revolution will not be televised because my television has been radicalised

When recommendation engines promote misinformation during a pandemic, it's a matter of life and death

Column My television is trying to radicalise me with an endless stream of recommendations to watch videos from a mainstream media outlet that deliberately inhabits the outer reaches of the political opinion spectrum.

Its content just isn't my cup of tea. I find it errant, offensive, braying, thoughtless and, well, just stupid.

I know we're all supposed to be working on ways to, in the words of the President-elect, 'see one another', and I know that a broad media diet is the healthiest way to live in the world. But it would take generous funding from a like-minded think tank to publish content more radical than the stuff my TV wants me to watch.

So I'd like my TV to stop this, but I can't figure out how.

I don't think the problem is with me. I'm fairly technically competent, I understand how recommendation algorithms collect their data and feed it back to me, and I can Google around for an answer to a question – in this case, how do I prevent this particular media source from showing up in my recommendations?


That sounds so simple, and according to the Internet – that eternal font of wisdom – the answer is equally straightforward: just block the source, and voilà, your recommendations will sort themselves out. So I did that.

And nothing changed.

Before I tried blocking this source, I'd gone through the laborious process of labelling every one of its videos as not to my taste each time one appeared in my recommendations.

That didn't work, either.

So this recommendation system is clearly flying according to its own desires, while blithely ignoring mine. My television has been hijacked. I don't think this is a bug – but that's not the same thing as saying it's intentional. Instead, I suspect this is the emergent outcome of a confluence of decisions made to square the needs of both commerce and user experience.

Somewhere along the line someone decided that this media source deserved equal billing with less feral voices, and that in the interests of perceived fairness all of them should be presented as equal and equally appropriate for viewers. That this system doesn't allow you to pick and choose your news sources may give it the illusion of fairness, but all news sources are not created equal. They have different values, different editorial standards – and different audiences. Chalk and cheese – not in any way equivalent.


And there's another dimension to this: the source actively promotes disinformation, so someone consuming its output is likely to end up less well informed about the state of the world – a precarious situation that has become increasingly visible during the pandemic.

And while much of the debate about media today is a culture-wars skirmish or an attempt to claim political high ground, we all now face life-and-death decisions concerning public health, vaccinations, and economic recovery. To add a false voice right now seems both a failure to read the room and potentially quite dangerous.

So what do I do about my television? It's just a window onto a vast cloud of servers and content and recommendation algorithms. It can't fix this problem. Neither, it seems, can I.

And it's not just me. One of my friends found his partner – newly arrived in this country, and not yet across the various polarities in Australia's media landscape – watching video after video from this provider, as if it was all perfectly normal. That's the trick here: a wolf in sheep's clothing, looking just like all the other grazers – until it takes a bite out of you.

This seems to be happening at scale in many places – an invasion by algorithm, something that has not been discussed, or debated, or put to a vote. Given that we increasingly rely on algorithms to feed our attention with things that we need to know, we need to have a lot more transparency and control over these algorithms. We need to be able to block – and to permit. And we always, always need to know why something is in our feeds. We need to know the chain of logic and association and algorithm and corporate interest that presented us with this steaming pile of video content.

If we don't, we'll soon find our minds stuffed full of it. ®
