Ross: The Facebook algorithm is your friend
Facebook’s Vice President of Global Affairs, Nick Clegg, made the rounds on the Sunday shows over the weekend, promising that Facebook has gotten the message. To make sure that it’s not making your children miserable, the company is going to nudge them.
“We’re going to introduce something, which I think will make a considerable difference, which is where our system sees that a teenager is seeing content over and over again, and it’s content that may not be conducive to their well-being, we will nudge them to look at other content,” he vowed.
But in an ABC interview, George Stephanopoulos asked him – why does this have to be so complicated? Why not just drop the algorithms that are getting users addicted, and go back to simply displaying posts in the order they were posted?
At which point Clegg basically told George, you know not what you ask for.
“If you were to across the board remove the algorithm, the first thing that would happen is that more people would see more hate speech, more (not less) misinformation, more (not less) harmful content,” he cautioned. “Why? Because those algorithms are precisely designed like a great, sort of, giant spam filter to identify, and deprecate, and downgrade bad content.”
He’s saying that, despite allegations that Facebook doesn’t care, the company’s algorithms have in fact been locked in hand-to-hand combat with the millions of online monsters who want teens to hate their bodies, and want their parents to hate their political opponents.
And if he’s right, it would mean that the real problem isn’t Facebook, or its algorithms, but the sheer number of evil content providers!
So what can we do about it?
Pre-Facebook, if you had evil content providers living next door, you could ignore them, or build a higher fence, or buy a pair of pitbulls to scare them into moving away.
But in your online life, distance is meaningless, and so we are just going to have to figure out what’s got so many people pounding away at their keyboards, churning out whatever toxins will get clicks.
Maybe it’s this system of paying for content just based on the clicks!
I know that Facebook is happy to slap a label on potentially toxic stories. But suppose it also withheld the paycheck from whoever wrote the toxic story, regardless of how popular it was?
Then Facebook wouldn’t have to be nudging people so much.
Listen to Seattle’s Morning News weekday mornings from 5 – 9 a.m. on KIRO Radio, 97.3 FM. Subscribe to the podcast here.