Technically, Facebook can’t suspend people’s accounts just for sharing 50-plus false, sensational, or clickbaity news articles per day. It doesn’t want to trample anyone’s right to share. But there’s nothing stopping it from burying those links low in the News Feed so few people ever see them.
Today Facebook announced an algorithm change that does just that. It detects links that are aggressively shared by suspected spammers and deprioritizes them in the News Feed. Facebook’s research shows that these links “tend to include low quality content such as clickbait, sensationalism, and misinformation,” so showing them less prominently could improve the quality of what people see on the social network, even though the change analyzes sharing behavior rather than the content behind the links.
If you just like to post a lot on Facebook, this shouldn’t affect you. Pages are free to post as much as they want since this change only scrutinizes individual user accounts. And Facebook says only Pages that depend on these spammers for traffic will see a drop in their distribution.
That’s because Facebook’s VP of News Feed Adam Mosseri tells me it’s targeting “People who are sharing deliberately and purposefully . . . who purposely share vast quantities of things. 50-plus posts a day. Very significant statistical outliers. People having an outsize impact, inundating the News Feed with public content because they have some sort of agenda.”
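Facebook hasn’t published how this detection works, but the behavior Mosseri describes can be sketched as a simple two-step heuristic: flag accounts whose daily share count is an extreme outlier, then scale down the feed score of links shared mostly by flagged accounts. Everything here — the function names, the 50-posts-a-day cutoff taken from Mosseri’s quote, and the demotion weight — is an illustrative assumption, not Facebook’s actual code.

```python
# Hypothetical sketch of the heuristic described in the article:
# flag outlier sharers, then demote (not remove) links they push.
# Threshold and demotion weight are illustrative assumptions.

SPAMMER_DAILY_THRESHOLD = 50  # "50-plus posts a day" per Mosseri's quote


def flag_spammers(shares_by_user: dict) -> set:
    """Return user ids whose daily share count meets the outlier threshold."""
    return {u for u, n in shares_by_user.items() if n >= SPAMMER_DAILY_THRESHOLD}


def demotion_factor(link_sharers: list, spammers: set) -> float:
    """Scale a link's feed score by how much of its sharing came from
    flagged accounts. The link stays visible; it just ranks lower."""
    if not link_sharers:
        return 1.0
    spam_fraction = sum(1 for u in link_sharers if u in spammers) / len(link_sharers)
    return 1.0 - 0.8 * spam_fraction  # 0.8 is an arbitrary demotion weight


# Toy data: bob and carol share far above the threshold, alice does not.
shares_today = {"alice": 3, "bob": 120, "carol": 75}
spammers = flag_spammers(shares_today)
score_multiplier = demotion_factor(["bob", "carol", "alice"], spammers)
```

Note that this matches the policy in the article: Pages aren’t scored at all here, and a link shared only by ordinary users keeps its full score (a multiplier of 1.0).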
Facebook began its ongoing battle with clickbait back in 2014 by demoting links to websites that people opened and then immediately bounced back from. Since then it’s changed the algorithm to show fewer hoaxes, trained AI to weed out clickbait and spam, combatted fake news with reporting options and fact checkers, demoted links to crappy ad-filled sites, and expanded its attack on clickbait to nine more languages.
You can see all of Facebook’s dozens of algorithm changes in our Ultimate Guide To How Facebook News Feed Works.
“We’re trying to do as much as we can to get false news, clickbait, sensationalism off our platform,” Mosseri insists. “We’re a platform that tries to empower people to share. And so these people are in a grey area. They’re spamming but not necessarily violating any specific policies that we have, so we think this is the right type of approach.”
If Facebook can banish this kind of content from its feed, people will spend more time reading and be more confident about what they click, and Facebook will be better able to fulfill its new mission statement to “bring the world closer together.”