Washington – If you use Facebook, you’re probably familiar with the sort of post shown in the image on the right. It’s called “like-baiting”, and it has run amok on the social network in recent years.
That’s because the software that determines what we see at the top of our Facebook news feed is tuned to show us posts that lots of other people have liked, shared or commented on.
Corporate brands, media outlets and other “content producers” have learned this, and many now use like-baiting to game the system.
People’s responses to like-baiting fall into three categories.
The first group resents the blatant manipulation and intentionally avoids clicking. The second group sees nothing wrong with this sort of post, happily clicks like, and moves on to the next cute bunny picture. (Note: This group is composed primarily of 8-year-olds pretending to be 13.)
The third group resents the blatant manipulation, but clicks like anyway – just to be safe, I suppose, or maybe because LOL nothing matters.
It’s that third group that has been ruining things for those of us in the first group. That’s because Facebook’s news-feed algorithms have a hard time telling the difference between a genuinely enthusiastic like and a slightly exasperated “oh fine, why not” like. But that may finally be changing.
In a blog post, Facebook has announced that it’s tweaking the news feed to crack down on like-baiting, along with other forms of “spammy” content. So when a post explicitly asks people to like, comment or share it, Facebook’s algorithms will punish the offending page by showing its posts to fewer people.
“This update will not impact pages that are genuinely trying to encourage discussion among their fans,” Facebook assures us. Instead, it “focuses initially on Pages that frequently post explicitly asking for likes, comments and shares”.
Facebook will also be taking aim at pages that routinely repost old photos and videos in a bid to squeeze more likes out of them. In addition, it will downgrade posts that Facebook users click on but then do not choose to like or share. Facebook assumes that these are cases of “click-bait”, in which a misleading headline or teaser photo leads people to a page that doesn’t actually deliver worthwhile content. For instance, Facebook explains: “Often these stories claim to link to a photo album but instead take the viewer to a website with just ads.”
You’d think those of us in the media would greet these changes with a sigh of relief. After all, less spam in users’ news feeds presumably means more room for content that’s legitimately interesting or worthwhile.
And making the system harder to game should ease some of the pressure that publishers feel to engage in manipulative tactics when they really do have something worthwhile to say.
This won’t spell the end of provocative headlines, of course. And it probably won’t do anything to address online media’s rampant BS problem. (If Facebook’s programmers ever figure out a way to punish posts for being wrong, then we’ll really have something to celebrate.) Still, this feels like a step in the right direction. And it’s part of a broader campaign by Facebook to show users more stuff they actually like.
A few pundits quickly hit on a counterintuitive take on Facebook’s move. Spam is good, argues GigaOm’s Mathew Ingram, spinning the like-bait crackdown as an example of the irksome paternalism inherent in Facebook’s news feed.
“Facebook is essentially saying ‘We’re not going to pay attention to what you do – we’re going to purify your news feed for your own good’.”
This is a bizarre argument from a commentator who usually makes a lot more sense. Either Ingram thinks that Facebook should show every single post to every user – essentially turning it into Twitter, which is a far less popular service partly for that very reason – or he thinks that Facebook should indeed filter your news feed, but only in a way that’s easy for spammers to game.
Either way, that take is wrong. Spam is bad, and Facebook is right to try to give us less of it. – Slate / The Washington Post News Service