A Facebook bug led to increased views of harmful content over six months
Alex Heath, writing for The Verge
A group of Facebook engineers identified a “massive ranking failure” that exposed as much as half of all News Feed views to potential “integrity risks” over the past six months, according to an internal report on the incident obtained by The Verge.
The engineers first noticed the issue last October, when a sudden surge of misinformation began flowing through the News Feed, notes the report, which was shared inside the company last week. Instead of suppressing posts from repeat misinformation offenders that were reviewed by the company’s network of outside fact-checkers, the News Feed was instead giving the posts distribution, spiking views by as much as 30 percent globally. Unable to find the root cause, the engineers watched the surge subside a few weeks later and then flare up repeatedly until the ranking issue was fixed on March 11th.
“Facebook has identified a ‘massive ranking failure’ that exposed up to half of all News Feed views to ‘integrity risks’ since last October, per internal report I obtained. Impacted distribution of misinfo, violence, nudity, and even Russian state media https://t.co/1AFgO0JSVN” — Alex Heath (@alexeheath), March 31, 2022
It really does sound like a bug, and some bugs really are devilishly tricky to track down and fix. But it seems a bit odd that it took Facebook six months to fix this one, given how intense the scrutiny of the company has gotten for the very problem this bug made worse.
One of the things I think about a lot is why problems like this one carry basically no repercussions for the companies that create them. In this case, the bug was made public only because someone leaked the internal report, and its possible consequences were significant — Heath writes that it “impacted up to half of News Feed views over a period of months”. But it does not matter, not really. Facebook’s reputation is already in the tank; it will not lose users because of this, nor will advertisers pull funds. It does not matter that Facebook increased the spread of bullshit instead of responsibly slowing it, apart from all the subtle ways it does matter that its massive user base was increasingly misinformed.
Facebook’s problem is not a failure of technology, nor a shortcoming in its AI filters. Its problem is its shitty business model, which profits chiefly from engagement and virality. Fuck Facebook!