Meta’s Dereliction

This will inevitably lead to very bad things.

In Facebook’s earliest days, it enabled college students to connect with their classmates. Eventually, Facebook grew to become a way for friends, family, and acquaintances to keep in touch. Back then, your newsfeed contained updates from and about people you knew, and most crucially, it had an end. It was possible to check in for a few minutes to see what was new with the people you cared about, and then move on with your day, or even your week.

That all changed when Facebook switched to an algorithmic feed. This shift from “social networking” to “social media” was made by all the major platforms, in an effort to increase “engagement” and thus sell more ads. Facebook’s feed went from providing news about people you knew to showing anything that might catch your eye and keep you there. Meta would be quite content to have you endlessly scroll their river of content for 18 hours a day. That’s the logical endpoint of a system slavishly devoted to capturing your attention.1

With an algorithm built to capture eyeballs, clickbait and ragebait bubble up, intriguing misinformation gains traction, and the Overton window shifts toward extreme and ill-informed opinions. There is a dire need for moderation to combat these issues. But yesterday, Meta announced they would cease fact-checking on their platforms. As one Wired headline put it, “Meta Now Lets Users Say Gay and Trans People Have ‘Mental Illness’”. There’s a feature for you.

Casey Newton has a good post covering this change in depth. My takeaway is that newsfeeds that were already pretty bad are going to get worse. I would go so far as to suggest that algorithmic newsfeeds probably shouldn’t exist at all. Alas, they do, and they’re popular with users.3

Algorithmic newsfeeds can’t be uninvented, and it seems unlikely they’ll be regulated out of existence. But if we must accept the presence of these algorithms, we can impose a duty of care on their creators. Massive platforms can be required to work to avoid things that foreseeably harm others. We know that the amplification of hate speech can do exactly that. From Newton’s aforementioned piece:

	In 2018, the United Nations found that Facebook and social media had played a key role in accelerating the Rohingya genocide in Myanmar. “Facebook has been a useful instrument for those seeking to spread hate, in a context where, for most users, Facebook is the Internet,” the UN concluded.

That is surely not an isolated incident, and I’m concerned for the future. I’m also left to wonder what Mark Zuckerberg, the recipient of the world’s first rat penis transplant, hopes his legacy will be.


Footnotes:

  1. The reason they wouldn’t want you on there for 24 hours a day is that if you die from lack of sleep, your usage will drop way down.2 ↩︎

  2. On the other hand, if hell is real, it’s probably 24/7 Facebook scrolling, so maybe usage increases after you die. ↩︎

  3. Then again, so is heroin. ↩︎