The company’s own research reveals that Instagram harms teens, that it can’t control anti-vax misinformation, and that there is a secret double standard for VIPs. In short, the problem with Facebook is Facebook.
‘Working for Facebook these days must be a crushing moral and social experience.’ Photograph: John G Mabanglo/EPA
For years, Facebook has faced torrents of criticism from human rights groups and academic researchers, who raised alarms about the ways that the most pervasive digital social platform in human history distorts our world and promotes destructive behavior ranging from eating disorders to genocide. In response, Mark Zuckerberg and his staff have repeatedly announced commitments to reform.
While many of those pledges and predictions seemed sincere, it turns out not only that the architecture and incentives built into Facebook itself have undermined the biggest efforts to fix the service, but also that Facebook’s own research staff have informed top leadership of the company’s stunning failures.
This week the Wall Street Journal has run an eye-opening series of articles, based on internal studies and documents leaked by Facebook researchers, revealing just how duplicitous or naive Zuckerberg is about his own company and its influence on the world.
In one piece, the Journal revealed that Facebook maintains a private registry of very important people, including celebrities and politicians, who are exempt from the strict content-posting rules that govern the rest of us.
A second article was even more powerful in its indictment of Facebook and its leadership. The Journal showed that Facebook’s own researchers had documented the psychological dangers that Instagram, which Facebook owns, poses to teenagers, especially teen girls.
Here’s how Facebook’s internal documents and presentations put it: “We [Instagram] make body image issues worse for one in three teen girls,” and “Teens blame Instagram for increases in the rate of anxiety and depression. This reaction was unprompted and consistent across all groups.” Internal studies showed that, among teens who reported suicidal thoughts, 13% of British users and 6% of American users traced the desire to kill themselves to Instagram.
So Facebook’s leaders knew their service was harming people yet refused to publicly acknowledge it or do much about it. Clearly, the health of teenagers does not concern those who run that company. In March, Zuckerberg told a congressional hearing: “The research that we’ve seen is that using social apps to connect with other people can have positive mental-health benefits.” He offered no such research. And he presumably knew that the truth was just the opposite.
In the latest revelation, the Journal reported that after Zuckerberg promised to promote legitimate information about Covid-19 and vaccinations, Facebook users who oppose vaccines or doubt the threat of Covid-19 did what Facebook users have been doing for years: they flooded the comments under otherwise factual posts with false and destructive claims about health. The result was cacophony at best, successful propaganda at worst. The swarm of anti-vaccine commenters, who are more highly motivated than most of us who trust medical science, undermined the pro-vaccination goal Zuckerberg had set.
The Facebook researchers who warned of this problem understood what the company’s top leadership seems to ignore or deny: the problem with Facebook is Facebook. Facebook is designed to prompt and reward engagement. Comments are its currency.
Posts that generate a lot of comments get promoted by the algorithms, and those comments themselves become part of the overall message of the post. Arguments break out. The more people bicker in the comments, the more prominent the post and the comments become. That’s why you can’t argue with evil, ignorance or craziness on Facebook – it’s counterproductive. Unfortunately, posting reasonable, solid information is also, in a sense, counterproductive: either such posts reach almost no readers, because they don’t provoke the irrational or unfounded responses that drive amplification, or they attract destructive and toxic responses, and those comments warp the message as Facebook picks it up and sends it to users’ news feeds.
Comments matter. Along with shares and likes, comments drive “engagement” with posts and profiles. Everything at Facebook is designed to maximize engagement – even more than revenue. If the company can get its 3 billion users to interact with content as long and as often as possible, revenue will take care of itself. As the sociologist Jeremy Littau has argued, we need better empirical analysis of the effects of Facebook comments on the platform’s overall communicative influence. We are only beginning to grasp the power that comments exert over Facebook’s system of algorithmic amplification – and over users.
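To see why comments carry such weight, consider a deliberately simplified sketch of engagement-weighted ranking. Facebook’s actual ranking system is proprietary and far more complex; the weights, field names and scoring function below are invented assumptions for illustration only.

```python
# A toy illustration of engagement-weighted feed ranking.
# Facebook's real system is proprietary; these weights and names
# are assumptions invented purely to illustrate the dynamic.
from dataclasses import dataclass

@dataclass
class Post:
    likes: int
    shares: int
    comments: int
    comment_replies: int  # back-and-forth arguments under the post

def engagement_score(post: Post) -> float:
    """Score a post for feed placement. Comments, and especially
    comment threads (arguments), count for more than likes, so a
    contentious post outranks a calmly informative one."""
    return (1.0 * post.likes
            + 5.0 * post.shares
            + 15.0 * post.comments
            + 30.0 * post.comment_replies)

# A measured public-health post vs a post swarmed by angry commenters:
calm = Post(likes=200, shares=10, comments=5, comment_replies=1)
contested = Post(likes=40, shares=5, comments=80, comment_replies=120)

print(engagement_score(calm))       # 355.0
print(engagement_score(contested))  # 4865.0 – the argument wins the feed
```

Under any weighting of this general shape, a post that provokes an argument in its comments will outrank a calmly informative one – which is precisely the dynamic Facebook’s researchers described.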
Comments within posts that only obliquely concern Covid-19 or vaccines can have profound influence as well. A post in a Facebook group devoted to parents or schools could, say, generate anti-vaccine comments that attract significant attention and engagement. A post about professional American football, where vaccination policies have sparked blowback from players and fans, might become a hotbed of anti-vaccine propaganda. But most researchers and Facebook’s own content moderation systems don’t seem very concerned with comments. Even a cursory glance at a Facebook page should convince us to pay attention to comments.
Many critics of big technology companies have been cheering as their disgruntled labor forces have risen up to challenge the rich, (mostly) white, (mostly) American men who design and run Facebook, Twitter, Google, Microsoft, Apple, Amazon, Oracle, Palantir and others. Labor uprisings have forced bosses to confront their poor treatment of women, their complicity with the military and intelligence establishments, the general threat of surveillance and the political affiliations of the companies and their leaders.
This week we saw the first of what will probably be a flood of internal documents and studies laying bare to the public just how bad things are inside the tech industry.
For years, the myth that these companies were making the world “better” served as a kind of non-monetary wage for workers. They could sleep well and smile in the mirror by believing that their services and devices were improving the human condition.
While many of us saw through that nonsense years ago, technology workers took a bit longer. But now they’re clearly ready to revolt, out of sheer disgust. Working for Facebook these days must be a crushing moral and social experience.
That said, the latest revelations once again show that there is not much hope for reforming the platform by changing its culture or design. A world with Facebook is going to be crueler, stupider, and more deadly than one without Facebook. But it looks like we’re stuck with it.