It’s good to remember that every time Mark Zuckerberg claims that he founded Facebook in order to connect people or build communities, he is somehow forgetting that he first created the site in order to enable himself and his fellow dorm-dwellers to rate Harvard’s young women on their looks. But then, Zuckerberg has never been the sharpest tool in the box. He once said that Facebook wouldn’t interfere with Holocaust denial on its service, because it was hard to impugn people’s motives for denying the Holocaust, before announcing a couple of years later that his “thinking” on the matter had “evolved” and that Holocaust denial was now frowned upon. Well, evolution does work slowly.
But as Charles Arthur’s coolly prosecutorial book shows, social-media algorithms don’t just allow people with nefarious interests to get together: they perform as active matchmakers. “Facebook was hothousing extremism by putting extremists in touch with each other,” concluded Facebook’s own internal investigations in 2016. Not only that, Facebook was “auto-generating terrorist content”: its “machine learning” systems created a “Local Business” page for “al-Qaida in the Arabian peninsula”.
Electronic social networks began with the dial-up bulletin boards of the 1980s, when hipster clubs such as the WELL (“Whole Earth ‘Lectronic Link”) came to be praised as model communities of the utopian future in books with the word “Cyberspace” in their titles. “A few things about the WELL’s discussion system would become axiomatic for almost all future systems,” Arthur points out. One of them was the fact that postings did not naturally expire. That architectural choice led directly to the modern phenomenon of “offence archaeologists” combing through people’s Twitter histories in order to publicly shame them for sins in the deep past, as recently happened to the cricketer Ollie Robinson.
The modern design of social media also psychologically encourages bad behaviour, including mass aggression. Chris Wetherell, the man who built the retweet function, now regrets doing so. And of “quote-tweeting”, or retweeting someone’s post with a (usually denunciatory) comment, Arthur writes amusingly: “The effect often resembled someone walking out on to a balcony to an adoring crowd and announcing, ‘You’ll never guess what this idiot just said on the telephone! Let me read it back to you!’”
Facebook, meanwhile, muscles into developing countries and strikes deals with mobile carriers to make its platform (but not the wider internet) free on phones. The result is that digitally inexperienced users assume that Facebook itself is the internet, and that everything on it must be true – a confusion Facebook actively encourages by terming its scrolling list of posts a “News Feed”. The results can be alarming, as Arthur shows in a chapter about Myanmar, where a UN fact-finding mission found in 2018 that Facebook had “substantively contributed to the level of acrimony and dissension and conflict … within the public”.
The deeper structural problem is that Facebook, Twitter and Google can hardly take consistent action over “misleading or unreliable” communications as long as they depend for their profits on advertising, the whole art of which is to be as misleading as possible within the confines of the law. As long as Facebook is not prepared to fact-check adverts (and, as anyone who uses it knows, it is infested with cynical pushing of quack cures for cancer and other dangerous garbage) it can’t be expected to fact-check political campaigns. Satirical researchers have found that, as an advertiser, it is possible to pay Facebook to target particular potential customers who have demonstrated an interest in “pseudoscience” or “vaccine controversies”.
And yet, as Arthur shows, the social-media giants could do more if they wanted to, as proved by their interventions in public messaging over Covid-19 harms and risks. (The writer Naomi Wolf was recently suspended from Twitter, having helped spread swivel-eyed nonsense about how standing near vaccinated people can make you sick.) Surveying ideas for tighter regulatory control in his conclusion, Arthur also recommends that we “make content sharing a little less easy”, and perhaps even break up the giants, just as the Standard Oil Company was broken up in 1911.
I was left unsure about the titular phrase as a description of the havoc that social media is wreaking upon our lives. Warmth, after all, has long been a social metaphor for something desirable: as when people speak warmly, or enjoy a warm friendship. (Indeed, according to some psychological research, loneliness makes you feel cold, and being cold makes you more lonely.) Perhaps, just as some now prefer to use “global heating” or “climate crisis” in the atmospheric context, we should think of social overheating or social boiling. In the meantime, feel free to share this article on Twitter.