On an average morning scroll I might happen across any or all of the following: videos of an event last night that I wasn’t invited to, an infographic explaining gaslighting, a meme about Boris Johnson, a photograph of a destroyed town in Ukraine, an advert for period pants, a model in a bikini, the wedding pictures of a friend’s cousin, a debate about whether race is a social construct, a screenshot of Gordon Ramsay covered in flour that has been repurposed for a joke about cocaine, a children’s choir singing to raise money for ALS research, a cat falling off a fridge, an advert for online therapy, an image of myself from six weeks ago uploaded by a friend in a “photo dump”, an advert for sunglasses, a Love Islander’s renovated living room, a women’s magazine article asking me if I am in fact in a toxic relationship with myself, and a reposted video from Buzzfeed Tasty that involves pushing raw minced beef into bread dough, covering it with mozzarella and somehow ending up with a birthday cake.
This content overload represents one side of the dichotomy that characterises social media: it is both heavily curated and totally chaotic. Our feeds are supposedly personalised, and yet scrolling through them often feels like an experience in which we are totally stripped of our agency. No matter how much use you make of the privacy settings, mute buttons and comment filters, you cannot have complete control over what you see on social media. This has only become more pronounced as platforms such as TikTok and Instagram increasingly use algorithms to push sponsored content, popular videos, new features or types of content that tend to encourage engagement (videos featuring faces, for example). And it only takes one piece of content to alter your mood, change your mind or derail your day.
I have been thinking about this recently in terms of consent. Putting aside social media’s addictive nature, we can say that we all choose to log on every day of our own free will. But once we enter that surreal, alternate world – which Patricia Lockwood in her novel No One Is Talking About This perfectly describes as “the portal” – we are at the mercy of someone, or something, else. Whatever the people we follow choose to post, and whatever the algorithm serves up to us, we witness. We may not pay attention to it but it’s there, in our minds, something we cannot unsee. We may have opted in to social media but we have not – could never have – consented to its specificities.
Last week the story broke of a woman who unwittingly starred in a viral TikTok. She was sitting in a café in a shopping centre in Melbourne when she was approached by Harrison Pawluk, a popular TikToker. He asked her to hold a bunch of flowers for him while he put on his coat. Before she had a chance to give them back he walked away, leaving her with the flowers and an expression of shock. Within a few weeks the video had accumulated nearly 60 million views.
Unlike Pawluk’s followers, the woman, named Maree, did not gush with emotion or enthusiasm. She simply felt violated, having not consented to being filmed, let alone to going viral (not to mention being handed the flowers under false pretences in the first place). She does not use social media herself; she asked if she was being filmed and was told she wasn’t. “These artificial things are not random acts of kindness,” she told ABC Radio Melbourne. “He interrupted my quiet time, filmed and uploaded a video without my consent.”
While it would be difficult not to sympathise with Maree on a personal level – Pawluk himself has since apologised for causing upset – filming other people in public places is not illegal. In a digital culture where everything can be turned into content, and where our interactions with other people are often one step removed from reality anyway, it is not much of a stretch to understand how a 22-year-old influencer might see no problem with sweeping a stranger up into the never-ending stream of self-narrative he performs online, particularly when it involves “kindness”.
Yet as the “virtual” and “real” worlds that we move between become increasingly indistinct, the boundaries of consent must be drawn. Maree said she felt “dehumanised” and treated like “clickbait” – and that’s because she was. Consenting to be in a public place does not mean consenting to be immortalised in it online, just as consenting to be on Twitter is not the same as consenting to reading every tweet that then crosses your screen. Unlike traditional media, where you have a general sense of a publication or news channel before you choose to buy or watch it, the internet’s scope means that we cannot possibly opt in to everything that we might come across. Of course, this is not a violation equivalent to other experiences in which consent is disregarded. It is not inherently traumatising or distressing. But it does represent a constant erosion of personal choice in our day-to-day experience of the world. Maree’s story is an example of what happens when the seemingly harmless logic of the content stream migrates into real life: a direct removal of agency resulting in significant personal distress.
People frequently decry the fact that social media is an “echo chamber”: we choose to follow certain people who share our views and general outlook on life, and never experience dissenting voices. If that was ever true, at least our chambers were of our own choosing. Increasingly, our infinite feeds bombard us with unasked-for information that interrupts – even replaces – our quiet time, just as Pawluk interrupted Maree’s. The damage was done as soon as he covertly turned on his camera, just as the damage is done on social media before we’ve had the chance to opt out.
In everyday life, we expect to have the fundamental freedom to converse with the people we like, engage in activities we enjoy and drink a coffee without being turned into a meme. We do this in the knowledge that, although we will inevitably at times be interrupted by inconvenience and hardship, we have broadly made our own choices. Why shouldn’t our online lives be the same?