Online Harassment Hurts Just as Much: Unpacking Gendered Digital Harm in the Age of AI
It’s not unusual for young people in a workshop to teach me something new. In fact, it’s often one of my favourite things. It could be a change in language, an experience I wasn’t aware of, or that my crusty Air Force 1s really need replacing. Recently, though, one conversation really stood out. I was running a workshop and we were talking through different examples of public sexual harassment when one student mentioned that harassment happens way more on Snapchat, but nobody takes it as seriously as someone saying it to your face. And of course, we made the space to talk about that. We spoke about how that might make someone feel, why that might be and what accountability can look like.
These questions are not dissimilar to the kinds of questions I would have explored with a young person about any kind of sexual harassment or harm. But it did leave me questioning: why does it feel so hard, and so vast, to create solutions to digital harm? Why do we feel so powerless against it? So I am here to tell you some things I've been thinking about. In this seven-part blog series over the 16 Days of Activism to Eradicate Gender-Based Violence, I will be sharing some reflections on some incredible books and resources, and at the end of each blog I’ll give you one tip and one resource that might help you hold some of those more challenging conversations with the young people in your life. Whether you're a teacher, youth worker, parent or the ‘cool aunt / uncle / pibling’, I hope there is something in here for you.
What even is digital harm?
Simply put, it’s any form of harm that happens on online platforms; this could be anything from online grooming to intimate image abuse. A particular change in the way we think about digital harm comes from the advancements in Artificial Intelligence (AI). But what even is AI? There are three types of AI, according to IBM.
Artificial Narrow AI: This is the only kind of AI that exists today. Essentially, it’s a technology that can ‘be trained to perform a single or narrow task’ (IBM)
General AI: This is a theoretical concept. It can take, ‘previous learnings and skills to accomplish new tasks in a different context without the need for human beings to train the underlying models. This ability allows AGI to learn and perform any intellectual task that a human being can.’ (IBM)
Super AI: Another theoretical concept, the idea that AI could ‘think, reason, learn, make judgements and possess cognitive abilities that surpass those of human beings.’ (IBM)
The most common type of artificial narrow AI we might come across is Generative Artificial Intelligence (Gen AI), which includes platforms like ChatGPT, Gemini and others. These are often large language models (LLMs), which draw on a huge amount of publicly available data, as well as the data you input, to give you answers. These LLMs can now produce images, video and sound clips as well as text answers. The use of Gen AI / LLMs has been widely accepted, but there is sadly misuse of this technology that is harming women, girls and those of marginalised genders in new ways.
‘It’s not that deep’
When diving deeper into understanding digital harm, one thing came through again and again, as it did in the conversation with the student I mentioned above. Online abuse is often deemed ‘not as serious’ as offline abuse. The general feeling of ‘it’s not that deep’ recurs among those criticising the evidence base that shows the scale of this harm. This made me think of two things: (1) Adele Zeynep Walton and (2) Our Streets Now’s work on public sexual harassment.
I wanted to use this as an opportunity to tell you about Adele Zeynep Walton’s story. Walton recently wrote a book, ‘Logging Off: The Human Cost of Our Digital World’, which you should definitely read or listen to. In this book she talks about her, and her family's, experience of grief after her sister, Aimee Walton, took her own life. After Aimee died, it became apparent she had been part of one of the only incel forums that women and girls were allowed to join, where men and boys would instruct people on how to kill themselves. I had the pleasure of hearing Adele Zeynep Walton talk alongside Jess Davies as part of their book tours. Adele spoke about how she doesn’t delineate between offline and online worlds, as the two are now so merged that people can’t separate them. People meet their friends and partners online, socialise online, share their stories online, work online and spend a whole lot of their time communicating online. I had always imagined them as distinctly different parts of my life, my online and offline life. But, at least to me, Walton was right. My life really weaves between the two in a way that means I can’t separate them at all. And for Walton and her family, the separation between the two is impossible because of the human cost and consequence of online activity. So, why is it so common for people to say that harm offline holds more weight than harm online?
The second thing this made me think of was how we (at Our Streets Now) have been working to make sure that normalised forms of gender-based violence, like public sexual harassment, are taken seriously. That everyday harassment holds weight because of its relentlessness, the way we talk it down and the way it’s seen as normal. All the arguments for not taking digital harm seriously feel so similar to the myths we've been busting for years; it’s just that rather than happening on your walk home, it’s coming to you through your phone.
With that in mind, I want to say before you continue reading that so much of the research I have done about digital harm has made me feel sick to my stomach. So, please consider this a content warning for the rest of this blog. Some of this information might make you feel uncomfortable, and that’s okay, because it is uncomfortable when we’re confronted by the reality and scale of harm. If you have experienced any of this kind of abuse, I am really sorry that happened to you, and the one thing that has continued to ring in my ears while writing and researching is the famous quote from Gisèle Pelicot: ‘shame must change sides’. All of this to say: it is that deep, and digital harm, like public sexual harassment, although normalised, is not normal.
Understanding Different Types of Digital Harm
‘Globally, 38% of women have had a personal experience of online violence, while 85% of women who spend their time online have witnessed digital violence against another woman.’ (The Economist, 2021)
There is a vastness to digital harm that is impossible to capture in this blog series, so I've had to focus on a few key areas. I've omitted some of the literature on sex robots, cyberbrothels and AI chatbots, but if you are interested in knowing more I recommend having a look at Laura Bates’s book ‘The New Age of Sexism’ and Jess Davies’s book ‘No One Wants to See Your D*ck’, which dive into these topics in way more detail. For now, we’ll focus on five areas: Intimate Image Abuse; Deepfakes; Online Grooming; Masculinity, Influencers & Incels; and Money. I hope you come back to read more.
A Tool & A Resource
Tool: This is a conversation, not an argument. You and the young person you're talking to might have wildly different experiences of the online world, and you might have different perceptions of whether you can untangle the online and offline. Come into these conversations to learn as well as to impart knowledge, and remember to listen as much as, or more than, you talk. And the big thing: you can’t win a conversation.
Resource: The Mix offers free, confidential support for young people under 25. You can also look at a wide range of support services on our website here.