Members of the public attend a vigil for the victims of the Keyham mass shooting at North Down Crescent Park in Keyham, Plymouth, August 13, 2021. Credit: PA Images / Alamy Stock Photo

Using an ecosystems approach to study the incelosphere: Prof. Debbie Ging collaborates with University of Exeter colleagues to move incel research forward

 

In August 2021, Jake Davison’s attack in Plymouth left five people dead. In 2020, two UK teenagers who had also engaged to some degree with the online incel community faced trial for possessing terrorism-related materials.

These and other cases, such as Alek Minassian’s attack in Toronto in 2018, have sparked public debate and academic discussion over whether incel ideology should be considered extremist and, furthermore, whether those motivated by its ideas to carry out acts of violence should be considered terrorists.

The icon on this article reflects the research's contribution to UN Sustainable Development Goal 5: Gender Equality. The Sustainable Development Goals are 17 objectives designed by the United Nations to serve as a shared blueprint for peace and prosperity for people and the planet.

What does the term incel mean?

Incel stands for involuntary celibate, and members of this community believe that they are victims of both genetics and feminism. A significant challenge in researching incels through the lens of extremism is that the incel subculture and worldview are amorphous in nature. Many incels do not support or engage in violence, which makes it hard for researchers and practitioners to frame the incelosphere in the traditional language of terrorism and counter-terrorism. While some research has argued that the subculture exhibits all of the characteristics of an extremist ideology, structured by an opposition between an in-group and harmful out-group(s), other work points to the heterogeneity of the subculture and focuses more on the socio-economic and psychological conditions that predispose men to becoming involved in male supremacist ideologies.

There is general agreement, however, that incel online discussions contain high levels of misogynistic, racist and transphobic language. Incels view society as a ‘sexual marketplace’, and draw on concepts from evolutionary psychology to categorise people into hierarchical groups based on their looks. In this schema of heterosexual mate selection, women are posited as cruel gold diggers who seek out ‘alpha’ males for sex and procreation but, due to scarcity, often settle for beta males to pay their bills. Incels position themselves at the bottom of this hierarchy, and are further categorised into those who still aspire to ‘ascend’ (to beta status) and those who have given up hope entirely (i.e. taken the “Black Pill”). 

Moving incel research forward

Research on incel online spaces has proliferated over the past few years, especially in the wake of Minassian’s attack in April 2018, with efforts mostly concentrated on deciphering the jargon-heavy lexicon and underlying worldview. Building on my own contextualisation of incel communities as a particularly problematic corner of the broader ideational landscape of the “manosphere” (Ging, 2019), our team on the Con.Cel project used computational text analysis methods to analyse the toxicity of the incel worldview.

Working with Dr. Lewys Brace and Dr. Stephane Baele from the University of Exeter, we sought to gain a diachronic view of the community’s evolution, as opposed to the snapshot analyses that dominate most of the research on this topic. We used an ‘ecosystem’ approach to studying extremist online spaces, scraping a large dataset from subreddits, chans, forums, Telegram channels, and Instagram accounts. These online spaces are best understood together as a dynamic network of interacting units, with incels operating across them in complex ways in what we collectively refer to as the ‘incelosphere’.

A series of custom-built Python web scrapers was used to collect all text data and accompanying metadata from these sites (33 different online spaces covering the period 2013–2022, yielding a dataset of 11,717,516 posts). This approach allowed us to evaluate how extreme the content and discussions on these platforms have been over time. To measure the proportion of violent language in the collected content, we used a custom ‘dictionary’ or lexicon, the ‘Incel Violent Extremism Dictionary’ (IVED). This lexicon contained three types of words found in the collected data: dehumanising out-group nouns, violent verbs, and nouns related to weapons. We used this dictionary to carry out two types of analyses, tracking the evolution of this type of language across time and platforms.
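The IVED itself and the project’s analysis code are not reproduced here, but the following sketch illustrates, under stated assumptions, how a lexicon-based measure of violent language can be computed over a timestamped corpus. The handful of sample terms, the post schema (platform, timestamp, text), and the function names are hypothetical placeholders, not the project’s actual dictionary or pipeline.

```python
import re
from collections import defaultdict
from datetime import datetime

# Illustrative placeholder terms only: the real IVED is far larger and was
# built from the collected data (dehumanising out-group nouns, violent
# verbs, and weapon-related nouns).
IVED_SAMPLE = {
    "out_group_nouns": {"normies"},
    "violent_verbs": {"attack", "destroy"},
    "weapon_nouns": {"gun", "knife"},
}
ALL_TERMS = set().union(*IVED_SAMPLE.values())

TOKEN_RE = re.compile(r"[a-z']+")


def violent_term_share(text: str) -> float:
    """Return the proportion of tokens in a post that match the lexicon."""
    tokens = TOKEN_RE.findall(text.lower())
    if not tokens:
        return 0.0
    return sum(1 for tok in tokens if tok in ALL_TERMS) / len(tokens)


def monthly_platform_scores(posts):
    """Average lexicon share per (platform, year-month).

    `posts` is an iterable of dicts with 'platform', 'timestamp' (ISO 8601)
    and 'text' keys -- a hypothetical schema for the scraped metadata.
    """
    totals = defaultdict(lambda: [0.0, 0])
    for post in posts:
        month = datetime.fromisoformat(post["timestamp"]).strftime("%Y-%m")
        key = (post["platform"], month)
        totals[key][0] += violent_term_share(post["text"])
        totals[key][1] += 1
    return {key: total / count for key, (total, count) in totals.items()}
```

Aggregating scores by platform and month in this way is what makes it possible to compare how violent language evolves across both time and online spaces.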

Key findings

Our analysis indicates a steady overall increase in violent extremist language in the main lineage of incel online spaces across six years. Our findings (published here) also demonstrate the heterogeneity of the incelosphere, which is constituted by a range of platforms whose respective toxicity varies widely. We also conducted an in-depth case study tracing the emergence of MUU (‘mixed, unclear, and unstable’) ideological constructs from the incelosphere, tracking how the subculture uses outlinking to steer traffic between different online ecosystems (e.g. incel and far-right) in unexpected ways. While most of the literature on online extremism focuses on echo-chamber dynamics and the centripetal forces of what we call endolinks, our findings (published here) break new ground by emphasising the simultaneous importance of the centrifugal ideological forces exerted by multiple exolinks. The final paper in this series (forthcoming) is a large-scale analysis of the visual culture of the incelosphere, which categorises the dominant visual styles and ideological themes in this subculture’s use of memes, symbols and avatars.
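The outlinking analysis described above relies on distinguishing links that stay within the studied ecosystem from links that leave it. The sketch below shows one simple way such a distinction can be made; the domain list is a hypothetical placeholder, and the published papers describe the project’s actual method.

```python
from urllib.parse import urlparse

# Hypothetical domain list for illustration; the real analysis drew on the
# curated set of 33 incelosphere spaces and known external ecosystems.
INCELOSPHERE_DOMAINS = {"example-incel-forum.net", "example-chan.org"}


def classify_link(url: str) -> str:
    """Label a URL as an endolink (inside the incelosphere) or an exolink."""
    domain = urlparse(url).netloc.lower()
    if domain.startswith("www."):
        domain = domain[len("www."):]
    return "endolink" if domain in INCELOSPHERE_DOMAINS else "exolink"
```

Counting endolinks versus exolinks per platform over time then gives a view of the centripetal and centrifugal forces discussed in these findings.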

The Con.Cel project was carried out with the CREST centre, which is funded by the UK Home Office to produce research that helps the UK government and its intelligence and security agencies assess security threats. The project has been an important part of my ongoing interdisciplinary work tracking how male supremacist and other extreme ideologies are spread and amplified online. The sample of known incel attackers is, fortunately, still quite small. However, analysis of their socio-economic and psychological contexts, combined with the kind of data-driven work summarised here, points to the need for a nuanced understanding of the subject. In particular, the relationship between online discussions and offline factors as motivators towards violence requires further examination.

 

Author(s)

Prof. Debbie Ging, Dublin City University