In 2015, I started researching an obscure, underground network of online men’s rights groups collectively known as the manosphere. While this disparate ensemble of pickup artists, “incels” and “MGTOWs” (men going their own way) was a seething hotbed of male disaffection and anti-woman sentiment, it was also a collection of predominantly niche subcultures, confined to the margins of the digital world.
Almost 10 years later, many of the manosphere’s core ideas have spread into mainstream culture. Hitherto obscure terms such as “normie”, “incel”, “the red pill” and alpha, beta and sigma masculinity have become part of everyday online and schoolyard vernaculars. For those fortunate enough to be unfamiliar with this lexicon, “incels” are involuntary celibates or men who believe they are denied sex because of their looks.
To be “red pilled” is to be “enlightened” to the misandrist conspiracy that is contemporary liberal society. “Normies”, by contrast, are deluded or blue-pilled. Alpha, beta and sigma males are archetypes in the male socio-sexual hierarchy who enjoy different levels of power and access to women due to their looks and innate personality traits. The amplification of these ideas is partly due to the appeal of memes and the subcultural capital they confer on their creators and recyclers. It has also been helped along by popular platforms such as Urban Dictionary, 10 per cent of whose content we found to be misogynistic in a 2020 study.
The main drivers of this phenomenon, however, are social media platforms' algorithmic targeting of specific demographic profiles through recommender systems and the rise of influencer culture. The growth of influencer culture on TikTok, in particular, has platformed a significant number of highly influential ideological entrepreneurs such as Andrew Tate, Myron Gaines and Adrian Markovac. These figures strategically monetise male insecurity by offering spurious advice on mental health, moneymaking and fitness, but are primarily driven by male supremacist and other anti-progressive agendas.
In 2023, a survey by the UK organisation Hope Not Hate found that eight in 10 boys aged 16 to 17 had seen material from Tate, and that 45 per cent of young men aged 16 to 24 had a positive opinion of him. Many teachers have also noticed a climate of increasing misogyny and antifeminism in their classrooms, and some have reached out to NGOs and academics for advice on how to tackle the “Tate effect”.
Most boys and men don’t go looking for this material, but once the algorithm figures out that they are male and interested in cars, making money or improving their physical or mental health, it rapidly starts recommending manfluencer content. As one boy aged 16-17 commented in a 2023 study by Internet Matters: “It’s really easy to go down that path, if you like one video, suddenly ... it’s all you get after a while if you’re not careful”.
However, most social media companies do not disclose how their algorithms work, with the result that they effectively operate as black boxes. In response to this lack of algorithmic transparency, my colleagues in the DCU Anti-Bullying Centre and I recently conducted an experimental study that tracked, recorded and coded the content recommended by YouTube Shorts and TikTok to 10 experimental or “sockpuppet” accounts on 10 blank smartphones. We found that all of the male-identified accounts were fed large volumes of masculinist, antifeminist and other extremist content, irrespective of whether they sought out general or male supremacist-related content. All of them received this content within the first 23 minutes of the experiment at most, and in one case within two minutes.
Once an account showed interest by watching this material, the amount recommended rapidly increased. By the last round of the experiment, the vast majority of content being recommended to the phones was toxic (76 per cent on TikTok and 78 per cent on YouTube Shorts), falling primarily into the alpha male and antifeminist categories.
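For readers curious about what lies behind figures like those, here is a minimal, purely illustrative sketch in Python of the kind of tally involved: each recommended video is manually coded with a category, and the share falling into toxic categories is then computed per platform. The data, category labels and function names below are hypothetical; this is not the study's actual coding pipeline.

```python
# Illustrative sketch only (hypothetical data and labels), showing how manually
# coded recommendations could be tallied into a per-platform "toxic" percentage.

# Categories treated as toxic in this example (assumed, for illustration).
TOXIC_CATEGORIES = {"alpha male", "antifeminist", "far-right", "anti-trans", "conspiracy"}

# Each entry: (platform, coded category) for one recommended video.
coded_recommendations = [
    ("TikTok", "alpha male"),
    ("TikTok", "antifeminist"),
    ("TikTok", "fitness"),
    ("YouTube Shorts", "alpha male"),
    ("YouTube Shorts", "moneymaking"),
    ("YouTube Shorts", "antifeminist"),
]

def toxic_share(recommendations, platform):
    """Return the percentage of a platform's coded recommendations that fall
    into any of the toxic categories."""
    platform_items = [cat for plat, cat in recommendations if plat == platform]
    if not platform_items:
        return 0.0
    toxic = sum(1 for cat in platform_items if cat in TOXIC_CATEGORIES)
    return 100 * toxic / len(platform_items)

for platform in ("TikTok", "YouTube Shorts"):
    print(f"{platform}: {toxic_share(coded_recommendations, platform):.0f}% toxic")
```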
Much of this content rails against equality and promotes the submission of women, but we also identified far-right, anti-trans and conspiracy material, as well as a large amount of content devoted to male motivation, moneymaking and mental health. This material is especially concerning as it frequently claims that depression is weakness, therapy is ineffective and the solution to men’s problems is a revival of military-style stoicism, a phenomenon referred to by podcaster Aleks Hammo as the “stoic industrial complex”.
Manfluencers purport to care about men while reinforcing precisely the kinds of restrictive and outmoded norms that harm them and others.
The findings of our report point to urgent and concerning issues for parents, teachers, policymakers and society as a whole. The overwhelming presence of Tate in our data set, at a time when he was de-platformed, indicates that social media companies must tackle harmful content in more sophisticated ways. We recommend that recommender algorithms be turned off by default, and that critical digital literacy skills be taught in schools to equip young people with a better understanding of how influencer culture and algorithms work.
Girls and women are undeniably the most severely impacted by these beliefs, but they are also damaging to the boys and men who consume them. The social media companies must come under increased pressure to prioritise the safety and wellbeing of young people over profit.
Debbie Ging is Professor of Digital Media and Gender at Dublin City University. The full report, Recommending Toxicity: the role of algorithmic recommender functions on YouTube Shorts and TikTok in promoting male supremacist influencers, by Catherine Baker, Debbie Ging and Maja Brandt Andreasen, is available at antibullyingcentre.ie/