‘It’s just a game’: Misogyny, violence and suicide and self-harm risks in online gaming narratives

Dr. Paula Bradbury
Principal Growth Subject Matter Expert at Resolver, a Kroll business

The success of the recent Netflix series ‘Adolescence’ has sparked critical conversations around the vulnerability of adolescents, especially males, to online toxicity fuelled by a manosphere of violent narratives aimed at women. Whilst the series has thrown this issue into the limelight, it is by no means an emerging phenomenon.

In light of this wider debate over the systemic nature of digital misogyny, this blog explores how toxic gaming environments, driven by anonymity, disinhibition, and harmful content, normalize violence against women and girls, both in-game and in real life. It also delves into the psychological impact of such narratives on adolescents across online social and gaming spaces, and explores the growing trust and safety challenges posed by immersive virtual technologies.

Violence against women and girls has always existed in every social sphere, and the online gaming environment is no different. A recent example is the backlash against the video game ‘No Mercy’, launched on a mainstream gaming platform, in which the male protagonist’s objective is to “become every woman’s worst nightmare” and “never take no for an answer.” The creators openly state that the game includes “unavoidable non-consensual sex” which, by any other name, is rape.

The existence of games that emphasise the domination and violation of women under taglines such as “make all women yours” shows how easily children and adults can access content that promotes harmful sexual ideologies and attitudes towards women.

Why are misogynistic narratives so prevalent in online gaming?

A growing body of research shows that repeated exposure to such misogynistic video games contributes to a normalisation of violence that, over time, increases desensitisation and the desire for ever more graphic content, impairs moral reasoning, and can increase offline aggression, especially amongst boys.

The promotion of misogynistic narratives in online gaming spaces is further enabled by a broader phenomenon of “online disinhibition”, which suggests that certain characteristics of the internet, such as anonymity, asynchronous communication, and a lack of real-world consequences, create fertile ground for misogyny and other toxic user behaviors to surface.

Factors such as dissociative anonymity and invisibility allow players to feel detached from their actions, leading to reduced empathy and accountability. Players come to view their actions as fantasy rather than reality, and therefore devoid of real-world harm.


Online games, especially competitive multiplayer formats, often prioritize dominance, control, and hierarchy—dynamics that can interact problematically with existing societal gender norms. When female players enter these male-dominated spaces, their presence can be perceived as threatening to the status quo.

The perceived intrusion of women into “male” spaces may trigger hostile reactions that are amplified by the psychological affordances of online disinhibition. For example, players may feel emboldened to harass or belittle women under the protective veil of pseudonyms and avatars, interpreting this behavior as inconsequential or even humorous.

In the context of misogyny, this means female players may be reduced to stereotypes or caricatures—such as “fake gamers” or sexualized objects—rather than being treated as real individuals. This process dehumanizes women and paves the way for further aggression, particularly when compounded by echo chambers within gaming subcultures that normalize or even valorize misogynistic discourse.

This dissociation between a player’s real-life persona and their online avatar enables the player to distance themselves from accountability for their in-game actions, often characterized by the assertion – “it wasn’t me, it was my avatar”.

How do avatars influence players’ gaming behaviors?

Avatars not only serve as a guise that increases anonymity; they have also proven to be a powerful influence on online behavior, as players adopt attitudes and behaviors that align with their avatar’s appearance.

Players who choose avatars with ethereal characteristics are more likely to display traits such as kindness and passivity, whereas those who choose avatars with combat or demonic features are more likely to behave aggressively or malevolently towards others.

This is often seen in popular sandbox and online role-playing games, where the adoption of characters, especially in games whose main objective is violence, amplifies the degree of violent, often misogynistic, dialogue between players, particularly once a female player is identified.

Violent gaming content is easily accessible even without a games console. When children are not playing online games, they can view violent gaming content shared on social media and video streaming platforms from any mobile device.


Whilst the majority of this content is developed and shared to guide others or to highlight particularly amusing encounters, a significant volume is produced and shared by female gamers to expose the continuous stream of misogynistic rhetoric voiced, clearly and often, by young males, who frequently act as a collective within a game to isolate and demean a solo female player.

What is the link between online gaming behaviors and suicide and self-harm?

Positive correlations have also been found between problematic online gaming behaviors and suicide and self-harm.

Whilst gaming can provide a positive outlet for many in terms of mental health, there are concerns amongst clinical professionals that gamers with pathological internet use (PIU) show emotional symptoms, conduct disorder, hyperactivity/inattention, self-injurious behaviors, and suicidal ideation and behaviors, especially amongst males.

The fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) now recognises “internet gaming disorder” as a condition in which high levels of gaming contribute directly to dysfunctional symptoms, and the World Health Organization’s ICD-11 includes “gaming disorder” as a diagnosable condition.


It has also been argued that, for adolescents, stress is a main risk factor for suicide, especially amongst those with poor emotional regulation or problematic gaming. Difficulties or deficiencies in emotional regulation, common in adolescence due to the immaturity of the pre-frontal cortex, have been found to increase maladaptive behaviors in those with internet gaming disorder.

How is technology changing risks to children in online gaming spaces?

At Resolver, we have seen evidence that misogyny is embedded in gaming culture, from casual slurs in voice chats to structural gender imbalances in game development and marketing. Viral images and jokes act as ideological vehicles, distorting feminism into a caricature of irrationality or misandry.

These memes do not exist in isolation but are part of what scholars call “networked misogyny”, a pattern of rhetoric and behavior that links disparate online communities, or manospheres, through shared hostility toward women. These networks can be found in all corners of the internet, not just in gaming, but the future of gaming is changing, and the landscapes in which violence towards women and girls thrives are becoming more immersive and more proximate.

Whilst ‘metaverses’ are often misconstrued as a recent technological development, they have existed in gaming for years. For example, World of Warcraft has been a popular metaverse game since it launched in 2004. However, the introduction of metaverse games that use virtual or augmented reality is changing the landscape of interactive experiences, and of risk.

Children can now wear headsets that transport them into 360-degree gaming experiences offering new levels of thrill and excitement, where they can explore archaeological sites in Peru or kayak in Antarctica, ushering in a new era of accessibility and inclusion. Whilst these advancements should be celebrated, it is essential to consider the new risks that such technology poses to children.

Cases of sexual violence against women and children are now coming to light, and whilst access to haptic suits remains limited due to price tags running into the tens of thousands, the future of online harms brings a new era of sexual violence and exploitation that will prove challenging for content moderation and law enforcement.


These are worlds where fear, hate and violence are amplified through an immersive sensory state in which feelings of proximity are heightened and a sense of reality is lost. It has never been more vital that the gaming industry listens to Trust and Safety experts.

How can Resolver help gaming platforms ensure safer online spaces?

As subject matter experts at Resolver, we know the many ways in which children are vulnerable to risks from inappropriate content, and how impressionable they are in their idolisation of popular content creators.

Children look to content creators, including gamers, to determine what types of behavior are expected of them and to identify behaviors that could help them, and their own content, be accepted and receive ‘likes’ or ‘follows’. Their awareness of harm, willingness to share personal information and desire to be accepted and liked are influenced by their level of cognitive, social and emotional development.

Manospheres cater to adolescent vulnerability, especially among young males who may feel isolated and rejected offline. Television series such as Adolescence raise important points of discussion around moderation, safeguarding and education, including:

  • How platforms can improve their moderation of misogynistic rhetoric that is audio-based rather than solely text-based
  • Whether maladaptive behaviors in young males arising from problematic gaming habits should be treated as an issue of national priority
  • A national debate over the need to update the national curriculum taught in schools to specifically address misogyny and the detrimental impact that exposure to harmful content has on the well-being of students.

Resolver works directly with online platforms to help mitigate such risks by identifying harmful content and harmful actors. Only by utilising the knowledge and skills of experts in online harms can platforms identify not only existing risks, but also emerging trends and the threats such content poses to the most vulnerable in society: children.

Subject matter experts at Resolver not only identify risks on platforms for their clients; they also provide in-depth explanations of how and why harmful content poses a risk to platform users and organisational reputations, helping to ensure that digital spaces are safe spaces that can be enjoyed by all.

Conclusion

The pervasive issue of misogyny and violence in online gaming is not a product of isolated incidents or niche communities—it is a systemic problem embedded within gaming cultures, platforms, and broader digital ecosystems. As games evolve in complexity and immersion, so too do the risks they pose, particularly to vulnerable adolescents navigating identity, acceptance, and social norms.

The narrative that “it’s only a game” dangerously dismisses the real-world impact of exposure to toxic content, particularly in shaping attitudes toward women and normalizing harmful behaviors. Resolver’s work underscores the urgent need for industry accountability, robust moderation, and meaningful collaboration with subject matter experts.

Safeguarding children from digital misogyny demands a coordinated effort involving game developers, platforms, educators, and policymakers. Only by addressing the structural and cultural foundations of online toxicity can we begin to create inclusive, safe, and respectful spaces for all players—regardless of gender.

To learn more about how Resolver’s comprehensive Trust and Safety Intelligence service can help protect the integrity of your platform and online communities, please reach out.
