The success of the recent Netflix series Adolescence has sparked critical conversations about the vulnerability of adolescents, especially males, to online toxicity fueled by a manosphere of gender-hostile narratives aimed at women. While the series has thrown this issue into the limelight, it is by no means an emerging phenomenon.
This blog examines how toxic gaming environments, driven by anonymity, disinhibition, and harmful content, normalize violence against women and girls in-game and offline. It also reviews the psychological impact of these narratives on adolescents and the growing Trust and Safety challenges created by immersive virtual technologies.
Violence against women and girls has always existed in every social sphere and the online gaming environment is no different. A recent example is the backlash against the game “No Mercy,” launched on a mainstream platform, where the male protagonist’s objective is to “become every woman’s worst nightmare” and “never take no for an answer.” The creators of the game openly state that this game includes “unavoidable non-consensual sex” which constitutes rape under widely accepted legal definitions.
When games that emphasize the domination and violation of women are promoted under headers such as “make all women yours”, it reveals how easily both children and adults can access games that foster harmful sexual ideologies and attitudes towards women.
Why are misogynistic narratives so prevalent in online gaming?
A growing body of research shows that repeated exposure to misogynistic video games normalizes violence. Over time, this increases desensitization, drives demand for more graphic content, harms moral reasoning, and can increase offline aggression, especially among boys.
The promotion of misogynistic narratives in online gaming spaces is further enabled by the broader phenomenon of “online disinhibition” — the theory that certain characteristics of the internet, such as anonymity, asynchronous communication, and a lack of real-world consequences, create fertile ground for misogyny and other toxic user behaviors to surface.
Factors such as dissociative anonymity and invisibility allow players to feel detached from their actions, leading to reduced empathy and accountability. Players come to view their actions as fantasy rather than reality, devoid of real-world harm.

Online games, especially competitive multiplayer formats, often prioritize dominance, control, and hierarchy — dynamics that can interact problematically with existing societal gender norms. When female players enter these male-dominated spaces, their presence can be perceived as threatening to the status quo.
The perceived intrusion of women into “male” spaces may trigger hostile reactions that are amplified by the psychological affordances of online disinhibition. For example, players may feel emboldened to harass or belittle women under the protective veil of pseudonyms and avatars, interpreting this behavior as inconsequential or even humorous.
In the context of misogyny, this means female players may be reduced to stereotypes or caricatures — such as “fake gamers” or sexualized objects — rather than being treated as real individuals. This process dehumanizes women and paves the way for further aggression, particularly when compounded by echo chambers within gaming subcultures that normalize or even valorize misogynistic discourse.
This disconnect between a player’s real-life persona and online avatar allows the player to distance themselves from accountability for their in-game actions, often captured in the assertion: “it wasn’t me, it was my avatar”.
How do Avatars influence players’ gaming behaviors?
Avatars not only serve as a guise that increases anonymity; they have also proven to be a powerful influence over online behavior, as players adopt attitudes and behaviors that align with their avatar’s appearance.
Players who choose avatars with ethereal characteristics are more likely to display traits such as kindness and passivity, whereas players who choose avatars with combat-oriented or demonic features are more likely to behave aggressively or malevolently towards others.
This pattern often appears in sandbox and online role-playing games, where violent objectives amplify aggressive or misogynistic dialogue between players — especially when other players recognize a participant as female.
Violent gaming content is easily accessible, even without a dedicated games console. When they are not playing online games, children can view violent gaming content shared on social media and video streaming platforms via any mobile device.

While much of this content is shared to guide or entertain, female gamers also document the abuse they face, posting videos that show persistent misogynistic rhetoric, often voiced by young male players. Many of these clips capture moments where groups of male players act together to isolate or demean a lone female participant.
What is the link between online gaming behaviors and suicide and self-harm?
Positive correlations have been found between problematic online gaming behaviors and suicide and self-harm.
While gaming can provide a positive mental-health outlet for many, clinical professionals are concerned that gamers with pathological internet use (PIU) show emotional symptoms, conduct disorder, hyperactivity/inattention, self-injurious behaviors, and suicidal ideation and behaviors, especially among males.
The fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) now lists Internet Gaming Disorder as a condition warranting further study, reflecting growing clinical concern about the link between excessive gaming and dysfunctional symptoms.

It has also been argued that for adolescents, stress is a main risk factor for suicide, especially among those with poor emotional regulation or problematic gaming. Difficulties with emotional regulation, common in adolescence due to the immaturity of the prefrontal cortex, have been found to increase maladaptive behaviors among those with internet gaming disorder.
How is technology changing risks to children in online gaming spaces?
Resolver analysts observe persistent misogyny in gaming culture, from casual slurs in voice chats to structural gender imbalances in development and marketing. Viral images and jokes often become ideological vehicles that distort feminism into a caricature of irrationality or misandry.
These memes do not exist in isolation. They form part of what scholars call “networked misogyny” — a pattern of rhetoric and behavior that connects disparate online communities through shared hostility toward women. These networks operate across the wider internet, not only in gaming spaces. As gaming environments become more immersive and interactive, they create new contexts where violence toward women and girls can escalate and feel more proximate.
Metaverse-style environments are not new in gaming; they have existed for years. For example, “World of Warcraft” has been a popular metaverse-style game since it launched in 2004. However, the introduction of metaverse games that use virtual or augmented reality is changing the landscape of interactive experiences — and of risk.
Children can now use headsets that transport them into 360-degree gaming experiences, letting them explore archaeological sites in Peru or kayak in Antarctica. This provides a new era of increased accessibility and inclusion. While these advancements should be celebrated, it is essential to consider the emergence of new risks that such technology poses to children.
Documented cases of sexual harassment and simulated sexual violence in immersive environments are now emerging in research, news, and user reports. And while haptic suits remain scarce, with price tags running into the tens of thousands, the future of online harms brings a new era of sexual violence and exploitation that will prove challenging for content moderation and law enforcement.

These worlds amplify fear, hate, and violence through immersive sensory environments that blur proximity and distort a player’s sense of reality. It is increasingly important for gaming companies to incorporate Trust and Safety expertise into product and moderation decisions.
How can Resolver help gaming platforms ensure safer online spaces?
As subject matter experts at Resolver, we know the many ways in which children are vulnerable to inappropriate content, and how impressionable they are in their idolization of popular content creators.
Children look to content creators, including gamers, to determine what behavior is expected of them, and which behaviors will earn them — and their own content — acceptance, likes, and follows. Their awareness of harm, willingness to share personal information, and desire to be accepted and liked are all shaped by their level of cognitive, social, and emotional development.
Manosphere communities often appeal to adolescents who feel isolated or rejected offline. Television series such as Adolescence raise important points of discussion around moderation, safeguarding and education. These include:
- How platforms can improve their moderation of misogynistic rhetoric that is audio-based rather than solely text-based
- Whether maladaptive behaviors in young males created by problematic gaming habits should be treated as an issue of national priority
- Whether the national curriculum taught in schools should be updated to specifically address misogyny and the detrimental impact that exposure to harmful content has on students’ well-being
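The first of these points — moderating speech rather than text — can be made concrete with a purely illustrative sketch. Assuming an upstream speech-to-text step has already transcribed the voice chat (that step is not shown, and this is not a description of Resolver’s or any platform’s actual tooling), a minimal phrase-flagging pass over the transcript might look like the following; the phrase list here is a tiny hypothetical placeholder, not a real moderation lexicon:

```python
# Illustrative sketch only: flag known hostile phrases in a voice-chat
# transcript produced by an assumed upstream speech-to-text step.
# BLOCKLIST is a hypothetical placeholder, not a production lexicon.

BLOCKLIST = {"fake gamer", "get back to the kitchen"}  # hypothetical terms

def flag_transcript(transcript: str) -> list[str]:
    """Return the blocklisted phrases found in a transcript, sorted."""
    text = transcript.lower()  # case-insensitive matching
    return sorted(term for term in BLOCKLIST if term in text)
```

In practice, audio moderation is far harder than this sketch suggests — transcription errors, coded language, and tone all matter — which is precisely why the question of how platforms should moderate voice chat remains open.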
Resolver works directly with online platforms to help mitigate such risks by identifying harmful content and harmful actors. Platforms benefit most when they draw on specialist expertise in online harms to identify not only existing risks but also emerging trends, and the threats such content poses to the most vulnerable in society: children.
Subject matter experts at Resolver not only identify risks on platforms for their clients; they also explain in depth how and why harmful content poses a risk to platform users and organizational reputations, helping to ensure that digital spaces are safe spaces that can be enjoyed by all.
The path forward for safer gaming spaces
The pervasive issues of misogyny and violence in online gaming are not isolated incidents or niche communities. They are systemic problems embedded within gaming cultures, platforms, and broader digital ecosystems. As games evolve in complexity and immersion, so too do the risks they pose, particularly to vulnerable adolescents navigating identity, acceptance, and social norms.
The narrative that “it’s only a game” overlooks the real-world impact of exposure to toxic content, particularly in shaping attitudes toward women and normalizing harmful behaviors. Resolver’s work underscores the urgent need for industry accountability, robust moderation, and meaningful collaboration with subject matter experts.
Safeguarding children from digital misogyny demands a coordinated effort involving game developers, platforms, educators, and policymakers. Only by addressing the structural and cultural foundations of online toxicity can we begin to create inclusive, safe, and respectful spaces for all players — regardless of gender.
To learn more about how Resolver’s comprehensive Trust and Safety Intelligence service can help protect the integrity of your platform and online communities, please reach out.