Under the spotlight: Lionesses face online abuse at 2023 FIFA Women's World Cup

Resolver's analysis of abuse directed at England Women's Football Team players sheds light on the need for proactive measures to curb online abuse.

Resolver
September 11, 2023

Resolver analyzed over 84,000 posts directed at the verified social media accounts of four first-team players from the England Women’s National Football Team (Millie Bright, Jess Carter, Lauren James, and Niamh Charles) between July 20 and August 20, 2023. Our investigation revealed a steady stream of harmful content targeting the players throughout the World Cup, highlighting the continuing risks that the spread of online abuse poses to player well-being and mental health.

Further scrutiny of this dataset shows that spikes in abuse and harassment towards players directly correlated with on-pitch events, with the highest frequency of abuse seen on the Lionesses’ match days over the month. August 7-8 saw the highest level of vitriol directed towards the four athletes, with over 1,100 abusive messages observed in just two days following England’s fixture against Nigeria in the tournament’s knockout stages.

Personal abuse, reputation-damaging accusations, and hate speech were the predominant categories of harmful content directed towards the athletes, with our automated systems registering phrases such as ‘racist’, ‘shame’, ‘kill’, ‘stupid’, and ‘ugly’ hundreds of times in posts directed towards the players over the World Cup.

Beyond the beautiful game

In late 2022, days ahead of the men’s FIFA World Cup in Qatar, FIFA and FIFPRO launched a Social Media Protection Service (SMPS) to combat the torrent of hate and abuse directed towards participating players and safeguard their mental health from such harmful content.

A subsequent report published by the two organizations following the tournament’s conclusion revealed that the monitoring system scanned over 20 million messages across several mainstream social networks, with 19,636 posts eventually flagged and reported to the relevant platforms as abusive, discriminatory, or threatening to the players.

A new Resolver analysis into the proliferation of harmful content targeting athletes at major sporting events suggests that online abuse remains a persistent and pervasive challenge for athletes, professional clubs, and major corporate sponsors associated with the world’s most popular sport.

Under the spotlight

Resolver analyzed 84,536 individual pieces of content sent to the verified social media profiles of the four first-team players on Twitter and Instagram over the month of the World Cup. This examination revealed 1,845 pieces of harmful content targeting the accounts between July 20 and August 20, 2023, an average of over 59 abusive messages per day across the network.
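
As a rough illustration of how these headline figures relate, the sketch below reproduces the daily-average and prevalence arithmetic from the totals above; the 31-day window length is our assumption about how the per-day average was derived, not a published methodological detail.

```python
# Illustrative arithmetic only; the totals are from the analysis above,
# and the 31-day window length is an assumption.
TOTAL_POSTS = 84_536      # posts analyzed, July 20 - August 20, 2023
HARMFUL_POSTS = 1_845     # posts classified as harmful
WINDOW_DAYS = 31          # assumed number of days in the window

daily_average = HARMFUL_POSTS / WINDOW_DAYS          # ~59.5 per day
harmful_share = HARMFUL_POSTS / TOTAL_POSTS * 100    # ~2.2% of posts

print(f"Average abusive messages per day: {daily_average:.1f}")
print(f"Share of analyzed posts that were harmful: {harmful_share:.1f}%")
```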

In particular, the highest frequency of harmful content was observed between August 7 and 9, coinciding with England’s match against Nigeria in the tournament’s knockout stages. A red card and two-match ban for forward Lauren James in the 87th minute of a thrilling game drew the ire of online commentators, leading to a cascade of abuse directed towards the player online.

Spikes in harmful content over time

Graph showing the risk classifications for harmful content on the verified accounts of Millie Bright, Jess Carter, Niamh Charles, and Lauren James between July 20 and August 20, 2023. (Source: Resolver)

Troublingly, this abuse did not subside despite the player apologizing via her official account on August 8. Personal attacks against the player peaked in the minutes following the red card, and around 59% of the harmful content observed across the network occurred in the forty-eight hours following this on-pitch incident.
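
The windowing behind that 59% figure can be expressed compactly: take the timestamps of harmful posts and compute the share falling within 48 hours of the incident. The sketch below is a minimal illustration under assumed inputs (the incident time and the sample timestamps are hypothetical), not Resolver’s actual pipeline.

```python
# Minimal sketch of the post-incident windowing logic described above.
from datetime import datetime, timedelta

def share_within_window(posts: list[datetime], incident: datetime,
                        hours: int = 48) -> float:
    """Return the fraction of posts occurring within `hours` of `incident`."""
    window_end = incident + timedelta(hours=hours)
    in_window = [t for t in posts if incident <= t <= window_end]
    return len(in_window) / len(posts) if posts else 0.0

# Hypothetical usage: assume the red card was shown on the evening of August 7.
incident_time = datetime(2023, 8, 7, 18, 45)
timestamps = [incident_time + timedelta(minutes=m) for m in (5, 90, 2000, 4000)]
print(f"{share_within_window(timestamps, incident_time):.0%} within 48 hours")
```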

Timeline of hateful content following incident

Graph showing an hourly distribution of harmful content across the network of accounts between August 7 and 9, 2023. The spike in abuse begins in the direct aftermath of the red card and continues until the evening of August 9. (Source: Resolver)

Analysis of harmful content targeting England footballers during the 2023 FIFA Women's World Cup

Table showing total posts, harmful posts, and the average share of harmful content across the network of four accounts.

The results of our investigation are consistent with findings from prior studies into the proliferation of online abuse targeting athletes, which suggest that on-pitch events act as catalysts for spikes in insults and online threats. To further verify this hypothesis, these results were compared with a similar analysis of harmful content in the month preceding the tournament, revealing a 29x increase in online abuse targeting the accounts during the World Cup.
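
The magnitude of such a spike reduces to a simple multiplier between two equal-length windows. The sketch below shows that comparison; the pre-tournament count is a hypothetical placeholder chosen to be consistent with the reported 29x increase, since the piece does not publish that baseline figure.

```python
# Hypothetical baseline comparison: the pre-tournament count is a placeholder
# consistent with the reported 29x increase, not a published figure.
PRE_TOURNAMENT_HARMFUL = 64     # assumed count, month before the World Cup
TOURNAMENT_HARMFUL = 1_845      # reported count, July 20 - August 20, 2023

multiplier = TOURNAMENT_HARMFUL / PRE_TOURNAMENT_HARMFUL
print(f"Increase in harmful content: {multiplier:.0f}x")  # ~29x
```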

From goals to hate

Examining the harmful content targeting the network of accounts also revealed that personal abuse, reputation-damaging accusations, and hate speech were the predominant categories of harmful content directed at the players. Phrases such as ‘shame’ (209), ‘racist’ (142), ‘stupid’ (92), and ‘disgusting’ (88) were employed hundreds of times in messages directed at the players’ accounts over the course of the competition.
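
A simplified version of this kind of key-term tallying can be done with a counter over tokenized posts. The sketch below is illustrative only: the term list and the lowercased word-level matching rule are simplifying assumptions, and production moderation systems rely on far richer signals than keyword counts.

```python
# Illustrative key-term frequency count over a corpus of posts.
import re
from collections import Counter

RISK_TERMS = {"shame", "racist", "stupid", "disgusting"}

def count_risk_terms(posts: list[str]) -> Counter:
    """Tally how often each risk term appears as a word across posts."""
    counts: Counter = Counter()
    for post in posts:
        tokens = re.findall(r"[a-z']+", post.lower())
        counts.update(t for t in tokens if t in RISK_TERMS)
    return counts

# Hypothetical sample posts for demonstration.
posts = ["What a racist decision", "Shame on you, just shame", "So stupid"]
print(count_risk_terms(posts))  # Counter({'shame': 2, 'racist': 1, 'stupid': 1})
```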

Top five risks faced by players

Examples of abusive messages across multiple social media platforms

Examples of online abuse and harassment targeting Lauren James following her red card and two-match ban against Nigeria.

Our analysis of the amplification of abuse over the tournament revealed 1,845 instances of harmful content directed towards the network of accounts in just one month, with this abuse representing around 2% of all posts analyzed by Resolver over the same period.

Proportion of risk key terms

Graph showing the frequency of key risk terms in harmful content targeting the four footballers between July 20 and August 20, 2023.

The consistent and corrosive nature of online abuse can harm athletes’ wellbeing, alongside their ability to perform at the highest level. According to research by FIFA and FIFPRO, being a victim of online abuse can lead to a range of off-platform consequences and symptoms for sportspeople, including anxiety attacks, depression, accumulated trauma, low self-worth, changes in sleeping and eating patterns, social withdrawal and isolation, and, in some severe cases, even self-inflicted injury and suicide.

A rise in offensive content on the verified pages of athletes also risks damaging the fan experience and, if left unchecked, can hamper an athlete’s ability to attract sponsors, risking commercial damage over the long term.

Conclusion

Taking legal action against social media accounts engaging in online abuse can be difficult due to differing laws and regulations across the jurisdictions from which users may be posting. The use of anonymous accounts, proxy servers, and VPNs adds further complexity by allowing individuals to mask their identity and point of origin.

Despite such complications, the UK Government used the Police, Crime, Sentencing and Courts Act to extend Football Banning Orders to fans engaging in online abuse. These legal provisions allow banning orders to be issued for racist or hateful speech directed at players online. In March 2023, Northumbria Police announced the first conviction under the new order, extending a four-month ban from all stadiums to a fan who had racially abused Ivan Toney, a footballer playing for Brentford in the domestic league, on social media.

Professional football clubs and athletes are also increasingly partnering with experts in AI and machine learning, such as Resolver, to proactively monitor and moderate harmful content across club- and athlete-owned social media pages, identify malicious actors in real time, and protect athletes, fan communities, and brand sponsors from online harms.