How Resolver’s Trust & Safety experts uncovered a livestream grooming network on a mainstream platform

Jordan Worthington
Specialist Analyst, Resolver Trust and Safety

The growing popularity of livestreaming has captivated millions and reshaped how we consume media. But as livestreaming grows, so do the risks of predatory behavior targeting the minors who stream. The grooming and exploitation of young streamers by malicious actors who manipulate platforms to target vulnerable individuals is a prime concern.

Livestreaming has transformed how minors seek validation and community, but this engagement can be exploited by groomers who manipulate platforms to target the most vulnerable. Resolver’s Trust & Safety experts recently uncovered a network of predators engaging in collective grooming of minors using the livestreaming function on a Very Large Online Platform (VLOP).

This online engagement can create a unique sense of community for minors, where likes, comments, and shares act as immediate feedback that reinforces their self-esteem and sense of identity. Online groomers look for opportunities to exploit vulnerable minors who livestream, actively engaging with these streams and encouraging the minors to perform specific actions for the groomers' gratification. To help our partners identify and combat this harm, Resolver Trust & Safety Intelligence provides expertise to proactively identify and mitigate misuse by predatory communities at scale.

Timely detection of a livestream grooming network

The instantaneous and continuous nature of livestreaming means grooming can occur in real time, raising the risk of "self-generated" CSAM being streamed on platforms while complicating the efforts of trust and safety systems and teams seeking to prevent, detect, and take action against predatory accounts and networks.

Providing partners with the insights necessary to respond swiftly and proactively to emerging threats is the crux of Resolver Trust & Safety Intelligence. The incidents described below, in which our experts identified a livestream predatory network on a mainstream platform, highlight the efficacy and real-world implications of our services.

Earlier this year, our experts identified a user account on a mainstream platform featuring a minor who frequently engaged in livestreaming. The user appeared to livestream every day from various settings, including their home and school, and even conducted "sleep streams" where viewers could watch them sleep at night.

Over time, comments in the live chat accompanying these streams became increasingly inappropriate and sexualized. Our analysts observed viewers attempting to groom the minor and build a community, referring to themselves as a "family" and to each other with relational names while talking in the live chat.

Eventually, new users emerged and became part of the "family". Almost all of these users confirmed they were adults while sharing information about themselves. Several were detected engaging in sexually inappropriate conversations in the live chat. Sometimes these conversations were directed only at each other rather than at the minor, demonstrating that the community members were also forming relationships with one another. In this manner, the live chat became a predatory community hub.
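The pattern our analysts flagged here, a recurring set of adult-presenting accounts converging on one minor's streams and addressing each other in relational terms, lends itself to simple heuristic detection. The sketch below is purely illustrative and assumes hypothetical data fields (such as an upstream age signal); it is not Resolver's detection system.

```python
# Minimal illustrative heuristic, not Resolver's system: flag a minor's
# streams when the same adult-presenting accounts keep returning and use
# the kind of relational "family" language seen in this case.
from collections import Counter
from dataclasses import dataclass

@dataclass
class ChatMessage:
    stream_id: str
    author_id: str
    author_is_adult: bool  # hypothetical upstream age signal
    text: str

# Relational terms drawn from this case study; a production system would
# rely on trained classifiers rather than a keyword list.
RELATIONAL_TERMS = {"family", "brother", "sister", "girlfriend", "son"}

def flag_collective_grooming(messages: list[ChatMessage],
                             min_recurring_adults: int = 5,
                             min_streams: int = 3) -> bool:
    """True when several adult accounts recur across the minor's streams
    and relational/community language appears in the chat."""
    streams_per_adult = Counter()
    seen_pairs = set()
    relational_hits = 0
    for m in messages:
        if not m.author_is_adult:
            continue
        pair = (m.author_id, m.stream_id)
        if pair not in seen_pairs:
            seen_pairs.add(pair)
            streams_per_adult[m.author_id] += 1
        if any(term in m.text.lower() for term in RELATIONAL_TERMS):
            relational_hits += 1
    recurring = sum(1 for n in streams_per_adult.values() if n >= min_streams)
    return recurring >= min_recurring_adults and relational_hits > 0
```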

Our analysts quickly categorized these harmful behaviors as "collective grooming" targeting a vulnerable minor, and swiftly communicated this intelligence to our platform partner. Our assessment also identified that the minor likely did not understand general social norms or how to communicate and express their emotions appropriately. This made them a particularly vulnerable target, as they were more likely to respond to inappropriate requests from users in the live chat without recognizing them as such.

Creating personalized content of interest to predators (COITP)

Further review of the network's on-platform activity revealed that the minor had developed a relationship with an active member of the group, whom they referred to as their "girlfriend". This user was present in almost every livestream; it was clear from the communication between the two that the relationship existed solely online, conducted across a variety of social media platforms.

Analysts assessed that this predatory user was an adult posing as a minor to facilitate the grooming process. They were observed making COITP requests similar to those of other users, including asking the minor to show their muscles. Given several messages in the live chat between this user and the minor suggesting they were in contact on an alternate messaging service, it is highly likely this user was grooming the minor to obtain sexually explicit images or messages from them.

Some of the most significant harmful behaviors observed by our analysts included requests for the minor to unknowingly produce COITP, alongside concerted attempts by both the minor's account and the predator network to circumvent platform moderation by creating new accounts and off-siting the minor to less moderated platforms.


Over time, the minor grew more confident from the volume of compliments they received from members of the "family" for performing these requests. The minor engaged in the activities asked of them, which the predator community could view inappropriately, including flexing their muscles, showing their abs, and lifting weights, often without being explicitly asked to do so. This is a hallmark of established predation, in which a groomed victim proactively caters to their predator's desires.

This harmful behavior was encouraged through regular requests from one user, identified by Resolver as a UK-based adult male. This user would frequently converse with the minor and ask them to perform inappropriate actions on camera.

Resolver identified over 500 instances of such requests from the same adult user over a period of six months and promptly reported this predatory engagement to our partner platform, leading to the adult user's channel being terminated.


Creating new accounts to bypass platform moderation

Despite having their account banned following our reports, the adult user proceeded to create new accounts on the platform and attempt the same violative conduct, leading to further terminations. In total, the user created in excess of 20 new accounts to circumvent the platform's enforcement and continue making COITP requests.


Every time, Resolver’s analysts identified the evasion, predicated the next attempt, and proactively alerted our partner. As protective action, while the minor’s account was also terminated, given the substantive level of grooming, they also circumvented the termination of their account by creating new accounts in order to maintain contact with the predator.

By analyzing user behaviors such as activity patterns, language usage, and network connections between offender accounts, our analysts were able to gain a deeper understanding of the nature and extent of violative activities on the platform.
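As a rough illustration of the network-connection side of that analysis, the sketch below (assuming the open-source networkx library and an illustrative data model) links accounts that co-appear in the same live chats and extracts the cluster around a known offender.

```python
# Illustrative network analysis: connect accounts that co-occur in the
# same live chats, then recover the cluster around a known offender.
# Assumes networkx; chat_logs maps stream_id -> participant account ids.
import networkx as nx

def offender_cluster(chat_logs: dict[str, set[str]], seed_account: str) -> set[str]:
    g = nx.Graph()
    for participants in chat_logs.values():
        ordered = sorted(participants)
        for i, a in enumerate(ordered):
            for b in ordered[i + 1:]:
                # Edge weight counts how often two accounts appear together.
                weight = g.get_edge_data(a, b, {"weight": 0})["weight"]
                g.add_edge(a, b, weight=weight + 1)
    if seed_account not in g:
        return set()
    return set(nx.node_connected_component(g, seed_account))
```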

Off-siting minors to less moderated platforms

Another significant observation in our analysis of the predatory network was the consistent manner in which both the minor's account and the corresponding community of adult users attempted to circumvent platform enforcement and continue their violative activities.

For example, despite the minor's account being terminated by the platform, the predator gang continued to find other ways to communicate with the minor, including by text, email, and other less moderated platforms.

Predatory networks often attempt to off-site minors to less moderated platforms so they can continue grooming without risk of detection. It is almost certain that predators use more egregious grooming techniques when able to operate covertly, which is likely to lead to the exchange of CSAM or real-world harm to the minor.

These off-siting attempts included interactions between users in the comment sections of the minor's videos, which had been reuploaded by other users, and comments encouraging the minor to engage off-platform, such as "can you text me later".

Exploiting chat functionality to enable livestream abuse

Our analysts derived several important insights from this livestream grooming network. In particular, users who engage in violative behavior once are likely to repeat that behavior and to circumvent the platform's attempted moderation.

Additionally, our investigation revealed that grooming networks exploit the live chat functionality on livestreams to establish relationships with, and groom, underage users on mainstream platforms.

While most platforms offering livestream functionality include moderation safeguards that turn off live chat on minors' streams, predators routinely probe these safeguards for loopholes.
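A minimal sketch of what such a safeguard might look like appears below, assuming a hypothetical moderation hook and field names; real platforms implement far richer policy engines.

```python
# Hypothetical policy hook: disable live chat on streams hosted by minors
# and fail closed when age is unverified. Field names are illustrative.
from typing import Optional

def apply_livestream_chat_policy(streamer_age: Optional[int],
                                 requested_chat: bool) -> dict:
    if streamer_age is None:
        # Unverified age: safer to fail closed than open.
        return {"chat_enabled": False, "reason": "age_unverified"}
    if streamer_age < 18:
        return {"chat_enabled": False, "reason": "minor_host"}
    return {"chat_enabled": requested_chat, "reason": "adult_host"}
```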

Next steps

Resolver’s prompt identification and disruption of a grooming gang exploiting the livestream functionality on a VLOP underlines the efficacy and real-world impact of our Trust & Safety Intelligence.

Our analysis of the malicious network also serves as an opportunity to learn how livestream abuse can occur across VLOPs, including mainstream social platforms, and the tactics, techniques, and procedures predators employ to circumvent moderation and continue targeting minors' accounts. In an ever-evolving world of social media, the internet, and technology, protecting children online is our priority.

By integrating AI technology and expert human analysis, backed by over two decades of experience identifying harmful content online, Resolver offers our partners robust Trust & Safety Intelligence solutions to identify and counteract grooming risks exploiting their livestreaming functionality in real time.
