This week, the Organisation for Economic Co-operation and Development (OECD) released a vital report benchmarking the child sexual exploitation and abuse (CSEA) policies and practices of the top 50 online content-sharing services across the internet. The results underscore the significant, ongoing challenges facing these services, the strides many companies have made and the room for growth that remains for others.
The report also highlights one of the core challenges that content-sharing services and the wider ecosystem of technology and service providers face when dealing with CSEA content. Across the board, there are inconsistencies in transparency reports and, in some cases, no detailed policy on CSEA at all. Classification of content varies, the language used to describe both offenders and the material they share or create differs from platform to platform, and compliance with guidelines is scant. The result is a limited understanding of what CSEA-related content and activity is permissible or actioned by a given platform and, worse, of the risk vectors affecting children online. That confusion can produce a network effect across content-sharing services, exposing platforms and, by extension, children to new risks.
In February 2022, Resolver participated in Safer Internet Day with the WeProtect Global Alliance, identifying a number of key CSEA trends, including the acceleration of online harms to children during the pandemic, the need for stronger government regulation and the growth of new safety tech. Strides have been made, but the OECD report makes it clear that more work needs to be done.
Addressing CSAM content, and critically the associated tactics, techniques and procedures (TTPs), while developing resilient policies and solutions is a complex and challenging task. Offenders constantly seek out new tools, mask and adapt their language to evade filters and moderation, employ fake profiles and exploit new technologies such as cloud sharing and generative AI. Organizations must build moderation policies and adopt technologies that answer the difficulties they face today, while ensuring they can sustainably meet the challenges of tomorrow.
The interconnected nature of these services is why Resolver is co-chairing the World Economic Forum's Global Coalition for Digital Safety workstream. One purpose of that work is to establish a common language for online harms across sectors. This detailed typology gives all stakeholders an agreed lexicon to describe, understand and address online harms, including CSEA and CSAM content, on their own platforms. A common language can also simplify the work content-sharing services must do to detect and report CSEA.
Global regulators are increasingly focused on the lack of uniform progress in this space, inaction is becoming more visible within the market, and legislators worldwide are asking questions. Beyond the language practitioners use, now is the time to evaluate current policies and determine whether they are sufficient to address today's challenges and adaptable to tomorrow's. Resolver can help organizations improve their detection, classification and reporting of CSEA, strengthening transparency and compliance with industry guidelines.