The DSA and how Resolver keeps you compliant

April 9, 2024

In October 2022, the European Union (EU) adopted the Digital Services Act (DSA) as a means of regulating online platforms and services with users based in the EU. The regulation, which as of February 2024 applies to all online services that process data from people in the EU, has brought significant changes to how digital platforms operate across the region. This legislation focuses on increasing transparency and oversight, modernizing the e-Commerce Directive, and harmonizing different national laws across the region to establish a level playing field for digital services.

Alongside the DSA, in October 2023, the UK Online Safety Act (OSA) became law. The Act, which covers platforms and services with users in the UK, provides three layers of protection for internet users: it requires illegal content to be removed, places legal responsibility on online platforms to enforce their terms and conditions, and gives users the option to filter out harmful content, such as bullying and self-harm material, that they do not wish to see online.

These new regulatory frameworks aim to enhance user safety and help businesses flourish by providing legal certainty, clear guidelines, and additional user protections and transparency measures.


Demystifying the DSA and OSA

Regulatory frameworks such as the DSA and OSA aim to mitigate several significant risks arising from our growing use of online platforms and services. Their key objectives include:

  • Protect fundamental user rights – The DSA and OSA’s primary directives are to ensure all users’ rights are respected and protected, particularly the right to freedom of expression and privacy.
  • Safeguard children using online services – Several measures implemented in the frameworks seek to shield minors from exposure to harmful content and activities and promote their safety and wellbeing.
  • Reduce exposure to illegal content – The DSA and OSA aim to combat the spread of illegal content, including hate speech, terrorism-related material, and child sexual abuse material (CSAM) across online platforms and services.
  • Provide greater transparency and oversight – Both frameworks establish clear reporting mechanisms that help platforms be more transparent about their content moderation practices and streamline the process for users to flag violative content.
  • Mitigate additional risks – These acts also establish further safety mechanisms aimed at mitigating the risks associated with platform manipulation, mis- and disinformation, and other online harms, promoting a safer digital ecosystem and informational environment for all users.


Moderation for platforms helps keep you compliant

As the implementation of new regulatory frameworks such as the DSA and OSA places the efficacy of existing trust and safety measures under scrutiny, organizations with users in the EU and the UK will have to adapt quickly to ensure compliance with the new legislation, foster a safer online environment, and avoid punitive financial sanctions on their platforms and services for non-compliance.

Our Moderation for Platforms service offers Trust and Safety teams a suite of moderation features designed to ensure they stay compliant under the DSA, OSA, and other emerging online safety regulations worldwide.


Safeguard user rights

  • Expert moderators paired with advanced AI – Partners can draw on our continuously optimized AI, supported by a diverse team of highly trained moderators capable of quickly detecting and removing online risks and providing timely, accurate moderation within clear SLAs.
  • Flexible moderation policy definitions – Allows our partners to protect their users from online harms and enforce tailored platform terms of service in a consistent and transparent manner, helping platforms build and maintain an engaged user community.
  • Clear moderation reasons – Every moderation decision includes a statement outlining the reason for the violation, aligned with platform policies. This enables platforms to communicate with greater transparency, telling users when and why moderation actions have been taken.
  • Support for illegal content reports and moderation appeals – Enables our partners to offer on-platform content reporting and moderation appeals. The solution offers seamless integration to review content and appeals. This further enables users to access timely and accessible redress mechanisms while also reducing the load on operational teams required to handle “off-platform” complaints.
  • Uphold the integrity of reporting mechanisms – Our solution allows partners to identify users engaged in the systematic abuse of reporting and appeal mechanisms, so they can take action to mitigate the misuse of their platforms.


Reduce exposure to illegal content

  • Accurate, always-on risk detection to remove illegal and harmful content – Our analyst-in-the-loop model gives partners the best of both human and automated risk detection, so they can consistently apply their terms of service and community standards.
  • Constant evolution of risk detection – We proactively scan for emerging threats, ensuring our partners are always the first to know and the first to act. Our team continuously assesses the online risk landscape to identify the latest tools and exploits used by threat actors to violate platform policies and evade moderation. The solution integrates with platforms to detect and remove known illegal content such as CSAM and helps them enforce policies aimed at reducing exposure to violative content.
  • Prevent and swiftly respond to violative content – Our solution goes beyond content moderation, offering partners actor-level sanctions designed to protect their users from those who repeatedly post violative content or target other users on their platforms.
  • Rapid-alerting mechanisms – We work with partners to establish fast-alerting mechanisms that flag any content suggesting a threat to life or safety, allowing them to react quickly and take action both on-platform and with relevant authorities.


Protect minors on your platform

Both the DSA and OSA are expected to evolve continuously, strengthening their focus on the protection of children online. The protection of minors using online services lies at the heart of our mission as an organization. Our solution allows partners to create policies that ensure enhanced protection for children using their platforms. By incorporating age-appropriate risk detection, our partners can shield underage users from violative content and take action against harmful actors and organized networks causing harm to children on their platforms.


Enhance your transparency

Partners gain access to metrics on moderation actions, notice-and-action processes, appeals, and outcomes, allowing them to enhance their transparency reporting mechanisms and build user trust over the long term.


Next steps

As the online platform landscape continues to evolve at pace, platforms and services operating in the EU and the UK must prioritize compliance with the DSA, OSA, and other regulatory frameworks to foster a safer digital environment for their users. With our bespoke Moderation for Platforms solution, organizations can navigate this complex regulatory environment, ensure compliance with new requirements, safeguard user rights, increase transparency and oversight, mitigate established and emerging threats, and promote a safer and more inclusive digital ecosystem.