Review of popular web platforms finds users face major barriers to reporting child sexual abuse material
For Immediate Release
Winnipeg, Canada — Failure to list child sexual abuse material (CSAM) as a reporting option, difficult-to-locate reporting menus, and requirements that discourage flagging illegal content: these are some of the major barriers users — including survivors who find their own child sexual abuse imagery online — face when trying to report CSAM on some of the most popular web platforms, finds a new report by the Canadian Centre for Child Protection (C3P).
“As we reviewed each platform, we came to realize these companies nearly always provide users with a clear and formal process for reporting copyright infringement. Yet, when it came to reporting images of children being sexually abused, the mechanisms in place were, for the most part, inadequate,” said Lianna McDonald, Executive Director of C3P.
Prompted by feedback from survivors whose child sexual abuse was recorded and distributed online, as well as concerns voiced by citizens reporting to Cybertip.ca, C3P undertook a systematic examination of the CSAM-specific reporting mechanisms available on 15 major platforms, including Facebook, YouTube, Twitter, and Instagram, as well as adult content sites such as Pornhub.
With the exception of Microsoft’s Bing search engine, none of the platforms evaluated by C3P provided users, at the time of the review, with content reporting options specific to CSAM directly from posts, within direct messages, or when trying to report a user. Rather, platforms generally opted for non-specific or ambiguous language in their reporting tools, C3P researchers found.
Figures from the U.S.-based National Center for Missing & Exploited Children show tech companies reported more than 69 million CSAM images on their systems last year. C3P’s own tipline, Cybertip.ca, has also experienced a dramatic increase in public reports of child exploitation over the course of the COVID-19 pandemic, including an 81 per cent spike this past spring.
“For over a decade, the technology sector has not adequately addressed horrific child sexual abuse imagery being distributed on their services. These titans of tech initially denied the existence of the problem. Then, admitting the problem existed, denied that a technological solution existed. And finally, after being presented with a technological solution, begrudgingly and anemically, began to address the problem of sexual violence against children. More than a decade later, however, too many multi-billion-dollar companies still do not have the most basic safeguards, like providing a clear and easy mechanism to report child sexual abuse imagery. This is simply inexcusable,” said Dr. Hany Farid, co-developer of PhotoDNA, and professor at the University of California, Berkeley.
As part of the report, C3P developed five recommendations to clarify and streamline the CSAM reporting process for platforms that allow user-generated content to be uploaded onto their services:
- Create reporting categories specific to child sexual abuse material
- Include CSAM-specific reporting options in easy-to-locate reporting menus
- Ensure reporting functions are consistent across the entire platform
- Allow users to report visible content without having to create or log into an account
- Eliminate mandatory personal information fields in content reporting forms
This evaluation comes just months after the Five Country Ministerial’s release of an international set of voluntary principles to counter online child sexual exploitation and abuse. On March 5, 2020, in coordination with the governments of Canada, Australia, New Zealand and the United Kingdom, the U.S. Department of Justice released a set of 11 voluntary principles aimed at ensuring online platforms and services have the systems they need to combat online child sexual exploitation.
However, C3P, along with international child protection allies, points out that, based on the results of this new report, these benchmarks are not yet being met.
“Tech companies who signed up to Five Eyes’ voluntary principles to counter online exploitation have failed the first test and, if their commitments to combat sexual abuse are to be taken seriously, this must be rectified without delay,” said Peter Wanless, Chief Executive of the National Society for the Prevention of Cruelty to Children, the UK’s leading children’s charity.
The full report can be viewed or downloaded at protectchildren.ca/CSAMreview.
For more information, and to arrange interviews in either English or French, contact:
1 (204) 560-0723
communications@protectchildren.ca
C3P is a national charity dedicated to the personal safety of all children. The organization’s goal is to reduce the sexual abuse and exploitation of children through programs, services and resources for Canadian families, educators, child-serving organizations, law enforcement, and other parties. C3P also operates Cybertip.ca, Canada’s national tipline to report child sexual abuse and exploitation on the internet, and Project Arachnid, a web platform designed to detect known images of CSAM on the clear and dark web and issue removal notices to industry.