Escalating Dangers in the Digital Realm
Recent reports reveal a distressing upsurge in child sexual abuse material (CSAM) and online threats against minors, prompting global concern. According to the “Emerging Online Trends in Child Sexual Abuse 2023” report by Thorn, a non-profit that builds technology to defend children from sexual abuse, minors are increasingly involved in creating and sharing sexual imagery of themselves, both voluntarily and under coercion, and are experiencing a rise in risky online interactions with adults.
John Starr, Thorn’s VP of Strategic Impact, lamented, “In our digitally connected world, child sexual abuse material is easily and increasingly shared on the platforms we use in our daily lives.” This vile content is not confined to the dark corners of the internet; it is pervasive on the mainstream platforms people use every day.
The Startling Numbers
- The National Center for Missing and Exploited Children (NCMEC)’s CyberTipline has witnessed a staggering 329% surge in reported CSAM files over the past five years.
- In 2022 alone, NCMEC was alerted to over 88.3 million CSAM files.
Factors contributing to this surge include the wider deployment of tools that detect known CSAM, which raises the number of files reported, and increasingly brazen online predators, who are leveraging technologies such as chatbots to scale their manipulative tactics. Indeed, NCMEC saw an 82% increase in reports of online enticement of children for sexual acts from 2021 to 2022.
Technology Fights Back: Hashing and Matching in CSAM Detection
A problem of this scale demands technological solutions capable of keeping pace with it. Hashing and matching are among the most important: they help platforms avoid hosting and enabling the circulation of CSAM, curb its viral spread, and interrupt the cycles of revictimization that each re-share causes.
Breaking Down Hashing and Matching
Hashing converts a file into a short, effectively unique string called a hash value, akin to a digital fingerprint: identical files produce identical hashes. To detect CSAM, platforms hash incoming content and match the resulting hash values against lists of known CSAM hashes, allowing them to identify, block, or remove the illegal content.
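To make the mechanics concrete, here is a minimal sketch of hash-and-match in Python. It uses a cryptographic SHA-256 digest, which matches only byte-identical copies; production systems such as Safer also use perceptual hashing so that resized or re-encoded copies still match. The KNOWN_CSAM_HASHES set below is a hypothetical placeholder, not real data.

```python
import hashlib
from pathlib import Path

# Hypothetical placeholder: real hash lists come from vetted databases
# (e.g., NCMEC or Thorn's Safer), never from hard-coded values like this.
KNOWN_CSAM_HASHES: set[str] = {
    "0f1e2d3c4b5a69788796a5b4c3d2e1f00f1e2d3c4b5a69788796a5b4c3d2e1f0",
}

def hash_file(path: Path) -> str:
    """Compute a SHA-256 digest of a file -- its 'digital fingerprint'."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        # Read in chunks so large video files never have to fit in memory.
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_csam(path: Path) -> bool:
    """Match a file's hash against the known-hash list."""
    return hash_file(path) in KNOWN_CSAM_HASHES
```

A key property of this design is that the platform only ever compares fingerprints: matching requires no one to store or view the abusive imagery itself.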
Enhancing CSAM Detection
Thorn’s Safer tool, built for proactive CSAM detection, gives customers access to a large database aggregating more than 29 million known CSAM hash values. Safer also lets technology companies share hash lists with one another, either named or anonymously, broadening the corpus of known CSAM and curbing its digital dissemination.
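A rough sketch of why sharing helps: each contributor's hash list is unioned into one larger matching corpus, and because only hash values are exchanged, a contributor can stay anonymous. The function name and data format here are illustrative assumptions, not Safer's actual API.

```python
# Illustrative sketch only: these names and structures are assumptions,
# not Safer's actual API or data format.
def merge_hash_lists(contributions: dict[str, set[str]]) -> set[str]:
    """Union hash lists from many contributors into one matching corpus.

    Keys are contributor labels; an anonymous contributor can use a
    label like "anonymous", since only the hash values are retained.
    """
    corpus: set[str] = set()
    for hashes in contributions.values():
        corpus |= hashes
    return corpus

# Hypothetical truncated hash values, for illustration only.
shared_corpus = merge_hash_lists({
    "platform-a": {"ab12...", "cd34..."},  # named contribution
    "anonymous": {"ef56..."},              # source withheld
})
```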
In 2022, Safer hashed more than 42.1 billion images and videos, finding 520,000 files of known CSAM on customer platforms. To date, Safer has helped its customers identify more than two million pieces of CSAM on their platforms.
A Collective Effort Toward a Safer Internet
Thorn stresses the pivotal role content-hosting platforms play in the fight against CSAM. Starr emphasizes, “This is about safeguarding our children. It’s also about helping tech platforms protect their users and themselves from the risks of hosting this content. With the right tools, the internet can be safer.”
Collaboration between tech companies and NGOs is fundamental to eliminating CSAM from the internet. The more widely detection tools are adopted across platforms, the better the chances of reversing the distressing rise of child sexual abuse material online.
Conclusion
Addressing the rise in CSAM requires an unwavering alliance among technology providers, child-safety organizations, and platforms worldwide, using and improving tools that impede the creation, distribution, and perpetuation of child sexual abuse material. Together, we can build an internet that champions safety, prevents exploitation, and protects children across the global digital landscape.
Contact us for all your Business Security and Resilience needs.