Discord, the popular communication app launched in 2015, has become a hub for online gamers and diverse communities worldwide. However, a darker side has emerged, with adults exploiting the platform to groom children, trade child sexual abuse material (CSAM), and extort minors.
According to NBC News, at least 35 cases of kidnapping, grooming, and sexual assault involving Discord communications have been identified in the past six years, with 22 occurring during or after the COVID-19 pandemic. Additional investigations have uncovered 165 CSAM transmission or receipt cases, including four crime rings.
While these numbers represent reported and prosecuted cases, experts believe they only scratch the surface of the problem. Discord’s young user base, decentralized structure, multimedia communication tools, and recent surge in popularity make it an attractive platform for individuals looking to exploit children.
Reports of CSAM on Discord increased by 474% from 2021 to 2022, according to the National Center for Missing & Exploited Children (NCMEC).
Discord’s responsiveness to complaints has also raised concern: NCMEC data show the platform’s average response time to complaints grew from three days in 2021 to nearly five days in 2022.
Watchdog organizations have emphasized the need for stronger moderation and proactive detection of child exploitation content, highlighting the urgency to address the issue.
Discord has taken steps to address child abuse and CSAM, disabling over 37,000 accounts for child safety violations in the last quarter of 2022.
The company has also collaborated with Thorn, a developer of anti-child-exploitation technology, to detect grooming behavior and harmful content. However, more needs to be done to tackle the persistent problem.
Despite Discord’s efforts, concerns remain about its lack of oversight and the ease with which children under 13 can create accounts. The platform’s safety measures rely largely on community members flagging issues; the company does not monitor every server or conversation.
While Discord plans to implement new models to detect child safety threats and undiscovered trends in child exploitation content, watchdogs and officials assert that safety measures should have been incorporated from the platform’s inception.
Discord has faced criticism for its slow response times, communication issues, and hosting of communities engaged in CSAM. Organizations such as INHOPE have declined to partner with Discord due to these concerns.
Discord acknowledges the areas that need improvement and is working on updating its child safety policies and implementing Thorn’s grooming classifier. The issue of child exploitation on Discord highlights the need for greater vigilance and stronger measures to protect vulnerable users.
Experts stress the importance of proactive detection, rapid response to reports, and collaboration with law enforcement and tiplines to combat this pervasive problem.