The presence and proliferation of child sexual abuse material (CSAM) on messaging platforms like WhatsApp and Telegram continue unabated, and many groups that were flagged for such material are still active, a new study has found.
CyberPeace Foundation, a cybersecurity think tank that conducted the study, also proposed a techno-legal strategy to tackle the issue, including a mandatory ‘Report CSAM’ button for messaging apps like WhatsApp and Telegram.
The study, conducted between June and July 2020, found that as of early June, up to 110 of the 182 WhatsApp group links that were shared with the social media company in 2019 were “still active and operating”.
CPF had conducted two similar investigations in 2019 and found that WhatsApp and Telegram had active groups and channels sharing such material. “Many of these groups that were directly reported to the platform either comprised of adult pornography groups or had group icons that were pornographic,” the organisation said in its latest report. “Almost all groups that had a clear CSAM (child sexual abuse material) group icon or description were removed while groups with obscene pictures and names…remain active.”
CPF selected 29 groups from a larger pool of 1,084 adult pornography groups as part of its research. Of these, 15 groups were found to be disseminating CSAM and were reported to the platform. However, most users in these groups remained active, the report said.
“While none of the groups were still removed, it was later reported to the research group that many offending users were banned. 25 of the 29 users who were reported (along with screenshots of them having uploaded CSAM on a group) remain active on the date of issuance of this report,” it said.
The report was first issued on August 30. In an update, Nitish Chandan, who leads the technology, law and policy research group at CPF, said these groups continued to remain active as recently as September 10. “There remain many gaps in reporting and in ban implementation on such content,” he told ET.
WhatsApp said it had taken action on the CPF report and that the company has “zero tolerance for this heinous crime”.
“When CPF reported these groups to us in 2019, we investigated them and banned those with evidence of policy violations, including child sexual abuse,” a WhatsApp spokesperson told ET in an email response. “Where CSAM was found, we also reported it directly to NCMEC (National Centre for Missing and Exploited Children). We rely on the signals available to us, including profile photos, group information, and user reports, to ban accounts suspected of sharing child exploitation imagery. We appreciate our partnership with the Cyber Peace Foundation (CPF) and we banned abusive users and groups they flagged for us. We are working closely with CPF to confront this scourge and are constantly improving our capabilities to prevent and respond to abuse.”
Telegram did not respond to specific queries on the CPF report.
WhatsApp has two modes of reporting such content – one from within the groups or chats and one through the help-contact section of the app. However, the technical inability to view end-to-end encrypted content means that action against these groups fails despite the maximum possible escalation, CPF said.
On the Telegram messaging platform, the investigation identified 23 instances of CSAM proliferation, of which three were shared from India.
CPF reported many such channels involved in spreading pornography and child abuse material. Although the reported content was removed, Telegram continues to face an explicit limitation in identifying CSAM because private chats between two users are encrypted, it said.
The report proposed a techno-legal framework to identify gaps in policing and reporting of child sexual abuse material in end-to-end encrypted settings in ways that do not violate user privacy.
The suggestions included an option for users to tag problematic content. The tag could then be used to create a hash value and communicate it to the respective chat platform and, subsequently, to law enforcement agencies, it said.
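As an illustration only, and not the report’s actual design or any platform’s API, the sketch below shows how such a “tag and hash” flow could work on the user’s device: a flagged file is hashed locally with a standard SHA-256 digest and packaged into a hypothetical report payload, so that the platform and law enforcement receive a fingerprint of the content rather than the content itself. All names, fields and the file path here are assumptions made for the example.

```python
# Illustrative sketch only -- not the CPF proposal's actual design or any platform's API.
# A client tags a flagged media file, derives a hash value locally, and packages a
# report that could be forwarded to the chat platform and law enforcement without
# transmitting the encrypted content itself.

import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path


def hash_flagged_file(path: Path) -> str:
    """Compute a SHA-256 digest of the flagged file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def build_report(path: Path, platform: str, reporter_id: str) -> dict:
    """Assemble a hypothetical report payload around the hash value."""
    return {
        "content_hash": hash_flagged_file(path),
        "hash_algorithm": "sha256",
        "platform": platform,
        "reporter_id": reporter_id,  # pseudonymous identifier, assumed
        "reported_at": datetime.now(timezone.utc).isoformat(),
        "category": "CSAM",
    }


if __name__ == "__main__":
    # Example usage with a hypothetical local file path.
    report = build_report(
        Path("flagged_media.jpg"), platform="example-messenger", reporter_id="user-0001"
    )
    print(json.dumps(report, indent=2))
```

Because only the hash leaves the device, such a scheme could in principle let platforms match reports against known material without weakening end-to-end encryption, which is the privacy constraint the report highlights.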
Other proposals included initiatives such as a national tip-line for last-mile reporting of child abuse material and making it mandatory for intermediaries in India to include a ‘Report CSAM’ button.