Telegram has reportedly removed around 15 million illicit groups and channels from its messaging platform, using AI to tackle the problem. Over recent months, the platform has faced mounting pressure to rid its app of illegal content. As previously reported by ReadWrite, this scrutiny resulted in the arrest of CEO Pavel Durov in France, where he faces charges related to the harmful and unlawful material shared via the app.
While Durov remains under strict restrictions following his first court appearance, the instant messaging service says it has made major strides in addressing the issue. The platform reports having removed over 15 million groups and channels involved in fraud and other illegal activities.
For years, our moderation team removed as many as 1M+ channels and 10M+ users every month. Their hard work stayed behind the scenes — until now.
We’ve launched a new page showcasing their efforts over the years: https://t.co/KH6W4OR1HM
— Pavel Durov (@durov) December 12, 2024
How Telegram has used AI to remove millions of suspected illicit groups
Telegram credited its success to moderation “enhanced with cutting-edge AI moderation tools,” describing the change as a step forward in reducing illicit content. This follows a crackdown announced in September, when Durov signaled the company’s intent to meet government demands for stricter content regulation.
The new moderation page represents Telegram’s attempt at greater transparency in its operations. In a post on his Telegram channel, Durov stressed that the company is dedicated to combating illegal activities.
He revealed that the moderation team has been diligently working behind the scenes over the past few months, removing “millions of pieces of content that violate its Terms of Service, including incitement to violence, sharing child abuse materials, and trading illegal goods.”
Total groups and channels blocked on Telegram using AI. Credit: Telegram
Durov has pledged to keep users updated with real-time insights into the moderation team’s efforts. According to Telegram’s new moderation page, the platform has ramped up enforcement significantly since Durov’s arrest, and it’s clear the team has been busy. The removal of illicit accounts has been ongoing since 2015, but the numbers are staggering—over 15.4 million illegal groups and channels have been blocked in 2024 alone.
This year, Telegram has also intensified its fight against Child Sexual Abuse Materials (CSAM), banning 703,809 groups and channels. Alongside user reports and proactive moderation, Telegram collaborates with third-party organizations to combat CSAM, leading to thousands of instant bans.
Some of the biggest contributors to these efforts include the Internet Watch Foundation, the National Center for Missing and Exploited Children, the Canadian Centre for Child Protection, and Stichting Offlimits.
Telegram’s ongoing moderation efforts
The platform’s commitment to tackling violence and terrorist propaganda is nothing new. Since 2016, Telegram has provided daily updates on these initiatives, earning recognition from Europol. In collaboration with numerous organizations since 2022, Telegram has banned 100 million pieces of terrorist content, with 129,099 blocked in 2024 alone.
Meanwhile, Durov’s legal case in France remains unresolved. While he’s currently out on €5 million ($5.3 million) bail, the platform seems determined to press on with its cleanup efforts.
Featured image: Ideogram