Over 15 million offensive channels and groups removed in 2024: Telegram

Since its founder, Pavel Durov, was detained in France and charged in connection with the spread of dangerous content on his messaging app, Telegram has been under unprecedented pressure to clean up its platform this year.

In a statement posted Thursday on Durov’s Telegram channel, the company said it removed 15.5 million groups and channels linked to harmful content, such as fraud and terrorism, in 2024, following the crackdown it announced in September. Telegram says the effort was “enhanced with cutting-edge AI moderation tools.”

According to the post on Durov’s channel, the statement is part of a newly launched moderation page Telegram created to better inform the public about its moderation efforts.

Telegram’s moderation page shows a discernible uptick in enforcement since Durov’s arrest in August. Durov is currently free on €5 million bail, but his French case is still pending.

Part of the statement reads, “Over the past few years, our moderation team has been working tirelessly to keep Telegram safe. Each month they removed about 1 million channels and groups, along with over 10 million users who violated our rules. These impressive results were made possible thanks to your reports, as well as our automated detection systems and AI-powered tools.”

Telegram blames media for under-reporting its moderation efforts 

Telegram faults media outlets for relying on information from its website that had not been updated in years, saying the public is not aware of the depth of its moderation activities.

It stated, “However, much of this work remained behind the scenes. The public wasn’t fully aware of the extent of our moderation efforts, and media outlets often relied on outdated information from parts of our website that hadn’t been updated in a decade.”

“To address this, we’ve launched a new section on our website: telegram.org/moderation. This page highlights the incredible work our moderators have done over the years and offers a transparent overview of our ongoing commitment to keeping Telegram safe,” the statement added.

707,576 Child Sexual Abuse Materials-related channels and groups deactivated in 2024

Child sexual abuse material (CSAM) is strictly prohibited on Telegram. The platform says it banned 707,576 CSAM-related groups and channels in 2024.

Since 2018, public images have been automatically compared against a hash database of CSAM banned by its moderators over the previous decade.

In 2024, it added hashes from organisations such as the Internet Watch Foundation to its database.
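Telegram has not published the exact matching technique behind this database, so the sketch below only illustrates the general idea of hash-list matching: each public image’s bytes are hashed and looked up in a set of known banned hashes. The SHA-256 exact-match approach and the BANNED_HASHES set are illustrative assumptions; real-world systems typically use perceptual hashes so that re-encoded or slightly altered copies still match.

    import hashlib

    # Hypothetical set of hex digests of previously banned images.
    # Hash lists from partners such as the Internet Watch Foundation
    # would be loaded here; the format Telegram actually uses is not public.
    BANNED_HASHES: set[str] = set()

    def image_hash(image_bytes: bytes) -> str:
        """Return a hex digest of an image's raw bytes (exact-match hashing)."""
        return hashlib.sha256(image_bytes).hexdigest()

    def is_banned(image_bytes: bytes) -> bool:
        """Check an uploaded public image against the banned-hash database."""
        return image_hash(image_bytes) in BANNED_HASHES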

In addition to these proactive measures and user reports, Telegram handles hundreds of CSAM reports from international third-party organisations through automated takedown addresses such as abuse@telegram.org and stopCA@telegram.org.

“Telegram publishes daily transparency reports on the removal of CSAM content,” it said.

130,119 terrorist-related communities banned in 2024

Telegram says it is not a platform for terrorist propaganda or calls to violence. The company says it banned 130,119 channels and groups used for terrorism-related activities in 2024.

Telegram has published daily transparency updates on the removal of terrorist content since 2016, and its anti-terrorism efforts have been recognised by Europol.

Since 2022, Telegram has greatly expanded its collaboration with organisations such as the Global Centre for Combating Extremist Ideology (ETIDAL). Working with ETIDAL alone, Telegram’s moderators have removed over 100 million pieces of terrorist content.
