Telegram has integrated new moderation algorithms: how the platform reacts to rule violations
13 December, 05:16
In 2024, Telegram blocked more than 15 million channels and groups, including hundreds of thousands for child sexual abuse material. This is stated on the messenger's website, Komersant ukrainskyi reports.
Moderators focus on combating incitement to violence, the distribution of child sexual abuse material, and the trade in illegal goods.
According to Telegram's official data, 15,383,034 groups and channels were blocked in 2024 for violating the platform's rules. This result was achieved by combining automated tools with the active participation of the user community. Every day the platform blocks tens of thousands of channels and groups that violate its Terms of Service, which shows both the scale of the problem and the company's determination to address it.
Telegram focuses on removing the following types of content:
- Child sexual abuse material – moderation of this category is a priority given its extreme social harm.
- Incitement to violence – the company actively counters groups that spread calls to commit crimes or provoke social conflict.
- Trade in illegal goods – the platform blocks channels that sell drugs, weapons, and other prohibited items.
Moderators found child sexual abuse material in 702,000 of the blocked channels and groups, and another 129,000 were blocked for terrorism-related content.
Since 2015, Telegram has used a hybrid moderation system that combines user reports with proactive monitoring based on machine learning, allowing violations to be detected and blocked before they grow into a large-scale problem. In 2024, the company took a significant step forward by integrating advanced AI-based moderation tools.
These technologies have improved automatic detection of violations and significantly increased the efficiency of moderators.
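Telegram has not published the internals of this system, so the sketch below is purely illustrative: it shows, under assumed names and thresholds, how a hybrid pipeline might blend user reports with a machine learning classifier score to decide whether a channel is blocked automatically, queued for human review, or left alone. Every identifier and number here (ChannelSignals, report_weight, the thresholds) is an assumption made for explanation, not part of Telegram's actual implementation.

```python
from dataclasses import dataclass

# Illustrative sketch only: Telegram has not disclosed its moderation internals.
# All names, thresholds, and the scoring logic below are assumptions.

@dataclass
class ChannelSignals:
    report_count: int        # user reports received recently (assumed signal)
    classifier_score: float  # 0..1 score from an ML content classifier (assumed signal)

def moderation_decision(signals: ChannelSignals,
                        auto_block_threshold: float = 0.95,
                        review_threshold: float = 0.6,
                        report_weight: float = 0.02) -> str:
    """Combine user reports with a proactive ML score into a single decision.

    Mirrors the hybrid idea described above: reports raise a channel's priority,
    while the classifier can flag content before anyone reports it.
    """
    # Blend the two signals; capping the report term keeps a flood of reports
    # from triggering a block on its own.
    combined = min(1.0, signals.classifier_score +
                   report_weight * min(signals.report_count, 20))

    if combined >= auto_block_threshold:
        return "block"          # high-confidence violation: remove automatically
    if combined >= review_threshold:
        return "human_review"   # uncertain: queue for a moderator
    return "no_action"

# Example: a channel with few reports but a high classifier score is caught proactively.
print(moderation_decision(ChannelSignals(report_count=2, classifier_score=0.97)))   # block
print(moderation_decision(ChannelSignals(report_count=15, classifier_score=0.4)))   # human_review
```

The cap on the report contribution reflects a common design concern for such systems: mass-reporting campaigns alone should not be enough to trigger an automatic block without corroboration from the classifier.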