Telegram Pledges Enhanced Content Moderation Amid Legal Scrutiny
Telegram CEO Pavel Durov announced steps to improve content moderation on the messaging platform, acknowledging criticisms and pledging to remove features that have been abused for illegal activities. The announcement comes as Durov faces a formal investigation in France over crimes linked to Telegram; he said the changes aim to protect the app's reputation and the interests of its nearly one billion users.
Telegram CEO Pavel Durov announced on Friday that the messaging app will address criticisms of its content moderation and remove certain features that have been exploited for illegal activities.
Durov, who is currently under formal investigation in France for crimes including fraud, money laundering, and the sharing of child sex abuse images, made the announcement in a message to his 12.2 million subscribers on the platform. He emphasized that while the vast majority of Telegram users are not involved in crime, illegal activity by a small fraction tarnishes the platform's image and jeopardizes the interests of almost a billion users worldwide.
In a bid to improve the platform's reputation, Telegram has disabled new media uploads to a standalone blogging tool that anonymous actors had misused. The company also removed the People Nearby feature because of problems with bots and scammers, and plans instead to highlight legitimate, verified businesses in users' vicinity. Durov's lawyer has argued that it is unreasonable to investigate him for crimes committed by others on the app, noting that Telegram is diligent in removing harmful content.
(With inputs from agencies.)