Social messaging platform Telegram has implemented significant changes to its private chat regulations following the recent arrest of its CEO, Pavel Durov.
The company has extended its content moderation policies to include private chats, which marks a departure from its previous stance on user privacy. According to the updated FAQ section on Telegram’s website, the platform will now allow users to flag content within private chats that they believe to be illegal. This update includes the introduction of a report button across all devices, including Android, iOS, and desktop applications. Users can also email Telegram’s automated takedown service with links to content they believe violates the platform’s policies.
Telegram’s policy shift on private chats
Until this update, Telegram maintained that all private and group chats were strictly confidential and that the company would not process any moderation requests related to them. The FAQ previously stated, “All Telegram chats and group chats are private amongst their participants. We do not process any requests related to them.” The new policy marks a shift in Telegram’s approach to content moderation, reflecting a broader effort to address concerns about illegal activity on the platform.
Background of Durov’s arrest and its impact
Telegram’s private chat policies changed following Durov’s arrest in France. French authorities accused the messaging app of being a conduit for various forms of illegal activity, leading to Durov’s detention. This incident prompted Telegram to reconsider its moderation practices, particularly concerning private communications on the platform.
Following his release, Durov made his first public statement via Telegram. He expressed confusion over the situation in France, stating, “I was told I may be personally responsible for other people’s illegal use of Telegram because the French authorities didn’t receive responses from Telegram.” Durov also highlighted that Telegram has an official representative in the European Union responsible for responding to legal requests, suggesting that the French authorities had multiple avenues to contact the company for assistance.
Durov’s response and future actions
Durov criticized the actions of the French authorities, arguing that legal action should be directed at the service itself rather than its CEO. He emphasized that Telegram is committed to removing harmful content, stating that the platform takes down millions of posts and channels daily. Durov also acknowledged that Telegram’s rapid growth to 950 million users has created content moderation challenges that criminals have exploited.
Durov concluded that improving the platform’s ability to handle these challenges is now his priority. This indicates that further updates to Telegram’s moderation policies may be forthcoming as the company seeks to balance user privacy with the need to comply with legal obligations.