Unlike Silicon Valley giants such as Facebook and Twitter, which run very public anti-disinformation programs, "Telegram is famously lax or absent in its content moderation policy," Brooking said. On February 27th, Durov posted that Channels were becoming a source of unverified information and that the company lacked the means to check their veracity. He urged users to be mistrustful of what is shared in Channels, and initially threatened to block the feature in the countries involved for the duration of the war, saying that he didn't want Telegram to be used to aggravate conflict or incite ethnic hatred. He walked back that plan, however, when it became clear that Channels had also become a vital communications tool for Ukrainian officials and citizens coordinating resistance and evacuations.

Telegram's architecture also limits its ability to slow the spread of false information: there is no central public feed, and comments in channels are easily disabled, which reduces the space for public pushback. WhatsApp, a rival messaging platform, introduced some measures to counter disinformation when Covid-19 was first sweeping the world.

Given the platform's pro-privacy stance, it is inevitable that it will be used for a range of purposes, not all of them good, and Telegram has been linked to several scandals involving terrorism, sexual exploitation and crime. Back in 2015, Vox described Telegram as "ISIS' app of choice," noting that the platform's channels make it possible to distribute material to large groups at once. Telegram has acted to remove public channels affiliated with terrorism, but Durov has reiterated that he has no business snooping on private conversations.