Apple has been sued in a US district court. The aim of the lawsuit: the iPhone maker should remove Telegram from the App Store, just as it removed Parler. A similar lawsuit against Google is planned.
After the unrest ahead of and inside the US Capitol, Apple and Google removed the Twitter alternative Parler, popular with Trump supporters, from their app stores. The stated reason: Parler did not adequately moderate posts inciting violence, which in turn violated the stores' guidelines. An activist group now accuses Telegram of a similar failure to remove posts glorifying violence – and has sued Apple.
Hate speech on Telegram, lawsuit against Apple
The initiative behind the lawsuit is called the Coalition for a Safer Web. Its stated goal, according to the Washington Post, is to keep extremist content off social media. Marc Ginsberg, a former US ambassador to Morocco, is the coalition's president. According to Ginsberg, Telegram stands out as a propagator of hate speech even compared to Parler. With such content, the lawsuit argues, Telegram – like Parler – violates Apple's App Store rules. A similar lawsuit against Google is planned.
In the lawsuit, filed with the US District Court for the Northern District of California, the Coalition for a Safer Web also alleges that content shared via Telegram negligently caused emotional harm and violated the California business code. The suit seeks unspecified damages and asks the court to compel Apple, by way of an injunction, to remove Telegram from the App Store. Neither Apple nor Telegram has yet commented on the lawsuit.
Right to freedom of expression
Notably, Apple does not per se require apps like Telegram to remove inappropriate content. They only need to provide a way for users to report such content, offer contact information, and be able to remove rule-violating users from the platform. When Parler was kicked out, Apple cited not the unsuitable content itself, but the lack of a system for moderating dangerous content.
Platforms take action against radical content
Most recently, Telegram is said to have deleted dozens of public chat channels in which right-wing extremists had called for violence. Previously, Twitter had deleted more than 70,000 accounts belonging to supporters of the QAnon conspiracy theory. Facebook, in turn, deleted content about the unrest at the Capitol – citing security reasons.