Hate machine Telegram: thousands of calls for violence by opponents of corona measures discovered

  • An SRF data analysis shows that several thousand calls for violence and death threats are circulating in the Telegram channels of opponents of the corona measures.
  • They are aimed at politicians, journalists, scientists and the police.
  • The mood in the channels is becoming more toxic, influenced by conspiracy ideologues, right-wing extremists and a small circle of radicalized opponents of the measures.

Sometimes an emoji says more than a thousand words. An angry red face 😡, a bomb 💣, an explosion 💥. Such aggression-laden “hate” emojis have become increasingly popular during the pandemic. This is shown by a data analysis of around one million Telegram messages from over 90 channels of opponents of the measures in Switzerland.
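A frequency count like the one described could look as follows. This is a minimal sketch, assuming the messages are available as plain strings; the emoji set and the sample messages are illustrative, not SRF's actual data.

```python
from collections import Counter

# Illustrative set of aggression-laden emojis, as described in the article.
HATE_EMOJIS = {"😡", "💣", "💥"}

def count_hate_emojis(messages):
    """Return a Counter of how often each tracked emoji appears in a message dump."""
    counts = Counter()
    for msg in messages:
        for ch in msg:
            if ch in HATE_EMOJIS:
                counts[ch] += 1
    return counts

# Hypothetical sample data, not from the SRF dataset.
sample = ["Das reicht 😡😡", "weg damit 💣", "nur Text"]
print(count_hate_emojis(sample))  # Counter({'😡': 2, '💣': 1})
```

Counting per month instead of overall would yield the trend over the pandemic that the analysis describes.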

Telegram users reacted with particular hatred to political measures such as the mask mandate (autumn 2020), the certificate mandate (autumn 2021) and the start of the vaccination campaign (early 2021).

Telegram is a so-called ‘dark social’ channel: a messenger service that can also be used as a social network. Opponents of the corona measures now meet there mainly because much of the content they shared on platforms like Facebook has been deleted. Various estimates put the number of German-speaking Swiss who exchange information there in channels and closed chat groups, or at least read along, at between 50,000 and 70,000. The SRF data analysis shows that the Telegram channels of the opponents of the measures have grown steadily over the last two years and greatly expanded their reach. And the atmosphere there has become more toxic.

What does “toxic” mean and how was it measured?



SRF Data used an algorithm from the Google project Perspective API. It checks the collected messages for their toxicity. Toxic means rude, disrespectful or inappropriate comments that can cause a group member to leave a discussion: insults, threats, vulgar language – everything that one would find rude and unpleasant even in a personal conversation. In addition, the algorithm identifies calls for violence and threats in the messages.
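A request to the Perspective API for one message could be sketched as below. This is not SRF's pipeline, just a minimal example following the publicly documented `comments:analyze` endpoint; the API key, language choice and attribute selection are assumptions.

```python
import json
import urllib.request

# Public Perspective API endpoint; an API key is required.
API_URL = ("https://commentanalyzer.googleapis.com/v1alpha1/"
           "comments:analyze?key={key}")

def build_request(text, languages=("de",)):
    """Build the JSON body asking for TOXICITY and THREAT scores."""
    return {
        "comment": {"text": text},
        "languages": list(languages),
        "requestedAttributes": {"TOXICITY": {}, "THREAT": {}},
    }

def score_toxicity(text, api_key):
    """Send one message to the Perspective API and return its toxicity score (0..1)."""
    body = json.dumps(build_request(text)).encode("utf-8")
    req = urllib.request.Request(
        API_URL.format(key=api_key), data=body,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        result = json.load(resp)
    return result["attributeScores"]["TOXICITY"]["summaryScore"]["value"]
```

A message would then count as toxic if its score exceeds some threshold; the article does not state which threshold SRF applied.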

Minority of users write majority of hate messages

At the beginning of the pandemic, around one percent of messages in the channels examined were toxic. Since the start of the vaccination campaign, that proportion has doubled: every 50th message is now toxic. However, there are big differences between groups, channels and users. Around eight percent of users are responsible for around 82 percent of the toxic messages – that is around 860 people. Strikingly, these people have also written many other messages, which suggests that those who comment a lot in these channels are also more likely to spread hate speech.

But the analysis also shows that 85 percent of Telegram users have not written a single toxic message, and among the opponents of the measures there are numerous channels in which the mood is peaceful. It is a small, highly active minority that is becoming more and more toxic – but it sours the mood and normalizes the aggressive tone.

Growing influence from the right fringe

One reason for the high proportion of toxic messages can be found in the immediate vicinity of the opponents' Telegram groups: numerous chat groups of conspiracy ideologues and right-wing extremist groups. Content posted in these chats often finds its way into the corona groups, including anti-Semitic, homophobic and racist content. Many subscribers of these groups are also active users of the corona groups. A network analysis confirms this: of around 30 chat groups whose active members overlap, 15 show a very high degree of overlap – including the chat group of a well-known right-wing extremist organization. Many members of this organization, which was already active before the pandemic, have also joined corona groups and actively post content there.
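The overlap behind such a network analysis can be measured by comparing the active-member sets of two groups, for instance with the Jaccard similarity. This is a sketch of one possible measure, not necessarily the one SRF used; group names and member IDs are made up.

```python
def jaccard(members_a, members_b):
    """Share of users active in both groups, relative to all users in either group."""
    a, b = set(members_a), set(members_b)
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

# Hypothetical member sets for illustration.
corona_group = {"u1", "u2", "u3", "u4"}
extremist_group = {"u3", "u4", "u5"}
print(jaccard(corona_group, extremist_group))  # 0.4
```

Computing this for every pair of groups yields a weighted network in which strongly overlapping communities, like the 15 described above, stand out.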

Some of the 860 particularly toxic users can also be found in the orbit of the right-wing extremist scene. Several of them belong to a group that has attracted attention in recent months: the «Swiss Mens Club of Freedom», or «MännerWG». Its members appear at corona demonstrations and sometimes act as a security service. Many of them can also be found on Telegram and are among the best-connected people there. They maintain contacts across the various communities and sometimes act as spokesmen, so their toxic messages receive a great deal of attention.

Thousands of incitements to violence and death threats identified

Among the messages examined are thousands of incitements to violence and death threats. Every 200th message on Telegram is now a threat or an incitement to violence. Around 90 percent of these messages can be found in 10 of the more than 90 groups and channels. SRF has classified hundreds of threats. They are aimed at specific individuals, above all politicians, scientists or journalists, but also at institutions such as the police. Concrete examples of incitements to violence and death threats, showing how the algorithm evaluates them, are given in the box below. SRF has anonymized both those making the threats and those being threatened, and has added context in [brackets].

Some of those making threats are easy to identify, for example because they use their real names on Telegram or have posted other identifying information. The “Rundschau” visited several of them and confronted them on camera. They invoke self-defense or deny having written the content.

Six times more reports than before the pandemic

A radicalization of some of the opponents of the measures has also been observed at the Federal Office of Police. Stéphane Theimer, head of the federal security service at Fedpol, told SRF: “We notice that the tone has become increasingly sharp and aggressive.”

Fedpol itself only monitors publicly accessible platforms; with partially closed platforms like Telegram, it gets support from the Federal Intelligence Service. A big challenge is the sheer mass of messages: thousands of posts have to be looked through every day, says Theimer. “We analyze and assess every report that we receive. If we identify a criminal offense, the police investigate and criminal proceedings are opened. This goes as far as a national search, arrest and conviction.”

Fedpol becomes particularly active when there are threats against the Federal Council, parts of the government or parliament – and that is increasingly the case. While Fedpol received 246 reports of potential threats against federal councilors or members of parliament in 2019, in 2021 there were more than 1,200 such reports across all social media. Of these, 120 were identified as actual threats and pursued, for example with criminal proceedings and police investigations, or through a so-called threat management talk. This corresponds to a sixfold increase compared to 2019.

Whether those making threats can be identified also depends on the cooperation of the individual platforms. Telegram is not very cooperative, which makes things more difficult. Nevertheless, threats can also be traced there, says Stéphane Theimer: “People have the feeling that if they use Telegram, they remain anonymous and safe – which is not the case.”

But by no means all calls for violence and threats are followed up – for example, when the threat remains too vague or the group of those threatened is too large. Most hate messages linger on Telegram and inspire imitators.

Associated podcast series: ‘Dark Social’ by SRF Data
