Artificial intelligence is supposed to detect hate

This year’s Basel Peace Forum dealt, among other things, with the power of emotions. Peace research and the Swiss armaments authority Armasuisse rely on similar approaches to limit the effects of information warfare.

Symbolic image on the subject of cyber warfare. (Christoph Ruckstuhl / NZZ)

“Russia’s five most persistent disinformation narratives”: under this title, a compilation of Russian influence attempts, a so-called fact sheet, recently landed in the inboxes of everyone on the US State Department’s mailing list. In it, the State Department describes, from an American point of view, how the Russian side has built up a victim myth over the years.

For the moment, the Ukraine conflict is still being fought with words and emotions. On Friday in Geneva, the two countries’ foreign ministers signaled a willingness to talk. Even the deployment of Russian troops on the border with Ukraine serves, for now, as a tangible argument in the information war: Russia can invade, but does not have to. Online, however, the escalation is already well advanced.

Intensified information war

The spokeswoman for the Russian Foreign Ministry, Maria Zakharova, tweeted a bitterly ironic reaction to the State Department’s fact sheet: “Whether this is outright lies or simply ignorance: only the Ministry of Truth could have issued something like this.” That sounds almost personal – and that is the intent.

In their current power-political conflict, the USA and Russia are relying on emotions. This is not a new phenomenon: the “war of words” was an essential part of the Cold War, and disinformation belongs to the gray area between conflict and politics. The aim is to deceive the opponent, sow doubt and divide the population. But the spread of the internet and automation have significantly intensified information warfare.

Combination of peace work and war prevention

The Basel Peace Forum, organized by the Swiss peace organization Swisspeace, also addressed the importance of emotions and disinformation in conflicts last week. Experts discussed how to deal with disinformation and hate at a panel entitled “Navigating Facts: How to Fight Fake News”.

Branka Panic, a political scientist and security expert with a focus on peace work, argued that technology should not primarily be seen as a problem but as part of the solution. Panic is the founder and director of the AI for Peace Foundation, whose goal is to use artificial intelligence for peace and security.

“Emotions are at the center of conversations,” Panic said on the panel. The aim, then, is to train the algorithms of social networks to filter out hate speech and disinformation. The technology center of the Swiss armaments authority Armasuisse is also working on exactly this approach.
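At its core, such a filter is a text classifier trained on labeled examples. The following toy sketch illustrates the principle with a tiny hand-labeled sample and a simple naive-Bayes model; it is not the actual system of any platform or of Armasuisse, and all the data and labels are invented for illustration.

```python
# Toy naive-Bayes classifier: "hateful" vs "neutral" posts.
# Real systems use large labeled corpora and neural models,
# but the training principle (learning from labeled examples) is the same.
from collections import Counter, defaultdict
import math

TRAIN = [
    ("we despise those people they should disappear", "hateful"),
    ("those people are vermin and deserve nothing", "hateful"),
    ("what a lovely day for a walk in the park", "neutral"),
    ("the new park opened today and it is lovely", "neutral"),
]

def train(samples):
    """Count word frequencies per label."""
    word_counts = defaultdict(Counter)
    label_counts = Counter()
    for text, label in samples:
        label_counts[label] += 1
        word_counts[label].update(text.split())
    return word_counts, label_counts

def classify(text, word_counts, label_counts):
    """Return the label with the highest log-posterior (add-one smoothing)."""
    vocab = {w for counter in word_counts.values() for w in counter}
    scores = {}
    for label, n in label_counts.items():
        score = math.log(n / sum(label_counts.values()))
        total = sum(word_counts[label].values())
        for word in text.split():
            score += math.log((word_counts[label][word] + 1) / (total + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)

word_counts, label_counts = train(TRAIN)
print(classify("those people deserve nothing", word_counts, label_counts))  # hateful
```

The hard part in practice is not this mechanism but the training data: what counts as hate speech is contested, and classifiers inherit the biases of their labels.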

Artificial intelligence is to help identify attempts at destabilization and radicalization on social networks at an early stage, the researchers write in an article in the Armasuisse in-house magazine «Armafolio». Among other things, they focus on classifying so-called memes, whose emotionalizing combination of image and text makes them an effective means of influencing online narratives.
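Classifying memes means combining signals from two modalities, image and text. A hypothetical sketch of such feature fusion, assuming the image has already been reduced to a small feature vector by some vision model, with keywords and weights invented purely for illustration (this is not Armasuisse's published method):

```python
# "Early fusion" for meme classification: concatenate text and image
# features into one vector and score it with a linear model.
def text_features(caption, keywords=("enemy", "traitor", "invasion")):
    """Crude text features: does the caption contain each charged keyword?"""
    words = caption.lower().split()
    return [1.0 if k in words else 0.0 for k in keywords]

def fuse(caption, image_vec):
    """Early fusion: one joint feature vector from both modalities."""
    return text_features(caption) + list(image_vec)

def score(features, weights):
    """Linear score; positive means 'likely influence content' in this toy."""
    return sum(f * w for f, w in zip(features, weights))

# Toy weights: keyword hits and a hypothetical 'aggressive imagery' feature
# push the score up; a hypothetical 'humorous imagery' feature pulls it down.
WEIGHTS = [1.5, 1.5, 2.0, 1.0, -1.0]

vec = fuse("they call us the enemy", image_vec=[0.8, 0.1])  # image_vec assumed precomputed
print(score(vec, WEIGHTS) > 0)  # True
```

The point of the sketch is the fusion step: neither the caption alone nor the image alone carries the meme's full message, which is exactly what makes memes hard to classify automatically.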

In this case, too, Switzerland’s efforts in war prevention and peace work complement each other. Beneath the official surface of the Russian-American network battles, armies of partially automated trolls are at work. Merely tracking them down is a contribution to de-escalating the emotional side of conflicts.
