Moscow’s information war: “Russian secret services create networks of false identities”

NATO is alleged to have provoked the invasion of Ukraine, and Kiev’s army to have carried out the massacre in Bucha – with claims like these, Russia is waging its information war. In an interview, Lutz Güllner, head of the East StratCom Task Force, explains what tactics Moscow is using. The task force is part of the European External Action Service and uncovers disinformation on behalf of the EU. Güllner and his colleagues work their way through a dense network of secret services, media, diplomats and private actors such as the head of the Wagner mercenary group, Yevgeny Prigozhin.

ntv.de: The Kremlin recently claimed that Ukrainian troops are preparing an attack on the Zaporizhia nuclear power plant. Experts see this as a sign that Moscow is planning an attack itself. Russia often uses this method of projecting its own – planned or committed – atrocities onto the enemy. Why?

Lutz Güllner leads the East StratCom Task Force.

(Photo: Lutz Güllner/European Commission)

Lutz Güllner: Russia is not concerned with spreading its own point of view. It is about deceiving, distracting and creating new narratives. What one reads or hears becomes decoupled from reality. Disinformation is sometimes understood as fake news, but it is more complex: sometimes it is not about true or false, but about gaining sovereignty over the information space in order to set a narrative. Russian tactics include the mirror effect, in which you project the criticism you receive onto others, as with the Bucha massacre. There, the narrative was spread that the Ukrainian army had carried out the killings itself and pinned the blame on the Russian troops. But that does not mean the Russian side will automatically do everything it accuses the Ukrainians of doing.

A popular narrative in the Kremlin’s propaganda is Western blame for the war in Ukraine. How does Moscow manage to get through to people in this country with such claims?

Two elements come together here. On the one hand, the campaigns deliberately target groups that have a certain vulnerability or are receptive to certain narratives. This applies, for example, to people who are skeptical about NATO. That skepticism is not a problem in itself; it is covered by freedom of expression. The problem is that these people are specifically approached and fed a narrative: ‘NATO has surrounded Russia, it is the aggressor, and that is why Russia had to fight back.’ Such narratives are repeated like a mantra until they eventually solidify. Take the narrative of the so-called ‘Nazi regime’ in Kiev, which can hardly be surpassed in absurdity. It is repeated over and over again until everyone remembers it. That does not mean people believe it, but they become familiar with the narrative. This familiarity is the first step in disinformation campaigns.

How does Russia go about spreading propaganda?

We need to get away from the idea that disinformation is simply the Kremlin’s point of view and should therefore be classified as propaganda. For the actual problem we use the term information manipulation. This manipulation rests on four pillars. The first pillar is everything the Russian state apparatus does. This includes official representatives, with the diplomatic network also playing a relevant role. Second, the state-controlled media are important. The third pillar is what we call the disinformation ecosystem. The head of the Wagner mercenary group, Yevgeny Prigozhin, is now known to everyone. He has built a media empire, including the so-called Internet Research Agency, which was active in the 2016 US election campaign and probably still exists in a new form. The fourth pillar is intelligence operations. The Russian secret services are active on social media, where they create networks of false identities. These allow information to be generated very easily on the one hand, and disinformation campaigns to be rapidly amplified on the other.

Even before the coup attempt by his Wagner mercenaries at the end of June, Prigozhin repeatedly made international headlines. How does he spread disinformation?

He has created companies and structures that engage in information manipulation. His Internet Research Agency dates back further and has probably changed its name and location. That is not easy to verify, because it is of course not registered, as it carries out covert operations. Dozens of people were employed there whose sole job was to design and develop disinformation campaigns. This structure has been modified over time, but it is still active. The Wagner Group is not only a military organization; it has also made targeted investments in the information sector, for example by buying newspapers and radio stations. In other words, it has created its own channels and tools to produce and spread disinformation.

What role does artificial intelligence (AI) play in disinformation?

Artificial intelligence has played a role for some time. The bots that spread disinformation are practically the first form of this technical development. Now automation is getting another boost: production is becoming easier and faster. This is a worrying development. The question is when AI will be used to go a step further. In the future, deepfake videos may be generated so perfectly that you will no longer be able to tell them apart from the originals. My glimmer of hope is that the AI used to debunk disinformation will develop at a similar pace as the AI used to produce it.

On the East StratCom Task Force website, you and your colleagues expose Russian disinformation in 15 languages, and a database lists thousands of fact-checked examples. What other tasks does your team have?

We have three tasks. First, we want to expose the manipulations, with concrete sources and examples. Our database does not claim to be complete; it illustrates the tactics. Second, the EU also works on regulation, i.e. determining who is responsible for what happens on social networks. Third, there are instruments we can use in a targeted manner, such as sanctions. These were imposed on the Russian propaganda broadcasters RT and Sputnik because they are tools the Russian state uses to support its war of aggression. It is clear that our team alone will not solve the problem. We work closely with civil society actors to ensure that dealing with disinformation and information manipulation becomes an issue addressed by society as a whole.

How can individual users arm themselves against disinformation?

On an individual level, a critical approach to information sources is important. You should ask yourself: where does the information come from, and who has an interest in spreading it? Conversely, this does not mean you automatically have to distrust all media – think of the term ‘lying press’. Indeed, one of the goals of disinformation actors is to discredit the entire media landscape. Instead, you have to develop a healthy reflex of questioning information, in social and traditional media alike. In education, digital media literacy needs to be trained, not just among young people but also among older people. The East StratCom Task Force does educational work. We don’t want to tell people what to think. We want to warn them about manipulation by different actors.

Lea Verstl spoke to Lutz Güllner
