“Russia invests in dividing society”

Decoding the Kremlin’s endgame in a year of strategic global elections

Text:

Verena Beck, Head of PR & Communications, LOOPING ONE
Artwork: Franziska Stegemann, Creative, LOOPING ONE
28.03.2024

Sasha Havlicek, CEO of the Institute for Strategic Dialogue (ISD), explains the various layers and tactics of Russian state-run propaganda. She sheds light on the influence the Kremlin exerts over the West (and the upcoming elections in 2024) and on how we, as informed citizens, must look beyond what meets the eye so as not to be deceived by covert information operations.

In 2024, half of the world's population will cast their vote, making this year a global referendum on the resilience of our political systems. One crucial yet unsurprising event has already taken place: the election of the Russian president. It was the most repressive election in a country overshadowed by two years of brutal war against Ukraine and the death of a symbolic figure for all those hoping to see a better future: Alexei Navalny.


The Russian opposition leader made a hauntingly clear statement in the eponymous documentary Navalny: “You’re not allowed to give up. If they decide to kill me, it means that we are incredibly strong. The only thing necessary for the triumph of evil is for good people to do nothing. So don’t be inactive”.  
What can we, as democratically literate and responsible citizens, do to not be inactive?  


For one, we can recognize the decisive power of information. Although freedom of expression and freedom of information are inherent to liberal democracies, they have become tools of destabilization and even warfare. We spoke to Sasha Havlicek, CEO of the Institute for Strategic Dialogue (ISD), an expert on electoral interference and propaganda who leads research on digital information operations and advises governments on countering disinformation. Her assessment is clear and alarming at the same time: Russia is waging a shadow war against liberal democracies. A war that is being fueled with (dis)information.

Sasha, we have just passed the two-year mark of the war in Ukraine, as well as the daunting presidential elections in Russia. What is your assessment of the current situation? 
 
It’s a historic moment with these big, strategically vital elections coming up this year. We are very concerned because Russia’s long-range investment in international propaganda operations is coming to fruition in many parts of the world. It's not just overt propaganda, it's not just what you see from Russian state media. It's the deceptive invisible hand of that playbook of Russian propaganda that we worry about. Of course, it’s had a huge impact within the country for years, but in terms of global electoral outcomes, we worry about this largely outside of Russia.
 
Is Putin primarily trying to undermine support for Ukraine, or how can we understand the narrative behind this? 
 
While the focus has been undermining Western military support for Ukraine, there is a much bigger objective here. Information is an absolutely integral part of Putin's foreign policy and military strategy, and always has been. With every aggression, Putin’s ground game has been accompanied by a very comprehensive information operations strategy – from the invasion in Georgia in 2008, to the occupation of Crimea, to the perpetration of war crimes in Syria. The purpose of this is to undo the post-Cold War order and to destabilize liberal democracies. And ultimately, with that, to undermine the values that won out at the end of the Cold War, which were democracy over authoritarianism, freedom of speech over censorship. The war in Ukraine is part of a long line of transgressive behaviors that were never really challenged very seriously by Western countries, so they've gone further and further and further. Putin may have been taken off guard by the fulsome nature of the Western response on Ukraine this time around. 
 
What does Russia’s endgame look like – destabilizing the West and returning to Russian hegemony?
 
It is reclaiming Russia’s sphere of influence after the perceived humiliation in 1989 post-Cold War. It is undermining the institutions that uphold that liberal order. Of course, NATO has been a key target all along, but it's a broader objective: If Putin can weaken the countries that underpin this liberal order, if he can create enough mistrust in democratic institutions, that opens up the opportunity for authoritarian nationalist countries and leaders to have a legitimate place on the world stage. And increasingly they see themselves as an alliance against this liberal democratic norm-setting constituency. If they can create a more ‘friendly’ leadership in these countries, then they're in a much better place to not have the type of interference that they see. 
 
What role do information operations play in creating a more favorable leadership for Russian interests? 
 
Putin knows that investing in manipulating public opinion can change outcomes and policies in their favor. Some of the Kremlin’s leading propagandists have talked about how this has been successful in the US withdrawal from Vietnam, for instance. They know that manipulation of opinion is a way to change the course of policy. For instance, we have significant challenges to NATO at the moment, not least with the idea that Trump may come into power with this forthcoming election in the United States. If Putin can undermine the strength of the commitment to that multilateralist approach, he's in a much better position. That’s why it is so existentially important for him to invest in undermining liberal democratic political candidates in favor of authoritarian nationalist political parties.   

How does the Kremlin seek to influence opinions across the West – what channels are there beyond Russian state media such as Sputnik and Russia Today? 

There are three different layers of actors involved in Russian information operations. Think of it as a pyramid. At the very top, there is the overt propaganda by Russian state media and the official Kremlin-linked government accounts. They've been building up an audience in the West for many years, especially successful with those prone to engaging with conspiracism, Covid-sceptic movements and so on. Across the Middle East, the penetration of Russian state media is absolutely enormous. We've seen Russia Today Arabic outpace any of the regional competitors – Al Arabiya, Sky News Arabic, Al Jazeera – massively after the onset of the war. Similar things are happening in South America and parts of Asia.

These are the commonly known channels of influence. But there is more? 

The overt propaganda is just the tip of the iceberg. The second tier – the middle layer of the pyramid – is essentially content producers who are not explicit in their affiliation with the Kremlin, but who are incentivized in some way, financially, politically, structurally, to spread the same message. For instance, we've tracked networks of accounts that were pumping out pro-Kremlin Covid-vaccine disinformation, shifting toward disinformation on Ukraine when the war broke out. Some of that is troll farms, botnets of fake accounts and networks faking popularity, gaming the algorithmic systems of social media platforms so that they can achieve the widest possible reach. They are very effective in doing the Kremlin's bidding.

And then there's a third layer of pro-Kremlin voices that carry propaganda lines about Ukraine without having a seemingly obvious reason to do so. These are often Western networks of influencers. One of our studies revealed twelve Western influencers who were creating pro-Kremlin content about Ukraine in a whole range of languages, with YouTube videos reaching about 180 million people and their content on Twitter being boosted to around 41 million accounts.

What makes this second and third layer so dangerous? 

These covert and semi-covert operations are the ones to really worry about because they're not labeled. People don't necessarily believe them to be from the Kremlin, but from often credible voices, people they would trust. The challenge is that most of the policy response has dealt with the top tier only – for example, banning Russian state media distribution across Europe in the aftermath of the invasion. But as that penetration decreased, you see an equal increase in the impact of informal, broader networks. Disinformation is not limited to fake content; it also includes faking the means of distribution by faking popularity and reaching much larger audiences. With the onset of generative AI, this will become much more sophisticated and much harder to identify.

One daunting example of that is the Bucha massacre: False content claiming that the massacre hadn't happened got significantly higher engagement rates on Facebook across 20 countries than content containing the truth about those atrocities. So, in fact, the denial was shared three times more than the fact. The algorithms only hypercharge that problem.

The Kremlin knows very well how to manipulate this online environment, how to play the algorithms in its favor. Essentially, Russia invests in what will divide society. People tend to imagine that this is happening only on the very fringes, but there are significant audiences across the political spectrum. For instance, we did an analysis during the 2017 German federal elections that showed how Kremlin information operations and international alternative-right assets online reinforced each other in support of the AfD. The Kremlin does have a really strong foothold globally.

Is this a new phenomenon that has only emerged with social media, or has it already existed in more traditional discourses? 

Russian influence has been there for a long time, and it happens in many different ways, not just online. There are more traditional forms of trying to buy politicians. All of this is part of the playbook. It would be wrong to suggest that social media was the only vector for Russian influence, but it has become an extremely useful tool and one that I think has been perhaps more effective than anybody might have imagined. And this points to some of the real inherent problems of how social media works.  

What are those problems from your point of view? 

For one, big tech companies are often failing to enforce their own rules on disinformation and hate. You see pro-Russian propaganda outlets that were banned by Facebook in 2020 coming back onto these platforms over and over and over again. But worse still, the technological architecture, the products, the systems of these social media platforms inorganically amplify harmful, extreme and misleading content. You get some of the worst content targeted at some of the most vulnerable users. And that's worked extremely effectively for disinformation actors.  

Second, we see a closing down of access for researchers to social media data. If we don't have an ecosystem of scrutiny – organizations, research bodies and regulators finding these types of activities online, exposing them to the public, making sure that users understand the ways in which they're being manipulated and the provenance of the information that they're seeing – then we're really operating in a black box, which is what they take advantage of. I think this comes down in good part to the regulation of social media.

Do you think there is a way of regulating social media without limiting freedom of speech and taking away the positive opportunities of this direct discourse? 

Yes, I do. And I think that this is the point. I fundamentally believe in free speech. What we now have on social media is not free speech, it's curated speech. These products and systems of the algorithms are essentially deciding what you see. That is what needs to be addressed to enable a real free speech environment. People need to see much more of the counterpoint, opposing views, so that certain views don't dominate just because they are being manipulated, and enormous resources are being put into faking the popularity of certain ideas. 

People’s consumption of media says a lot. We did a nationally representative poll in Germany. It showed that people who largely trust information from social media over traditional media are much more likely to believe disinformation related to the war in Ukraine. For instance, they're more likely to believe that Zelensky is a drug addict, that Western and Ukrainian officials are involved in illegal trafficking of Ukrainian children in the West, and that the Wagner insurrection was a Western-supported plan. For example, 47% of those who reported having the most trust in social media over mainstream media believe that Zelensky is addicted to drugs, compared to 4% of those who said they were unlikely to trust social media over mainstream media. You have a segregated information ecosystem in which people who primarily trust and imbibe their news from social media are going to believe in a different reality from those who might still be engaging with mainstream media.

Yes, the portrayal of dissidents in Russian narratives is another topic entirely. What tactics does the Kremlin use to undermine them?  

There is a constant slew of disinformation looking to undermine Zelensky's credibility. From outright lies about him using falsified information through to the use of AI to do some of that bidding. But again, this tactic of undermining the credibility of political leaders is something that we've seen a lot in relation to candidates that are considered to be tough on Russia. We saw an enormous campaign against Annalena Baerbock, for instance. We often see gendered disinformation, specifically targeting female candidates in ways that are extremely sexualized and really quite dreadful. And that has been the case, of course, with Yulia Navalnaya since her husband’s death. We've seen concerted efforts to undermine her credibility, and to do that in a highly sexualized manner.  

Speaking of Navalny, what is your assessment on the Russian opposition? Is there even a slight chance of someone cutting through all the propaganda?  

It's very difficult. Putin has an extraordinary grip on the information ecosystem. And because of the repression that we now see in relation to opposition within Russia … again, we have to remind ourselves of the number of people that have been killed. The number of people that have been poisoned. That full domestic repression of any political opposition, which is draconian, has an enormous chilling effect. If you are now just an ordinary Russian citizen and decide to retweet something that comes from a so-called extremist organization, which is essentially any organization these days fighting for democracy in Russia or elsewhere, you risk imprisonment. This absolute hold over the information space is devastating. I think you have to be extremely brave to go up against that today. That’s the problem with authoritarian regimes. Once they really get a grip on power, it's extremely hard to challenge them.  

What do you recommend to our informed and media-savvy readers to avoid ending up in a filter bubble themselves? 

  1. Advocate for the proper enforcement of social media regulation and for real transparency: we as users and citizens should know how these technology giants shape our information space.
     
  2. Advocate for a very clear understanding of the provenance of information when it is in any way linked to state actors. That labelling is really challenging because they do a good job of working through non-formal channels to get information disseminated. 
     
  3. It’s a platitude, but: make sure you are looking at different sources of freely available information. Be cautious about their origin and know that the platforms are using your data to target you with information. Better digital literacy is critical, not just for children but also for adults. We see adults being sucked into conspiracies and negative information vectors because they don't really understand the business model of social media platforms and how it's affecting them. Lastly: Know that you don't know it all. Know that technology is evolving very fast, and what you see may not be what you think it is.

About Sasha Havlicek

As founding CEO of the Institute for Strategic Dialogue (ISD), a leading global counter-extremism ‘think and do’ tank, Sasha Havlicek has spearheaded its pioneering research and data analysis, digital education, policy advisory, training, tech and communications programmes. With a background in conflict resolution and expertise in extremism, digital information operations and electoral interference, she has advised a range of governments at the highest levels and has led partnerships with the UN, the EU Commission and the Global Counter-Terrorism Forum.


Our newsletter P!NG collects insights from thought leaders. For thought leaders.