Is this a new phenomenon that has only emerged with social media, or has it already existed in more traditional discourses?
Russian influence has existed for a long time, and it operates in many different ways, not just online. There are more traditional methods, such as attempts to buy politicians. All of this is part of the playbook. It would be wrong to suggest that social media was the only vector for Russian influence, but it has become an extremely useful tool, and one that I think has been perhaps more effective than anybody might have imagined. And this points to some of the real inherent problems of how social media works.
What are those problems from your point of view?
For one, big tech companies are often failing to enforce their own rules on disinformation and hate. You see pro-Russian propaganda outlets that were banned by Facebook in 2020 coming back onto these platforms over and over and over again. But worse still, the technological architecture, the products, the systems of these social media platforms inorganically amplify harmful, extreme and misleading content. You get some of the worst content targeted at some of the most vulnerable users. And that's worked extremely effectively for disinformation actors.
Second, we see a closing down of access to social media data for researchers. If we don't have an ecosystem of scrutiny, with organizations, research bodies and regulators finding these types of activities online, exposing them to the public, and making sure that users understand the ways in which they're being manipulated and the provenance of the information they're seeing, then we're really operating in a black box, which is exactly what disinformation actors take advantage of. I think this comes down in good part to regulation of social media.
Do you think there is a way of regulating social media without limiting freedom of speech and taking away the positive opportunities of this direct discourse?
Yes, I do. And I think that this is the point. I fundamentally believe in free speech. What we now have on social media is not free speech, it's curated speech. These platforms' products, systems and algorithms are essentially deciding what you see. That is what needs to be addressed to enable a real free speech environment. People need to see much more of the counterpoint, of opposing views, so that certain views don't dominate simply because they are being manipulated and enormous resources are being put into faking the popularity of certain ideas.
People’s consumption of media says a lot. We did a nationally representative poll in Germany. It showed that people who largely trust information from social media over traditional media are much more likely to believe disinformation related to the war in Ukraine. For instance, they're more likely to believe that Zelensky is a drug addict, that Western and Ukrainian officials are involved in illegal trafficking of Ukrainian children in the West, and that the Wagner insurrection was a Western-supported plan. For example, 47% of those who reported having the most trust in social media over mainstream media believe that Zelensky is addicted to drugs, compared to 4% of those who said they were unlikely to trust social media over mainstream media. You have a segregated information ecosystem in which people who primarily trust and get their news from social media are going to believe in a different reality from those who might still be engaging with mainstream media.