Digital Threats On 2020 Elections

MICHEL MARTIN, HOST:

In 2016, many Americans became aware of how easily distortions and outright lies could be spread online to influence a national election. Since then, many countries have had experiences with this. And critics worry that not enough is being done by online platforms like Facebook and Twitter to make sure this doesn't happen again in 2020, a year when more than 60 countries will hold elections, the first of which took place in Taiwan today. And we want to note here that Facebook is among NPR's recent financial supporters. Siva Vaidhyanathan has been writing about these concerns. He is a professor of media studies at the University of Virginia. We spoke earlier about why he thinks digital democracy will face its greatest test in 2020.

SIVA VAIDHYANATHAN: Well, in a lot of countries around the world, the media ecosystem has either never fully flowered into the sort of, you know, multi-voiced system that we have in the United States or in Western Europe, where there's lots of independent radio and independent television, lots of independent newspapers and magazines. Many of these countries have - were under state control for a long time or remain under state control. And Facebook is essentially the Internet to a lot of these people in a lot of these countries. Or their entire media systems have collapsed because Facebook took all the money or attention.

So I'm thinking about Myanmar and Sri Lanka and Pakistan and the Philippines, Indonesia, Cambodia, Kenya, Nigeria - I mean, huge, huge countries with massive populations, all of which have fairly flimsy democracies, if they have democracy at all, and are all encountering all sorts of social turmoil, economic pressures, et cetera. They are the perfect sites for the sort of nasty, divisive propaganda that can flow so easily through Facebook but also through WhatsApp, also on Twitter, also through YouTube.

MARTIN: So a couple of weeks ago, Facebook set up a war room in Taiwan to counteract disinformation leading up to the election. They did the same in California during the U.S. 2018 midterm election. So, first of all, what happens in these war rooms, and do they make a difference?

VAIDHYANATHAN: You know, no one's allowed into the war room to check on what they do. You know, I have to assume that they put a whole lot of security experts in there. And they try to trace what they call inauthentic content - in other words, content that has its origins in some enemy country or hostile country. So for Taiwan, if they notice that Facebook or WhatsApp are being flooded with videos and other forms of propaganda and the origin seems to be the People's Republic of China or some client state of the People's Republic of China, they might get suspicious and then watch that content and then start blocking it so it doesn't affect the way people think about politics in Taiwan too deeply, right?

But it's - we have no way of knowing how effective these efforts are. Facebook has never been good about allowing researchers, social scientists or journalists to examine the actual data inside the company that might show us something, that might give us a sense of whether it's even possible to do anything about this sort of flood of propaganda from other countries or even domestically.

MARTIN: Do you have a sense of, you know, based on your research and that of others, whether there is some throughline to these groups that are engaging in these disinformation campaigns around the world? Like, what's their end goal? Do we - is there, say, a single source or a few sources - is there anything - you know, what do we know?

VAIDHYANATHAN: There doesn't seem to be a single source, but there seems to be thematic coherence. In other words, if there is an extreme authoritarian political force in the Philippines - and there is - and there's an extreme authoritarian political force in Ukraine - let's say trying to be imposed from across the border in Russia - those forces are going to learn from each other. It's very easy to mimic the strategy of another one. So what it means is if you're of that ilk, if you want to disrupt democracy and undermine any form of governance that might support the rule of law and limit corruption, et cetera, you are going to try to flood the political sphere with nonsense, with stuff that will divide society, stuff that will turn people against each other, especially against minorities or against immigrants.

So we've seen this time and time again, since as early as 2011, in Estonia, in Ukraine, in Poland, in Hungary, in the Philippines. And in the Philippines, it really worked well in 2016. So Rodrigo Duterte won in 2016 running a campaign almost entirely on Facebook - with the assistance, by the way, of Facebook staff. So that's like the high point of shenanigans - anti-democratic, pro-authoritarian shenanigans working through Facebook.

MARTIN: What about in Taiwan? As we mentioned, the first election of 2020 was held in Taiwan today. Did we see evidence of these kinds of things?

VAIDHYANATHAN: I wasn't too worried about Taiwan being susceptible to propaganda. But I definitely was worried about the fact that China would do its best. So we know that China has been doing all it can to disrupt the protests in Hong Kong with propaganda. But my sense is that the people of Taiwan are so aware of the issues and so on guard about propaganda that whatever efforts were put forth probably had minimal influence. Nonetheless, we have to be worried about the internal politics of Taiwan over the long term because certain forms of dangerous nationalism can rise within Taiwan very easily, given the fact that almost everybody in Taiwan relies on Facebook.

MARTIN: So moving forward in 2020, from where we sit right now - obviously, it's an election year in the United States. And I assume most Americans are very interested in and concerned about that. Are there some things that people should be looking out for?

VAIDHYANATHAN: Yeah. We're likely to see in 2020 much of what we saw in 2016, which means the development of Facebook groups that are large are going to be outside of the arm - the regulatory arm of Facebook itself. Facebook groups devoted to dividing Americans, you know, fostering, you know, Texas independence movements or anti-Semitic movements or any of those efforts that, like, hit viscerally at our identity as Americans. And those are likely to be supported by Russian influence or maybe even domestic extreme influence.

And look. The goal is not necessarily to reelect Trump. It wasn't really the goal to elect Trump in 2016. The goal was to mess with us, so that no matter who becomes president, the United States is harder to govern, and that over the long run, democracy becomes harder to sustain.

MARTIN: That was Siva Vaidhyanathan, media studies professor at the University of Virginia. He was kind enough to join us from our studios in New York City. Professor Vaidhyanathan, thank you so much for talking with us.

VAIDHYANATHAN: Oh, it's my pleasure. Thanks.

Transcript provided by NPR, Copyright NPR.