Nina Jankowicz '11 Publishes Book on the Information War
Her new book takes on Russia, fake news, and the future of conflict.
In the following Q&A, she talks about how disinformation works, why we're vulnerable, and what to do about it.
What They're Doing
My book describes Russian influence operations in Central and Eastern Europe, putting the revelations about Russian interference here in the U.S. into context for American readers. What our media coverage often misses about Russian influence operations is that they go far, far beyond botnets and troll farms, the types of "fakes" we became familiar with in the 2016 election. In the examples the book describes in Estonia and the Republic of Georgia, for instance, Russia not only uses disinformation (the use of false or misleading information with malign intent) to influence political outcomes; it also launches cyberattacks, foments protest, and infiltrates cultural and media organizations.
Though the book focuses specifically on Russia, another important thread running through all of its case studies is that disinformation is harmful to democracy, whether it comes from within or outside the house. Countries that fight foreign disinformation while tacitly endorsing its tactics for their own domestic political use don't get very far in dealing with either, undermining the free flow of information and ultimately the democratic process itself.
Why We're Vulnerable
This touches on another misconception we in the United States have because of the widespread use of the term "fake news." Disinformation isn't about creating fake journalism or manipulating photos; the most successful disinformation runs on emotion and exploits societal fissures like endemic racism, economic inequality, and political gridlock and polarization. Unfortunately, the U.S. has plenty of fertile ground for disinformers to exploit.
We're also vulnerable because of the changes to our information environment and how slow society has been to adapt to them. Social media platforms serve as a primary information source for millions of people, and these platforms' business models prioritize and incentivize engagement with emotional content, which performs better, keeps people on the platform looking at ads, and makes tech companies money. Without changing these business models or introducing some regulatory checks, it's up to individuals themselves to suss out what information can be trusted, and many Americans don't have the media literacy skills or the tools to do that. They are used to mainstream media serving as a gatekeeper for their information, and they treat the content they encounter on social media with the same trust.
This is essentially how all disinformation operations work, Russian or otherwise. In the U.S., in 2016, Russia used racial tension and targeted Black Lives Matter supporters, even creating a Facebook page that had more followers than the official BLM account. They latched onto similar narratives when protests about the murder of George Floyd erupted this year. Ultimately, the goal is to pit Americans against one another and encourage dysfunction in our democratic process, but the model applies in other countries as well: Russia used ethnic tensions between Estonians and ethnic Russians to provoke unrest surrounding the removal of a Soviet statue in Tallinn in 2007.
How to Fix It
There are obvious solutions, like holding the perpetrators of foreign interference campaigns to account; under the Trump administration, the White House itself has undercut these efforts. While the U.S. has imposed sanctions on Russia for its malign activity, the President's praise of Putin and his assertion that Russian interference is a "hoax" have created an incongruence in our Russia policy. We are not sending a consistent message, making Moscow even less likely to heed it. We need recognition from the very top of government for these efforts to hold water. But we also need to stop securitizing this problem; it isn't only a national security issue for the Departments of Defense and State to deal with. We also need to think about how to build America's societal resilience: investing in media literacy and civics courses for children and voting-age adults alike, making sure Americans have access to trustworthy information by beefing up the budgets of our public broadcasters, and looking inward to lessen the vulnerabilities that bad actors like Russia exploit.
On an individual level, people should really be cautious when sharing information online, asking themselves if it is coming from a reputable source (does it have a masthead? contact information? has the author written anything else?) and why that source or user might be targeting them. If they feel themselves getting highly emotional, these questions are especially important, and it might be time to step away from the keyboard and practice what I call "informational distancing": putting physical distance between yourself and emotional content that you might be tempted to share, until you can do your due diligence. Disinformation often gets its legs not through paid advertising, but because it is shared by normal users.
November 3
I'm really disturbed that we are seeing disinformation shared by American politicians and officials. This sort of behavior should be anathema to everyone. Disinformation knows no party; its ultimate victim is democracy. No matter the outcome of the election, we will be dealing with the consequences of this behavior for years to come, because disinformation undermines trust in institutions, a basic level of which we need for our government and democratic process to function. In particular, I'm worried about trust in the results on and after election night. We should all do our best to get our information from official sources (state and local election boards) rather than politicians and pundits.
The Human Face of Fake News
Out of graduate school, I worked for the National Democratic Institute, an organization that provides training and support to democratic activists around the world. I worked on programs in Russia and Eurasia. We were often the victims of Russian propaganda, which sought to paint us as "CIA-sponsored instigators of color revolution" (we weren't). I've always been interested in the effects of social media on society, so when Ukraine's Euromaidan revolution happened, I felt a strong pull to go there and work on issues related to disinformation. As a Fulbright Public Policy Fellow, I advised the Ukrainian Foreign Ministry on strategic communications issues. I watched from Kyiv as the U.S. election unfolded and America woke up to the threat of information warfare. That's where the idea for the book was born.
While some people who study disinformation focus on network analysis and the technical side of things, I try to bring a human face to these highly technical stories and make them more accessible to curious, non-expert audiences. My research style is more sociological in that way; it is largely interview-based, though I work with primary source documents as well.
From Bryn Mawr with Love
Bryn Mawr was undoubtedly one of the biggest influences on my career! As a freshman, I didn't intend to double major in Russian and political science, but the Russian department was too interesting and inspiring not to! I loved my classes with Sharon Bain, Tim Harte, and Dan Davidson (his language policy class gave me my first exposure to ethnic issues in Estonia!), and my time in the Russki Dom was responsible for some of my favorite Bryn Mawr memories. Also, without my Russian language skills and knowledge of Russian culture, I would not be able to do half the work I do today.
The political science department was also great; I still think back to course readings from Marissa Martino Golden's classes and those of my thesis advisor, Carol Hager. Professor Hager's class on social movements, in particular, influenced my graduate research and how I think about the several uprisings that have happened in the Eurasia region since I left Bryn Mawr. I often say that going to Bryn Mawr was the most important decision of my adult life, and that's not disinformation.
A Disinformation Fellow at the Wilson Center, Nina Jankowicz '11 studies the intersection of democracy and technology in Central and Eastern Europe. She is the author of How to Lose the Information War (Bloomsbury/I.B. Tauris).
Published on: 12/03/2020