In the summer of 2017, Nina Jankowicz, a twenty-eight-year-old American, was working in Kyiv as a communications adviser to Ukraine’s foreign ministry as part of a yearlong Fulbright fellowship. Jankowicz had an interest in digital diplomacy and in countering disinformation that was matched by a passion for musical theatre: in Washington, D.C., where she lived for several years before moving to Ukraine, she played Sally in “You’re a Good Man, Charlie Brown” and Audrey in “Little Shop of Horrors.”
So when she came across a Facebook page for a White House protest that called on “resistance activists, show-tune lovers, and karaoke fans,” her curiosity was piqued. She later spoke with Ryan Clayton, a progressive organizer involved in the protest. On July 4th, a man dressed in a waistcoat and a tricornered hat kicked things off. “Hear ye, hear ye, citizens,” he said, ringing a bell. “Resist the rule of the treasonous King Donald!” Protesters waving American flags performed musical numbers calling for Trump’s impeachment, including “Do You Hear the People Sing?,” the anthem from “Les Misérables.”
Clayton told Jankowicz that he was impressed with the turnout. He suspected that it had something to do with a last-minute Facebook message from a user named Helen Christopherson, who offered to pitch in cash to buy ads in exchange for administrator access to the event page. “I got like $80 on my ad account so we can reach like 10000 people in DC or so,” the message read. “That would be Massive!” In fact, Christopherson’s ad spend reached as many as fifty-eight thousand people in the D.C. area.
It wasn’t until October of the following year that Jankowicz began to consider how the success of the protest might fit into a broader pattern. As part of congressional inquiries into Russian interference in the 2016 Presidential election, Democrats on the House Intelligence Committee made public a number of ad purchases by the Internet Research Agency, the so-called “troll factory” in St. Petersburg. The I.R.A. was staffed by hundreds of young Russians who carried out social-media campaigns under false identities. “Helen Christopherson” was a Facebook alias used by one of them. In “How to Lose the Information War,” a persuasive new book on disinformation as a geopolitical strategy, Jankowicz writes, “In an entirely unexpected collision of my two great loves, it seemed that Russia had weaponized show tunes.”
The I.R.A. was financed by Yevgeny Prigozhin, a businessman who has prospered by carrying out unsavory tasks that the Kremlin wants done but prefers not to do itself, like hiring Internet trolls or deploying mercenary soldiers. (In the early two-thousands, his catering company hosted official dinners, earning him the nickname Putin’s Chef.) According to the Mueller report, released in April, 2019, I.R.A.-created groups and accounts “reached tens of millions of U.S. persons.” Belting out show tunes in front of the White House was perhaps more comedic than subversive, but it’s a telling example of the I.R.A.’s modus operandi: the troll factory found “authentic, local voices,” as Jankowicz puts it, to further the Russian state’s “goal of fomenting large-scale distrust in government and democracy.”
Since the 2016 election, the spectre of Russia’s online meddling has been amplified by our own anxiety. In “The Folly and the Glory,” Tim Weiner, the author of histories of the C.I.A. and the F.B.I., argues that Russia “deployed the power of social media to transform the politics of the United States.” By way of illustration, Weiner discusses a conspiracy theory, propagated by the I.R.A. in 2015, that U.S. military exercises in Texas that year were part of an Obama Administration plot to confiscate guns in the state. As the meme circulated, the governor of Texas spoke ominously of the exercises; so did Senator Ted Cruz. “The IRA had gotten into the heads of some powerful politicians—and millions of voters,” Weiner writes. He warns that the success of Russia’s stealth and subversion “may determine if America will endure.”
The challenge in making sense of disinformation operations is disentangling intent from impact. Prigozhin’s trolls may have aspired to distort American politics and upend American society, but to what extent did they succeed? The 2016 theft of Democratic National Committee e-mails by Russian military-intelligence hackers, and their subsequent dissemination via WikiLeaks, seems to have had an effect on the electorate, even if that effect is hard to measure. What I.R.A. trolls managed to achieve, however, was more diffuse, and considerably less significant. In 2016, they inflamed hot spots of American discourse, then slipped away once the fire caught; their priority appeared to be scoring points with bosses and paymasters in Russia as much as swaying actual votes in the United States. Russian disinformation—and the cynical, distorted world view it entrains—is a problem, but the nature of the problem may not be quite what we imagine.
Jankowicz describes the manic hunt for inauthentic online activity as a game of whack-a-troll. Although taking down fake accounts and fact-checking their content is basic online hygiene, the effect can be limited. A 2017 Yale study found that labelling Facebook content “disputed” increased the share of users who judged it to be false by less than four per cent. And, in focussing on the tactics of the aggressors, we may be overlooking our weaknesses as victims. “Unless we mitigate our own political polarization, our own internal issues, we will continue to be an easy target for any malign actor,” Jankowicz writes. When the American public is full of fear, hate, distrust, and exhaustion, it’s not hard for some trolls—whether in St. Petersburg or in the White House—to stir up those emotions into something even more poisonous.
What if, to borrow an old horror-movie trope, the call is coming from inside the house? Not long ago, I spoke with Aric Toler, a researcher at Bellingcat, an investigative outlet that tracks Russian intelligence operations. Bellingcat identified the Russian military unit that provided the anti-aircraft missile launcher that downed Malaysia Airlines Flight 17 over Ukraine, in 2014, and uncovered the identities of the Russian operatives who poisoned Sergei Skripal, a former Russian spy, in 2018. Toler is worried that Americans’ sense of danger has been misdirected. In April, in a Bellingcat column titled “How (Not) to Report on Russian Disinformation,” Toler took issue with a piece in the Times that had compiled a number of examples to show how “Putin has spread misinformation on issues of personal health for more than a decade.” The article devoted several paragraphs to an obscure Web site called the Russophile, which, Toler pointed out, has virtually no audience.
“It’s an issue of scale,” he told me. Russian-produced disinformation certainly exists; this spring, at the outset of the COVID-19 pandemic, Russia-linked social-media accounts promoted a theory that the virus was a bioweapon invented by the U.S. Army in order to damage China. But, compared with, say, Fox News pundits like Tucker Carlson and Sean Hannity, let alone Trump himself, Russian trolls command a tiny audience; their perceived menace far outweighs their actual reach. How audible, let alone consequential, are Russian efforts to boost claims that mail-in voting leads to fraud when the President regularly blares the thesis at deafening volumes?
“The effect of one Trump press conference or tweet in shaping opinions, even behaviors, can be monumental,” Toler said. In April, after Trump suggested that disinfectant could be injected into the body to treat COVID-19, health officials in several states reported spikes in calls to poison-control hotlines. A single such center in North Texas reported receiving nearly fifty calls about bleach ingestion in the first three weeks of August alone. “The most a few thousand Russian-directed bot accounts might achieve,” Toler added, “is to get a Twitter hashtag trending for a few hours.”