How bland positive messages help Russian trolls spread disinformation
- When we read examples of fake news headlines from the 2016 election, they seem blatantly false.
- However, the data shows that Russian trolls mostly shared posts meant to camouflage their activity, with only a small percentage of posts sharing fake headlines.
- As the 2020 elections approach, researchers are discovering that Russian trolls are becoming more sophisticated and savvy in how they spread disinformation.
All of us like to think that we would know when people are manipulating us online. Headlines such as “Pope Francis shocks world, endorses Donald Trump for president” or “ISIS leader calls for American Muslim voters to support Hillary Clinton” seem like obvious fabrications (they are).
Especially now, after the 2016 election’s revelations have come to light, we all like to think that we’re smarter and more prepared to question blatantly fake news. But researchers suggest that we might not actually have a great handle on what Russian trolling looks like in the lead-up to the 2020 election.
Image source: Wikimedia Commons
The Internet Research Agency’s new approach
The Internet Research Agency (IRA) formed in 2013, employing thousands of people whose sole job was to write fake blog articles, craft polarizing comments, and share disinformation through social media accounts. We might assume that the IRA primarily issues the kind of blatantly fake headlines described above, and that only the very ignorant could fall for its propaganda. In reality, the IRA’s employees function more like savvy marketers than Orwellian propagandists, though there’s admittedly less of a difference in this case than we might like.
In a recent article for Rolling Stone, researchers Darren Linvill and Patrick Warren described the ongoing disinformation campaign being waged by the IRA. As researchers of state-sponsored disinformation and its influence, Linvill and Warren have been keeping their fingers on the pulse of the post-2016 internet trolls. They write:
“Professional trolls are good at their job. They have studied us. They understand how to harness our biases (and hashtags) for their own purposes. They know what pressure points to push and how best to drive us to distrust our neighbors. The professionals know you catch more flies with honey. They don’t go to social media looking for a fight; they go looking for new best friends. And they have found them.”
Linvill and Warren offer examples of modern troll posts. Typically, the trolls aren’t stirring up conflict between Black Lives Matter and Blue Lives Matter supporters or claiming that Hillary Clinton ran a pedophilia ring in the basement of a pizza parlor. Instead, they’re posting tweets like one from a fictional “Tyra Jackson” celebrating former football player Warrick Dunn’s charity work, a tweet that garnered nearly 290,000 likes.
In previous research, Linvill and Warren broke down the kinds of posts that Russian trolls made into separate categories. By far the most common were what they termed “camouflaging” posts: posts with no overt political connection, which accounted for more than half of the troll accounts’ activity. These included local news stories, uplifting posts like the one from Tyra Jackson, blandly positive messages such as “Start each day with a grateful heart #GoodMorning #happywednesday,” and similarly disarming, everyday subjects.
These camouflaging posts made the accounts all the more convincing when they did sow disinformation. Linvill and Warren found that, at least in the lead-up to the 2016 election, troll accounts were most likely to support the right and attack the left. This finding makes intuitive sense; conservative forces did win the 2016 election, after all, so it seems reasonable that Russian trolls were going to bat for the right. Importantly, however, a significant portion of Russian troll activity landed on the left side of the political spectrum. In Linvill and Warren’s research, they found that 12 percent and 7 percent of Russian troll posts attacked the left and supported the right, respectively, but 5.4 percent and 7.4 percent attacked the right and supported the left.
It can be tempting to conclude that Russia’s goal is to support conservative politicians given the results of the 2016 election, but the data suggests that its true goal is merely to widen our political divide in a party-agnostic way. Consider a tweet from the fictional Twitter account @politeMelanie that Warren and Linvill uncovered:
“My cousin is studying sociology in university. Last week she and her classmates polled over 1,000 conservative Christians. ‘What would you do if you discovered that your child was a homo sapiens?’ 55% said they would disown them and force them to leave their home.”
This is a completely made-up anecdote, but a left-leaning individual who came across it might accept it as true, since it reinforces stereotypes about the intolerance and ignorance of certain religious and political groups. We might like to think that we wouldn’t fall for this, but the 300,000 people who liked the tweet probably took it at face value.
The idea that divisiveness itself is the goal of Russian trolls is best seen in which politicians they’ve attacked or defended in the past. For example, once the Republican primaries started in 2016, the IRA instructed its employees to “use any opportunity to criticize Hillary and the rest (except Sanders and Trump — we support them).”
We might chalk this strategic decision up to Putin’s well-known personal animosity toward Hillary Clinton, but a similar dynamic is emerging in the current Democratic primaries. Middle-of-the-road candidates such as Joe Biden are frequently targeted by Russian trolls, while the most polarizing figures, such as Donald Trump and Bernie Sanders, enjoy greater support. The goal is to amplify the differences between opposite ends of the political spectrum until staying true to one’s party makes more sense than staying true to one’s country.
The IRA knows that this goal can’t be accomplished through a direct, brutish disinformation campaign; instead, it takes a subtle touch that they are consistently perfecting.