Spreading fake news becomes standard practice for governments across the world

July 17, 2017 at 12:00 p.m. EDT
(Illustration by Marta Sevilla for The Washington Post)

Campaigns to manipulate public opinion through false or misleading social media postings have become standard political practice across much of the world, with information ministries, specialized military units and political operatives shaping the flow of information in dozens of countries, a British research group reported Monday.

These propaganda efforts exploit every social media platform — Facebook, Twitter, Instagram and beyond — and rely on human users and computerized “bots” that can dramatically amplify the power of disinformation campaigns by automating the process of preparing and delivering posts. Bots interact with human users and also with other bots.

Though most social media platforms are designed and run by corporations based in the United States, they are infiltrated almost immediately upon their release to the public by a range of international actors skilled at using information to advance political agendas, within their own countries and beyond, said the researchers from Oxford University’s Computational Propaganda Research Project.

“The government propaganda evolved with social media and has grown along with it,” said Philip N. Howard, an Oxford professor and co-author of the report, called “Troops, Trolls and Troublemakers: A Global Inventory of Organized Social Media Manipulation.”

The report draws on news accounts of social media propaganda in 29 countries to reach broader conclusions about the global growth of various techniques, including issuing false news reports, attacking journalists or countering critical social media posts with messages supporting a government position or political view.

These efforts are often, though not always, clandestine, with the origin of the social media posts obscured through phony account information. Bots often play key roles by automatically creating social media posts, responding to other users and echoing select themes in ways that are difficult to distinguish from the activity of ordinary human users. Bots can post far more often than human users, in some cases more than 1,000 times a day; human users dubbed “cyborgs” rely on similar automation technology to bolster the power of their own accounts.
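The mechanics behind that volume are not complicated. As a rough, hypothetical sketch in Python (not drawn from the report), a single script need only cycle through canned talking points on a timer to exceed 1,000 posts a day. Every name and message below is invented, and the actual platform API call is replaced with a print statement:

    import itertools
    import random
    import time

    # Hypothetical talking points an operator wants amplified.
    TALKING_POINTS = [
        "The opposition is lying about the economy.",
        "Foreign journalists cannot be trusted.",
        "Share if you support the new security law.",
    ]

    def compose_post(theme: str) -> str:
        # Light randomization keeps repeated posts from being byte-identical,
        # a common trick against naive duplicate filters.
        tags = random.sample(["#truth", "#wakeup", "#patriots"], k=2)
        return f"{theme} {' '.join(tags)}"

    def run_bot(posts_per_day: int = 1200) -> None:
        interval = 86400 / posts_per_day  # about 72 seconds between posts
        for theme in itertools.cycle(TALKING_POINTS):
            print(compose_post(theme))  # a real bot would call a platform API here
            time.sleep(interval)

    if __name__ == "__main__":
        run_bot()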

Twitter and Facebook, which owns Instagram, declined to comment on the report. Neither company was singled out in the report, though Twitter and Facebook have become popular targets for social media manipulation because of their global reach.

Howard said he and the report’s other lead researcher, Oxford’s Samantha Bradshaw, were struck by how much of the propaganda activity and innovation happened in Western-style democracies, including Britain, the United States, Israel, Australia and Mexico.

The report, citing a previously published news account, said that Israel operated 350 social media accounts on multiple platforms, in English, Hebrew and Arabic. A British propaganda campaign posted fake videos on YouTube in an attempt to prevent Muslims from becoming radicalized and joining the war in Syria, the report said. And political forces in Mexico used bots and human users to attack journalists and spread disinformation over social media.

In some cases, these efforts involved full-blown government bureaucracies, with permanent staffs and fixed payrolls. In others, bands of online activists or ad hoc groups of paid workers came together for a single campaign before disbanding. Some efforts were also outsourced to private vendors that specialize in influencing opinion through social media.

Though Russia leads the world in the sophistication of its online propaganda, Howard said, efforts to support Republican Donald Trump in the 2016 presidential campaign broke new ground in using social media to shape political opinion. Howard’s group and others have previously reported that Twitter bots supporting Trump were far more vocal and organized than bots supporting Democrat Hillary Clinton, particularly in the closing days of the election.

“It’s the presidential election cycles that put tens of millions of dollars into these innovations,” Howard said. “The big-money innovations happen in the United States and then get adopted everywhere.”

Other researchers have documented the power of social media to bolster Trump’s surprise electoral success and shown that some of those social media resources are spreading to other nations.

The spread of unflattering documents about French presidential candidate Emmanuel Macron — now debunked as phony — got key support in the final days of the May election from Twitter bots that also had supported Trump in the U.S., according to Emilio Ferrara, a researcher at the University of Southern California.

He analyzed 17 million tweets, finding that bots based outside of France focused on different issues than human Twitter users in France. His latest report, published this month, suggested the possibility of “a black-market for reusable political disinformation bots.”
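Ferrara’s analysis relied on far richer signals, but the core intuition behind such bot detection can be sketched crudely: flag accounts whose posting rate or content repetition is implausible for a human. The weights and thresholds below are illustrative assumptions, not his methodology; production classifiers such as Botometer combine hundreds of account features.

    from collections import Counter

    def bot_likelihood(timestamps: list[float], texts: list[str]) -> float:
        """Toy bot score in [0, 1] from posting rate and content repetition."""
        if not timestamps or not texts:
            return 0.0
        # Posting rate, scaled so roughly 1,000 posts a day saturates the score.
        span_days = max((max(timestamps) - min(timestamps)) / 86400, 1 / 24)
        rate_score = min(len(timestamps) / span_days / 1000, 1.0)
        # Repetition: share of posts that exactly duplicate the most common text.
        repeat_score = Counter(texts).most_common(1)[0][1] / len(texts)
        return 0.5 * rate_score + 0.5 * repeat_score

Under this toy scoring, an account posting a handful of varied messages scores near zero, while one pushing the same line 1,000 times a day scores near one.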

The use of these techniques is growing rapidly as bots and other tools for manipulating opinion on social media become cheaper and easier to use, and as evidence of their effectiveness mounts. Many companies now sell bot accounts by the thousands and, for a fee, will manage them for customers, Ferrara said.

He and other researchers said that the social media platforms do not do enough to combat the spread of bots and the resulting propaganda. The impact goes beyond electoral politics to hot-button issues such as climate change and the safety of vaccines.

“The vast majority of people … they would be surprised at the extent to which these platforms are used for political manipulation,” Ferrara said. “Especially with nobody doing anything about it.”

Howard’s group also has detected bots that supported Trump working on other issues globally, often in concert with bots supporting alt-right causes and Russian propaganda campaigns.

“They generate so much content — and they share each other’s content — that it’s hard to disaggregate the networks,” Howard said.