Moral rhetoric on Twitter may signal whether a protest will turn violent, according to a USC-led study.
The USC researchers also found that people are more likely to endorse violence when they moralize the issue that they are protesting — that is, when they see it as an issue of right and wrong. That holds true when they believe that others in their social network moralize the issue, too.
“Extreme movements can emerge through social networks,” said the study’s corresponding author, Morteza Dehghani, a researcher at the Brain and Creativity Institute at USC. “We have seen several examples in recent years, such as the protests in Baltimore and Charlottesville, where people’s perceptions are influenced by the activity in their social networks. People identify others who share their beliefs and interpret this as consensus. In these studies, we show that this can have potentially dangerous consequences.”
The scientists analyzed 18 million tweets posted during the 2015 Baltimore protests over the death of 25-year-old Freddie Gray, who died as police took him to jail. Researchers used a deep neural network — an advanced machine learning technique — to detect moralized language on Twitter.
They investigated the association between moral tweets and arrest rates, a proxy for violence. This analysis showed that the number of hourly arrests made during the protests was associated with the number of moralized tweets posted in previous hours.
Tweets containing moral rhetoric nearly doubled on days when clashes between protesters and police turned violent.
The study was published on May 23 in Nature Human Behaviour.
Social media posts as a barometer for activism
Social media sites such as Twitter have become a significant platform for activism and a source for data on human behavior. That makes them ripe for research.
Recent examples of movements tied to social media include the #marchforourlives effort to seek gun control, the #metoo movement against sexual assault and harassment, and #blacklivesmatter, a campaign against systemic racism that began in 2014 after the police-involved shooting death of Michael Brown, 19, in Ferguson, Mo.
An example involving more violence is the Arab Spring revolution, which began in Tunisia in late 2010 and set off protests in Egypt, Libya and other nations, forcing changes in their leadership. In Syria, clashes escalated into a war that has killed hundreds of thousands of people and displaced countless refugees.
Detecting moralization online
The scientists developed a model for detecting moralized language, building on a prior deep learning framework that can reliably identify text evoking moral concerns associated with different moral values and their opposites. Moral Foundations Theory defines these dueling values as five pairs: care/harm, fairness/cheating, loyalty/betrayal, authority/subversion and purity/degradation.
Here are two examples of moralized language and the moral foundations with which they are associated:
Sample Tweet 1:
Moral Foundations: Fairness and Loyalty
Sample Tweet 2:
regardless of how anyone feels, prayers to the police force and their family
Moral Foundations: Care and Purity
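The study's classifier is a deep neural network, but the underlying idea of mapping a tweet's words to moral foundations can be illustrated with a much simpler lexicon lookup, in the spirit of the Moral Foundations Dictionary. The word lists below are tiny, hand-picked stand-ins for illustration only, not the dictionary or the study's model.

```python
# Toy lexicon lookup: a drastically simplified stand-in for the study's
# deep neural network. Word lists are illustrative, not authoritative.
MORAL_LEXICON = {
    "care":      {"prayers", "protect", "suffer", "harm"},
    "fairness":  {"justice", "equal", "rights", "unfair"},
    "loyalty":   {"solidarity", "betray", "community", "family"},
    "authority": {"law", "order", "obey", "defy"},
    "purity":    {"sacred", "pure", "disgust", "sin"},
}

def detect_foundations(tweet: str) -> set[str]:
    """Return the moral foundations whose lexicon words appear in the tweet."""
    words = set(tweet.lower().split())
    return {f for f, vocab in MORAL_LEXICON.items() if words & vocab}

print(detect_foundations("stand in solidarity for justice and equal rights"))
# Flags both fairness terms and a loyalty term in this made-up example.
```

A real system must handle negation, sarcasm and context, which is why the researchers used a neural model trained on annotated text rather than simple word matching.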
Moralization and political polarization are exacerbated by online “echo chambers,” researchers say. These are social networks where people connect with other like-minded people while distancing themselves from those who don’t share their beliefs.
Protests, social media and violence
Social media data help researchers illuminate real-world social dynamics and test hypotheses, explained Joe Hoover, a lead author of the paper and doctoral candidate in psychology at the USC Dornsife College of Letters, Arts and Sciences. “However, as with all observational data, it can be difficult to establish the statistical and experimental control that is necessary for drawing reliable conclusions.”
To address this, the scientists conducted a series of controlled behavioral studies, each with more than 200 people. Researchers first asked participants to read a paragraph about the 2017 clashes over the removal of Confederate monuments in Charlottesville, Va. Then the researchers asked how much participants agreed or disagreed with statements about the use of violence against far-right protesters.
The more certain people were that many others in their network shared their views, the more willing they were to consider the use of violence against their perceived opponents, the scientists found.
The work was supported by a grant from the U.S. Department of Defense. Other study co-authors were Marlon Mooijman of Northwestern University, Hoover from USC Dornsife and the Brain and Creativity Institute at USC, and Ying Lin and Heng Ji of the Rensselaer Polytechnic Institute.