The same public opinion survey that was the basis for gauging voters’ likelihood of casting ballots for presidential candidates Donald Trump or Hillary Clinton last year has been serving a much broader purpose.
Most people might recognize the Understanding America Study by the Center for Economic and Social Research at USC for its groundbreaking daily presidential election poll last year, the USC Dornsife/Los Angeles Times Daybreak Poll. But the center also deserves recognition for providing a snapshot of how Americans make decisions, their views on questions of social and economic policy, and the state of American health.
Since 2014, the panel of participants has nearly tripled to 6,000, and it is on track to reach 10,000 participants by 2020. The team at the USC Dornsife College of Letters, Arts and Sciences also is building a panel of California residents that should number 1,000 by 2020. And the center is taking the lead on the USC Dornsife/LA Times California statewide poll, expanding its complementary partnership with the Unruh Institute of Politics at USC Dornsife.
By setting a high bar for other polls and surveys, the center is, well, making surveys great again. Although media coverage of the Daybreak Poll, which was based on the Understanding America Study, was at times critical, pundits and polling experts alike praised the poll for a highly unusual move: All of its data, including the methodology, were made publicly available.
“After the election last year, our release of data was held up as an example for what other pollsters should do,” said Center for Economic and Social Research Survey Director Jill Darling.
Darling noted that the American Association for Public Opinion Research has a transparency initiative that encourages ethical and transparent conduct for public opinion research, but it stops just short of calling for the release of data sets.
“We go beyond those guidelines, meeting federal government requirements for releasing full data sets paid for by tax dollars for public use. We apply that principle of openness to all of our data collections, even projects such as political polls for the state and national elections, which are entirely financed by Dornsife and not by federal funds,” Darling said.
Participants picked by postal codes
Virtually all Understanding America Study data have been accessible since the beginning. Researchers interested in analyzing the data simply register for access through the study’s website.

The Understanding America Study is an internet-based panel, but Center for Economic and Social Research researchers select participants based on postal codes, so the initial invitation to participate arrives via snail mail, said Tania Gutsche, the center’s managing director.
Darling said that about 14 percent of potential participants ultimately join the panel, a high recruitment rate compared with phone surveys, whose rates are often in the single digits.
The nationally representative panel includes many low-income Americans. If respondents do not have internet service and elect to take part in the study, the Center for Economic and Social Research provides them with an internet-connected tablet.
Once or twice a month, the participants, who are 18 or older, answer questions through a secure login, and their rate of participation is as high as 80 percent.
Over time, panel members provide vast amounts of information. In addition to the typical demographic information collected for most surveys — age, gender, income and household structure — the survey gathers data on participants’ personality, cognitive function, financial literacy, physical and mental health, employment status, assets and liabilities.
Participants’ identities are kept secure and separated from the data to ensure privacy, researchers said. As a research project, Understanding America Study processes have been approved by USC’s Institutional Review Board, which evaluates and oversees research with human subjects.
“We make sure that we know more about our panel members than any other survey or poll,” said Arie Kapteyn, executive director of the Center for Economic and Social Research and a public opinion polling expert. “The data on our participants is broader than anyone else has ever collected systematically. We know their psychological profile. We know their cognitive capability. We know about their finances. We know about their health.”
Obtaining such information is a trust-building exercise. Many people have misgivings or outright objections when it comes to sharing some information about themselves. Panel members are free to choose not to provide the information, so establishing trust is important.
“Getting people to consent to take surveys at all is challenging because people are getting more and more wary of giving up their information, and they wonder where their data will go,” Gutsche said. “We spend a lot of time explaining what the data is used for and how we have protocols in place to keep it safe.”
Gutsche noted, though, that the survey has developed a trusting and committed relationship with its respondents. Participants are able to contact the center directly, by phone, email or through their online account.
“If they don’t get surveys, they call and ask: Where is my survey this month?” Gutsche said. “They get used to being part of something. It’s just a matter of getting them to trust us and to send in the first survey.”
Center for Economic and Social Research scientists allow USC researchers, scientists at other universities and government agencies around the world to commission questions that are added to existing questionnaires (e.g., the quarterly Omnibus survey) or conducted as stand-alone studies. Researchers at the USC Schaeffer Center for Health Policy and Economics are among those who have used the panel for their research.
“My colleague Neeraj Sood and I used the UAS to field a survey about Americans’ attitudes toward a particular vaccine for Ebola at the height of the epidemic, just as concerns began to arise in the United States about the potential spread of the disease,” said economist John Romley, who also is an associate professor at the USC Price School of Public Policy and the USC School of Pharmacy.
“With the UAS, we were able to assess all kinds of things,” Romley said. “Even when vaccines are quite effective, there is a reluctance among a significant portion of the population about using them. We were interested in how concerned people were about catching the disease, what they were willing to do to mitigate their risk and how much they believed the government should be investing in vaccines for this disease.
“The ability to do surveys in real time as events are developing is a real asset of the UAS,” Romley said.
The results showed respondents were more likely to vaccinate if they lived in a community with a high vaccination rate. The study was published in PLOS ONE in 2016.
Darling said that other researchers or students can buy time on the panel to ask questions that support their own research. The center’s quarterly Omnibus survey allows researchers to share overhead fees, reducing costs, which depend on sample size and the number of questions.
Mirta Galesic, a professor at the Santa Fe Institute, worked last year with fellow researcher Wandi Bruine de Bruin of the University of Leeds in the United Kingdom to collect data through the Understanding America Study about people’s perceptions of their social circles before and after the 2016 presidential election.
“We asked the participants to estimate the percentage of their social contacts — friends, family and other acquaintances they were in touch with during the last month — who would vote in the upcoming election and who would vote for each of the different candidates,” Galesic said.
She and her collaborator asked the question in three survey waves before the election and one after.
“We found that it contributed important additional information to election predictions over and above the standard question about one’s self-reported voting intentions,” Galesic said. “For example, we found that people’s reports of the candidate their friends were most likely to vote for better predicted their own voting behavior than their self-reported preferred candidate.”
Galesic said that she hopes the methodology — asking people about their friends and not only about their personal opinions — can be used by scientists to study other topics, such as attitudes toward health and science.
Data from the Understanding America Study can be combined with other investigative efforts for studies. For example, Donna Spruijt-Metz, director of the mHealth Collaboratory at the Center for Economic and Social Research, and her team are testing mobile health devices that measure mood and eating behaviors rather than focusing on dietary intake. The researchers aim to develop a real-time intervention that could curb unhealthy behaviors and put a dent in the nation’s obesity problem.
Center for Economic and Social Research scientists are now working with current and future Understanding America Study respondents to compile even more information about participants, including real-time electronic spending data from their credit card and bank accounts, to track their spending and saving habits.