In today’s Information Age, it’s easy to get overwhelmed by online content. On YouTube alone, more than 100 hours’ worth of video is uploaded every minute. Though a high volume of content can signal quality to audiences and encourage users to stay on a site, this influx of information can make it difficult for both content providers and users to determine what is interesting and worth consuming.
To explore user experience, a recent study published in the peer-reviewed online journal PLOS ONE determined what kind of content users prefer and then evaluated how position on a webpage affects collective judgments about content. Co-authors Kristina Lerman, a computer science professor at the USC Viterbi School of Engineering and a project leader at USC Viterbi’s Information Sciences Institute, and Tad Hogg, a research fellow at the Institute for Molecular Manufacturing in Palo Alto, California, evaluated some popular peer recommendation strategies and their ability to identify interesting content.
The power of positioning
“Psychologists have known for decades that position bias affects perception: People pay more attention to items at the top of a list than those below them,” Lerman said. “We were surprised, however, how strongly this affected user behavior and the outcomes of recommendation.”
Lerman and Hogg found that position bias accounts for consumers spending five times more attention on material posted near the top of a webpage. Due to position bias, users are more likely to see, consume and recommend already-popular content positioned near the top of the webpage, creating a herding effect that amplifies already popular posts at the expense of potentially more interesting content farther down the webpage.
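The herding effect the authors describe can be illustrated with a toy simulation (this is a sketch under assumed parameters, not the study’s actual model): users inspect items with a probability that decays with page position, and the page is continually re-sorted by popularity, so whichever item starts on top keeps accumulating recommendations.

```python
import random

random.seed(1)

# Illustrative position-bias model (the decay rate is an assumption,
# not a figure from the Lerman-Hogg study): the chance that a user
# even looks at an item falls off with its position on the page.
def inspect_prob(position, decay=0.6):
    return decay ** position

def simulate(n_items=5, n_users=2000):
    # All items are equally interesting; only page position differs.
    likes = [0] * n_items
    order = list(range(n_items))  # item i initially shown at position i
    for _ in range(n_users):
        for pos, item in enumerate(order):
            if random.random() < inspect_prob(pos):
                likes[item] += 1
        # Popularity ranking: re-sort the page by likes after each user,
        # so early leaders keep the top slots -- the herding effect.
        order.sort(key=lambda i: likes[i], reverse=True)
    return likes

print(simulate())
```

Even though every item is equally interesting, the item that happens to start at the top ends up with far more likes than the rest, purely because of where it sits on the page.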
Sites such as Reddit depend on user ratings to sort and rank their most interesting content. But this so-called collective intelligence may be biased and inconsistent. In practice, peer recommendation often leads to “irrational herding” behaviors that result in similar content receiving widely different numbers and types of recommendations.
Ranking by chronology, not popularity
In their study, Lerman and Hogg demonstrated that ordering content by how recently it was recommended rather than by total “likes” generates better estimates of what users actually find interesting and would prefer to consume.
Twitter avoids these “irrational herding” effects by presenting content in chronological order, based on the time of recommendation. Retweets, or recommendations, bring older posts back up to the top of a user’s newsfeed, helping to reduce the herding effect.
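The difference between the two ranking strategies can be sketched in a few lines (the post records and field names here are hypothetical, purely for illustration): popularity ranking sorts by total likes, while recency-of-recommendation ranking sorts by the timestamp of the most recent recommendation, so a fresh retweet lifts a post back to the top.

```python
from dataclasses import dataclass

# Hypothetical post records; the fields are illustrative assumptions.
@dataclass
class Post:
    title: str
    likes: int                # total recommendations received
    last_recommended: float   # timestamp of the most recent recommendation

posts = [
    Post("older viral post", likes=900, last_recommended=10.0),
    Post("fresh niche post", likes=12, last_recommended=95.0),
    Post("steady favorite", likes=300, last_recommended=80.0),
]

# Popularity ranking: amplifies already-popular items (herding).
by_popularity = sorted(posts, key=lambda p: p.likes, reverse=True)

# Recency-of-recommendation ranking, as the article describes:
# each new recommendation moves a post back to the top of the feed.
by_recency = sorted(posts, key=lambda p: p.last_recommended, reverse=True)

print([p.title for p in by_popularity])
print([p.title for p in by_recency])
```

Under popularity ranking the old viral post stays on top indefinitely; under recency ranking the recently recommended niche post surfaces first, giving users another chance to discover it.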
“Twitter does the right thing when it pushes newly retweeted posts to the top of the followers’ screens, giving them another chance to discover interesting content,” Lerman said.
Position bias can create a cycle that excludes quality content, but by understanding the factors that influence peer recommendation, providers can still use the power of collective judgment to help readers find content worthy of their time and attention.
The research was funded in part by the Air Force Office of Scientific Research (contract FA9550-10-1-0569) and by the Defense Advanced Research Projects Agency (contract W911NF-12-1-0034).