A new computer algorithm can predict whether you and your spouse will have a stronger or weaker relationship based on the tone of voice the two of you use when speaking to each other.
In fact, the algorithm predicted the marital outcomes of couples with serious marital issues more accurately than did relationship experts' descriptions of the same therapy sessions. The research was published in Proceedings of Interspeech on Sept. 6.
Researchers recorded hundreds of conversations from more than 100 couples taken during marriage therapy sessions over two years and then tracked their marital status for five years.
An interdisciplinary team — led by Shrikanth Narayanan and Panayiotis Georgiou of the USC Viterbi School of Engineering with doctoral student Md Nasir and collaborator Brian Baucom of the University of Utah — then developed an algorithm that used speech-processing techniques to break the recordings into acoustic features such as pitch, intensity, “jitter” and “shimmer,” along with warbles in the voice that can indicate moments of high emotion.
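Features like these can be approximated with standard signal processing. The sketch below is a minimal illustration, not the team's code: it synthesizes a steady tone in place of a real therapy recording, then estimates pitch by autocorrelation, intensity as RMS energy, and jitter as the average cycle-to-cycle variation in the pitch period. All function names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def frame_signal(y, frame_len=512, hop=256):
    """Slice a 1-D signal into overlapping analysis frames."""
    n = 1 + (len(y) - frame_len) // hop
    return np.stack([y[i * hop : i * hop + frame_len] for i in range(n)])

def pitch_period(frame, sr, fmin=80.0, fmax=400.0):
    """Estimate the pitch period (in samples) via autocorrelation,
    searching only lags that correspond to plausible speech pitch."""
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1 :]
    lo, hi = int(sr / fmax), int(sr / fmin)
    return lo + int(np.argmax(ac[lo : hi + 1]))

def voice_features(y, sr):
    """Rough per-recording pitch (Hz), intensity (RMS), and jitter."""
    frames = frame_signal(y)
    periods = np.array([pitch_period(f, sr) for f in frames], dtype=float)
    pitch_hz = sr / periods.mean()
    intensity = float(np.sqrt(np.mean(y ** 2)))
    # Jitter: mean cycle-to-cycle change in period, relative to the mean period.
    jitter = float(np.mean(np.abs(np.diff(periods))) / periods.mean())
    return pitch_hz, intensity, jitter

# Synthetic stand-in for a speech recording: one second of a 200 Hz tone.
sr = 16000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 200.0 * t)
pitch, rms, jit = voice_features(tone, sr)
```

For a steady tone the estimated pitch lands near 200 Hz and jitter is close to zero; real voiced speech would show larger jitter and shimmer, which is the variation this kind of analysis is after.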
“What you say is not the only thing that matters; it’s very important how you say it,” Nasir said. “Our study confirms that it holds for a couple’s relationship as well.”
Taken together, the vocal acoustic features offered the team’s program a proxy for the subject’s communicative state and the changes to that state over the course of a single therapy session or across multiple sessions.
Those features weren’t analyzed in isolation — rather, the impact of one partner upon the other over multiple therapy sessions was studied.
“It’s not just about studying your emotions,” Narayanan said. “It’s about studying the impact of what your partner says on your emotions.”
Added Georgiou: “Looking at one instance of a couple’s behavior limits our observational power. However, looking at multiple points in time and looking at both the individuals and the dynamics of the dyad can help identify trajectories of their relationship.”
Once it was fine-tuned, the program was tested against behavioral analyses made by human experts, who had coded the sessions for positive qualities like “acceptance” or negative qualities like “blame.” The team found that studying voice directly — rather than the expert-created behavioral codes — offered a more accurate glimpse at a couple’s future.
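The prediction step described above — mapping acoustic features to a later relationship outcome — can be illustrated with a toy classifier. Everything in this sketch is synthetic and hypothetical: the feature vectors, the labels, and the nearest-centroid rule are stand-ins for illustration, not the team's actual model or data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy per-couple feature vectors: [pitch variability, intensity, jitter].
# Label 1 = relationship improved, 0 = deteriorated (invented data).
improved = rng.normal(loc=[0.2, 0.5, 0.01], scale=0.05, size=(20, 3))
declined = rng.normal(loc=[0.6, 0.8, 0.04], scale=0.05, size=(20, 3))
X = np.vstack([improved, declined])
y = np.array([1] * 20 + [0] * 20)

# Nearest-centroid rule: predict the class whose mean feature vector
# is closest to the new couple's features.
centroids = {c: X[y == c].mean(axis=0) for c in (0, 1)}

def predict(x):
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

# A new couple whose features sit near the "improved" cluster.
pred = predict(np.array([0.25, 0.55, 0.015]))
```

The point of the comparison in the study is that a model fed acoustic features like these outperformed predictions based on experts' hand-assigned behavioral codes.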
“Psychological practitioners and researchers have long known that the way that partners talk about and discuss problems has important implications for the health of their relationships,” Baucom said. “However, the lack of efficient and reliable tools for measuring the important elements in those conversations has been a major impediment in their widespread clinical use. These findings represent a major step forward in making objective measurement of behavior practical and feasible for couple therapists.”
Using behavioral signal processing — a framework developed by Narayanan for computationally understanding human behavior — the team next plans to use language (e.g., spoken words) and nonverbal information (body language) to improve the prediction of how effective treatments will be.
The National Science Foundation supported the research.