University professors create new software to detect lies
What does lying look like? Two University researchers believe they have found the answer.
Using video footage of high-profile court trials and hearings, Rada Mihalcea, professor of computer science and electrical engineering, and Mihai Burzo, assistant professor of mechanical engineering at UM-Flint, are leading a project to build lie-detecting software that can gauge deception through a speaker’s words and gestures.
Mihalcea said the research team began developing the software by studying 120 video clips from media coverage of court trials. She said the team chose real courtroom footage to ground the software in real-life, high-stakes behavior.
“I think that one of the challenges in doing the research on deception is that usually you get your data in lab, which is fine, but I think that people who come to the lab to contribute data do not have those high stakes settings that you would have in a court setting, like if you were defending your life,” she said. “So that’s the reason why we wanted real data when people are truly lying.”
The videos include testimony from both defendants and witnesses. By comparing the testimony to the court’s verdict, the researchers determined which subjects were being deceptive. Through the comparison, researchers found that half of the clips featured deceptive subjects.
Mihalcea said the team also used some clips from The Innocence Project, a national organization that reexamines cases tried without the benefit of DNA testing, with the aim of exonerating wrongfully convicted individuals.
“The Innocence Project is all about people being exonerated, so they have people who eventually tell the truth though they were believed to tell a lie,” Mihalcea said. “So in our data, there would be data points with that truth label. That data would show people who were actually saying the truth.”
After the team gauged deception by comparing the court’s verdict with the testimony, the researchers analyzed body language in the clips, specifically looking for common behaviors in subjects deemed deceptive.
Mihalcea said the team referred to common deceptive gestures identified in prior research when analyzing the subjects.
According to a press release, the researchers found that 70 percent of the deceptive subjects looked directly at the questioner when answering questions. They also found that 40 percent of the deceptive subjects often gestured with both hands.
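The percentages above come from tallying how often annotated gestures appear among the clips labeled deceptive. A minimal sketch of that kind of tally is below; the gesture names and sample annotations are illustrative placeholders, not the study’s actual data.

```python
# Sketch: fraction of deceptive clips showing each annotated gesture.
# Labels and gesture names are hypothetical examples.

def gesture_rates(clips):
    """Return the fraction of deceptive clips showing each gesture."""
    deceptive = [c for c in clips if c["label"] == "deceptive"]
    rates = {}
    for gesture in sorted({g for c in deceptive for g in c["gestures"]}):
        shown = sum(gesture in c["gestures"] for c in deceptive)
        rates[gesture] = shown / len(deceptive)
    return rates

clips = [
    {"label": "deceptive", "gestures": {"direct_gaze", "both_hands"}},
    {"label": "deceptive", "gestures": {"direct_gaze"}},
    {"label": "truthful", "gestures": set()},
]
print(gesture_rates(clips))  # {'both_hands': 0.5, 'direct_gaze': 1.0}
```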
Mihalcea said after identifying common gestures, researchers then transcribed the audio from the video clips of trials and analyzed how often subjects labeled deceptive used various words and phrases.
“We extracted individual words and groups of words,” she said. “For instance, there is a group of words that reflect positive feelings, or there is a group of words that would reflect certainty. We looked for the presence of words that would belong to these categories.”
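The category-based word counting Mihalcea describes can be sketched roughly as follows. The word lists here are tiny illustrative stand-ins; the actual lexicons the researchers used are not specified in the article.

```python
# Sketch: count how often words from predefined categories appear in a
# transcript. CATEGORIES is a hypothetical, abbreviated lexicon.
from collections import Counter

CATEGORIES = {
    "positive": {"good", "happy", "glad", "great", "love"},
    "certainty": {"always", "never", "definitely", "absolutely"},
}

def category_counts(transcript: str) -> dict:
    """Tally occurrences of each category's words in the transcript."""
    counts = Counter()
    for word in transcript.lower().split():
        for category, lexicon in CATEGORIES.items():
            if word in lexicon:
                counts[category] += 1
    return dict(counts)

print(category_counts("I definitely love it always"))
```

Counts like these, normalized by transcript length, could then serve as per-clip features alongside the gesture annotations.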
By analyzing gestures and word use together, the researchers then determined which overall behaviors were typical of deceptive subjects.
Mihalcea said that after the team fed the results of the study into the software system, the rubric of deceptive behaviors identified deceptive subjects with 75 percent accuracy across the 120 videos originally studied.
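The article does not name the learning algorithm behind that accuracy figure, but evaluating a classifier on a labeled set of clips might look like the toy sketch below, which uses a simple nearest-neighbor rule and leave-one-out scoring on made-up feature vectors.

```python
# Toy sketch: leave-one-out accuracy of a 1-nearest-neighbor classifier.
# Feature vectors and labels are invented for illustration only.
import math

def nearest_neighbor_predict(train, query):
    """Predict the label of the closest training example (1-NN)."""
    best = min(train, key=lambda ex: math.dist(ex[0], query))
    return best[1]

def leave_one_out_accuracy(data):
    """Hold out each example once and score 1-NN predictions."""
    correct = 0
    for i, (features, label) in enumerate(data):
        train = data[:i] + data[i + 1:]
        if nearest_neighbor_predict(train, features) == label:
            correct += 1
    return correct / len(data)

# Hypothetical features: [gaze_at_questioner, both_hand_gestures, certainty_word_rate]
examples = [
    ([1.0, 1.0, 0.8], "deceptive"),
    ([0.9, 1.0, 0.7], "deceptive"),
    ([0.2, 0.0, 0.1], "truthful"),
    ([0.1, 0.0, 0.2], "truthful"),
]
print(leave_one_out_accuracy(examples))  # 1.0 on this separable toy data
```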
Though Mihalcea said the research team considered the software successful in the prototype stage, she noted there were still improvements to be made — in particular, enhancing the formula to consider cultural and demographic differences.
Burzo wrote in a press release that examining cultural and demographic differences adds a different perspective to deception research.
"Deception detection is a very difficult problem," he said. "We are getting at it from several different angles."
Beyond the courtroom, Mihalcea said, there might also be other applications to which the software can be adapted, such as job interviews.
“It could be helpful in situations where there is an interaction with people and it is important to know when the other side is telling the truth,” she said. “I think a system like this would give clues, or at least give an indication that there is a chance that this person is lying, which, whatever the context would be, humans could make use of.”