There may be a psychological reason why some people in an argument are not just wrong but confidently wrong.
According to a study published Wednesday in the journal PLOS ONE, people tend to believe they have all the information needed to form an opinion, even when they don't.
“Our brains are overconfident that they can come to a reasonable conclusion with very little information,” said Angus Fletcher, an English professor at Ohio State University who co-authored the study.
Fletcher, together with two psychology researchers, set out to measure the extent to which people judge situations or other people based on confidence in the information they have, even when it isn't the whole story.
“People are very quick to judge,” he said.
The researchers recruited about 1,300 people over the age of 40. They each read a fictional story about a school that was running out of water because the local reservoir had dried up.
About 500 people read a version of the story that made the case for the school to merge with another school, offering three pro-merger arguments along with one neutral point.
Another 500 people read a version with three arguments in favor of keeping the schools separate, plus the same neutral point.
The final 300 participants, the control group, read a balanced story that included all seven points: three pro-merger arguments, three pro-separation arguments and the neutral one.
After the reading, the researchers asked participants what they thought the school should do and how confident they were that they had all the information needed to make that judgment.
The surveys revealed that most people sided with whichever argument they had read, whether for merging or for remaining separate, and were often confident that they had enough information to support that opinion. People who read only one perspective were also more likely to say they were confident in their opinions than those in the control group who read both sides.
Half of the participants in each group were then asked to read the opposing arguments, which contradicted the story they had already read.
Although people were confident in their opinions after reading arguments for just one solution, when presented with all the information they were often willing to change their minds. They also reported feeling less confident in their ability to form an opinion on the issue.
“We thought that people would actually stick to their original judgments even when they received information that contradicted those judgments, but it turns out that if they learn something that seems plausible to them, they are willing to completely change their minds,” Fletcher said. The research highlights the fact that people rarely stop to consider whether they have all the information about a situation.
However, the researchers noted that the findings may not apply to issues where people hold preconceived notions, as is often the case in politics.
“People are more open-minded and willing to change their minds than we think,” Fletcher said. However, “this same flexibility does not apply to long-standing differences, such as political beliefs.”
Todd Rogers, a behavioral scientist at Harvard's Kennedy School of Government, compared the findings to the famous “invisible gorilla” study, which illustrates the psychological phenomenon of inattentional blindness, in which a person fails to notice something obvious because they are focused on something else.
“This study captures that,” Rogers said. “There seems to be a cognitive tendency not to realize that the information we have is insufficient.”
The research also parallels a psychological phenomenon called the “illusion of explanatory depth,” in which people overestimate how much they know about a given topic, said Barry Schwartz, a psychologist and professor emeritus of social theory and social action at Swarthmore College in Pennsylvania.
The idea is that if you ask the average person whether they know how a toilet works, they'll probably say yes. But when asked to explain it, they quickly realize they don't actually know how the toilet works, only how to press the lever.
“It’s not just that people are wrong. The problem is that they are so confident in their mistakes,” Schwartz said.
The antidote, he added, is “to be curious and humble.”
Participants in the study who were subsequently presented with new information were open to changing their minds, as long as that information seemed plausible, compelling and surprising, the researchers and Schwartz agreed.
“It’s cause for some optimism that even if people think they know something, they are still open to having their minds changed by new evidence,” Schwartz said.