People Who Jump to Conclusions Show Other Kinds of Thinking Errors - Scientific American
Belief in conspiracy theories and overconfidence are two tendencies linked to hasty thinking
- By Carmen Sanchez, David Dunning on October 15, 2021
How much time do you spend doing research before you make a decision? The answer for many of us, it turns out, is "hardly any," even with major investments. Most people make two trips or fewer to a dealership before buying a car. And according to survey results in a 2003 paper by economist Katherine Harris, when picking a doctor, many individuals use recommendations from friends and family rather than consulting other health care professionals or "formal sources" such as employers, articles or Web sites.
We are not necessarily conserving our resources to spend them on bigger decisions either. One in five Americans spends more time planning their upcoming vacation than they do their financial future.
To be sure, some people go over every detail exhaustively before making a choice, and it’s certainly possible to overthink things. But there are also people who are quick to jump to conclusions. This way of thinking is considered a cognitive bias, a term psychologists use to describe a tendency toward a specific mental mistake. In this case, the error is making a call based on the sparsest of evidence.
In our own research, we have found that hasty judgments are often just one part of larger error-prone patterns in behavior and thinking. We’ve also found that people who tend to make such "jumps" in their reasoning may experience a wide range of costs.
To study jumping, we worked with more than 600 people from the general population. Because much of the work on this bias comes from studies of schizophrenia (jumping to conclusions is common among people with the condition), we borrowed a thinking game used in that area of research.
In this game, players encountered someone who was fishing from one of two lakes: in one lake, most of the fish were red, and in the other, most were gray. The fisher would catch one fish at a time and stop only when players thought they could say which lake was being fished. Some players had to see many fish before making a decision. Others, the jumpers, stopped after only one or two.
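Each catch is a piece of probabilistic evidence, and the rational way to weigh it is Bayes' rule. The sketch below assumes, purely for illustration, that each lake holds 60 percent of its dominant color; the article does not give the actual proportions used in the study.

```python
# Hypothetical setup: the "red lake" is assumed to hold 60% red fish and the
# "gray lake" 60% gray fish (the study's real proportions are not given here).
def posterior_red_lake(catches, p_dominant=0.6):
    """Probability the fisher is at the red lake, given catches like ['red', 'gray']."""
    # Both lakes are equally likely before any fish are seen.
    weight_red_lake = 0.5
    weight_gray_lake = 0.5
    for fish in catches:
        if fish == "red":
            weight_red_lake *= p_dominant        # red fish favors the red lake
            weight_gray_lake *= 1 - p_dominant
        else:
            weight_red_lake *= 1 - p_dominant    # gray fish favors the gray lake
            weight_gray_lake *= p_dominant
    return weight_red_lake / (weight_red_lake + weight_gray_lake)

print(round(posterior_red_lake(["red"]), 2))       # after one red fish: 0.6
print(round(posterior_red_lake(["red"] * 4), 2))   # after four red fish: 0.84
```

Under these assumed proportions, a jumper who stops after a single red fish is acting on only 60 percent certainty, while waiting for a few more catches pushes the probability well above 80 percent.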
We also asked participants questions to learn more about their other thinking patterns. We found that the fewer fish a player needed to see before deciding, the more errors that person made in other beliefs, reasoning and decisions.
For instance, the earlier a person jumped, the more likely they were to endorse conspiracy theories, such as the idea that the Apollo moon landings had been faked. Such individuals were also more likely to believe in paranormal phenomena and medical myths, such as the idea that health officials are actively hiding a link between cell phones and cancer.
Jumpers made more errors than nonjumpers on problems that require thoughtful analysis. Consider this brainteaser: "A baseball bat and ball cost $1.10 together. The bat costs $1 more than the ball. How much does the ball cost?" Many respondents leaped to the conclusion of 10 cents, but a little thought reveals the right answer to be five cents. (It’s true; think the problem through.)
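The arithmetic behind the brainteaser can be checked in a few lines. If the ball costs b, then the bat costs b + 1.00, so b + (b + 1.00) = 1.10, which gives b = 0.05:

```python
# Solve ball + (ball + 1.00) = 1.10  =>  2 * ball = 0.10  =>  ball = 0.05
ball = (1.10 - 1.00) / 2
bat = ball + 1.00
assert abs(ball + bat - 1.10) < 1e-9  # together they total $1.10
assert abs(bat - ball - 1.00) < 1e-9  # the bat costs exactly $1 more
print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")  # ball = $0.05, bat = $1.05
```

The intuitive answer of 10 cents fails the second condition: a 10-cent ball would make the bat $1.10, and the pair would total $1.20, not $1.10.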
In a gambling task, people with a tendency to jump were more often lured into choosing inferior bets over those in which they had a better chance of winning. Specifically, jumpers fell into the trap of focusing on the number of times a winning outcome could happen rather than the full range of possible outcomes overall.
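The article does not spell out the study's actual gambles, but a classic version of this trap, sometimes called ratio bias, pits a jar with many winning marbles but worse odds against a jar with fewer winners but better odds:

```python
# Hypothetical jars for illustration (not the study's actual bets):
# jar A has more winning marbles in absolute terms, but jar B gives
# the better chance of winning.
jar_a = {"winners": 9, "total": 100}  # many winners, worse odds
jar_b = {"winners": 1, "total": 10}   # one winner, better odds

p_a = jar_a["winners"] / jar_a["total"]  # 0.09
p_b = jar_b["winners"] / jar_b["total"]  # 0.10
print(p_a < p_b)  # True: counting winners alone selects the inferior bet
```

A player fixated on the raw count of winning outcomes picks jar A, even though the full range of outcomes makes jar B the better bet.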
Jumpers also had problems with overconfidence: on a quiz about American civics, they overestimated the chance that their answers were right significantly more than other participants did, even when those answers were wrong.
The differences in decision quality between those who jumped and those who did not remained even after we took intelligence, measured by a test of verbal intellect, and personality differences into account. Our data also suggested the difference was not merely because jumpers rushed through our tasks.
So what is behind jumping? Psychological researchers commonly distinguish between two pathways of thought: One path is automatic. Known as system 1, it reflects ideas that come to the mind easily, spontaneously and without effort. The other path represents controlled thought. Known as system 2, it comprises conscious and effortful reasoning that is analytic, mindful and deliberate.
We used several assessments that teased apart how automatic our participants’ responses were and how much they engaged in deliberate analysis. We found that jumpers and nonjumpers are equally swayed by automatic system 1 thoughts. The jumpers, however, do not engage in controlled system 2 reasoning to the same degree as nonjumpers.
It’s system 2 thinking that helps people correct mental contaminants and other biases introduced by the more knee-jerk system 1. Put another way, jumpers were more likely to accept the conclusions they made at first blush without deliberative examination or questioning. This lack of system 2 thinking was also connected more broadly to their problematic beliefs and faulty reasoning.
Happily, there may be some hope for jumpers: Our work suggests that using training to target their biases can help people think more deliberatively. Specifically, we adapted a method called metacognitive training (MCT) from schizophrenia research and created a self-paced online version of the intervention. In this training, participants are confronted with their own biases. For example, as part of our approach, people tackle puzzles, and after they make mistakes related to specific biases, these errors are called out so that the participants can learn about the missteps and other ways of thinking through the problem at hand. This intervention helps chip away at participants’ overconfidence.
We want to continue this work to trace other problems introduced by jumping. Also, we wonder if there are any potential benefits of this bias. In the process, we aim to give back to schizophrenia research. In some studies, as many as two thirds of patients with schizophrenia who express delusions exhibit a jumping bias when solving simple, abstract probability problems in comparison with up to one fifth of the general population.
Schizophrenia is a relatively rare condition, and much about the connection between jumping and judgment issues is not well understood. Our work with general populations could potentially fill this gap in ways that help people with schizophrenia.
In everyday life, the question of whether we should think things through or instead go with our gut is a frequent and important one. What our research and other recent studies show is that sometimes the most important decision can be when you should choose to take time before deciding. Even gathering just a little bit more evidence may help you avoid a major mistake.
ABOUT THE AUTHOR(S)
Carmen Sanchez is an assistant professor at the Gies College of Business at the University of Illinois at Urbana-Champaign. She studies the development of misbeliefs, decision-making and overconfidence.
David Dunning is a social psychologist and a professor of psychology at the University of Michigan. His research focuses on the psychology of human misbelief, particularly false beliefs people hold about themselves.
© 2021 Scientific American, a Division of Springer Nature America, Inc.
All Rights Reserved.