What Makes Intelligent People Prone To Bias?

Everyone has their biases — that’s simply a fact of life. But you might presume that more intelligent people are better equipped to notice their own leanings and to analyze information in a more neutral, clearheaded way. On the surface that feels like a reasonable presumption, but numerous studies have shown that smarter people are actually more prone to letting their biases shape their outlook. Why is that?

Prone to bias

When a person is faced with a complicated problem, there tends to be a range of potential ways of tackling and solving it. In the best-case scenario, the individual will apply critical thinking to the situation, weighing up their options in a balanced manner and, ultimately, establishing the best course of action.

That’s not always what happens. On the contrary, a range of studies has suggested that people tend to seriously contemplate only the options they’re already predisposed to favor.

The “myside” bias

This tendency — which has been termed the “myside” bias — can be observed in the way people act in real-life situations all the time. On top of that, researchers have noted it over and over again in a range of different studies.

It’s hardly groundbreaking to learn that people act on their existing beliefs. But what is a little surprising is the suggestion that smarter individuals appear to be more prone to doing it than most.

A real-life example

Let’s consider a real-life situation. The death rate associated with COVID-19 is substantially higher than that of regular flu. That’s an established fact, backed up by reams of research. The specifics are difficult to pin down, given the changeable nature of the COVID-19 situation, but the scientific community has shown that the COVID-19 mortality rate is, as Johns Hopkins Medicine puts it, “substantially higher (possibly ten times or more) than that of most strains of the flu.”

This statement is demonstrably true: there’s plenty of evidence for it. Yet many people don’t buy it, insisting instead that regular flu and COVID-19 are just as deadly as each other. And within that group are plenty of individuals of above-average intelligence.

Explaining away biases

Research has indicated that smarter people are especially likely to interpret situations in a biased way. In her book THINKING 101, Yale University psychology professor Woo-kyoung Ahn suggests why, noting it’s “because they know more ways to explain away the facts that contradict their beliefs.”

She goes on to reference a groundbreaking study on this subject, which took place in 1979. It illustrated confirmation bias in action, but it also showed how subjects made “elaborate and intelligent efforts” in order to “maintain their bias.” It’s a fascinating read.

The capital punishment study

For the study, a group of undergrads was selected on the basis of their outlook on capital punishment. Some of them supported it, believing it acted as a deterrent to crime, while others opposed it.

All the subjects were given the results of ten studies related to the death penalty. In reality, these studies weren’t genuine, but they ostensibly focused on the question of whether or not the death penalty served to discourage crime.

Research for and against

Half of the made-up studies demonstrated that the death penalty did discourage crime. One of them read, “Kroner and Phillips (1977) compared murder rates for the year before and the year after adoption of capital punishment in 14 states. In 11 of the 14 states, murder rates were lower after adoption of the death penalty. This research supports the deterrent effect of the death penalty.”

The other half claimed the death penalty didn’t have a deterrent effect on crime. For example, “Palmer and Crandall (1977) compared murder rates in ten pairs of neighboring states with different capital punishment laws. In eight of the ten pairs, murder rates were higher in the state with capital punishment. This research opposes the deterrent effect of the death penalty.”

Were they influenced?

So, each time the subjects had read through one of these made-up death penalty studies, they were told to record whether or not their outlook on capital punishment had changed. Had their view of the issue been influenced by the research they’d just analyzed?

You might presume that, no, people didn’t change their minds: that those who already supported capital punishment continued to support it, and vice versa. Well, that’s not actually how things played out.

Both sides were influenced

It turned out that both groups were influenced by the materials they’d read. When they read the studies that claimed the deterrent effect was real, both the proponents and the opponents of the death penalty came to take a more favorable view of it. On the other hand, when they read the studies arguing against the deterrent effect, both sides tended to become less convinced by it.

In other words, opinions on both sides were shaped by the information to which they had been exposed. Of course, it’s not like their original outlooks had no bearing on the situation at all. If, for instance, backers of the death penalty read a study supportive of the deterrence effect, they became more enthusiastic about it than the opponents did. But changes in outlook were observed on both sides.

Part two

Things get even more interesting when we look at a second part of this study. For the first part, the subjects read fairly short summaries of the findings of each made-up study. In this second part, they were provided with much more information. The methods of the made-up studies were described in greater detail, for example.

The effect of the added information upon the subjects’ perspective on capital punishment was profound. Why? Well, Dr. Ahn suggested in THINKING 101, it was “because they provided these smart participants with excuses to dismiss the evidence when the results contradicted their original beliefs.”

A flawed methodology?

Let’s take a look at how one subject responded to a particular study after receiving the additional information. Having read that the study had been conducted “only one year before and one year after capital punishment was reinstated” in a particular state, they’d decided that the methodology was flawed.

“To be a more effective study,” this person said, “they should have taken data from at least ten years before and as many years as possible after. There were too many flaws in the picking of the states and too many variables involved in the experiment as a whole to change my opinion.”

Elaborate ways of supporting their bias

This is a pretty detailed critique, indicative of someone fairly intelligent. But here’s the thing: those who formulated these apparently sophisticated arguments were also shown to be particularly biased. They’d deploy arguments like that one against the studies that opposed their original stance on capital punishment. And on top of that, the studies that contradicted their stance would actually end up reinforcing their initial belief.

Dr. Ahn explained, “Supporters of capital punishment became even more positive… after having read the details about the studies that undermine the deterrent effects... Similarly, opponents of capital punishment became even more negative about it after having read details about the studies that supported the deterrent effects.”

Good analytic thinking skills

Dr. Ahn elaborated on the implications of these findings. She pointed out, “Coming up with excuses to dismiss evidence requires a good amount of analytic thinking skills and background knowledge, like how to collect and analyze data… When the participants could not apply such sophisticated skills because the study descriptions were so brief, biased assimilation did not occur.”

“But once they had enough information, they could use those skills to find fault with the studies that contradicted their original position, to such a point that findings that were at odds with their beliefs ended up strengthening them.”

The findings of the capital-punishment study were fascinating, but the research had its limitations. For one thing, the reasoning skills of each subject weren’t assessed on an individual basis. The researchers, therefore, couldn’t compare how people of varying cognitive abilities fared against one another.

Other research, though, has endeavored to do just that. For example, in one particular study, the people who took part were assessed for their numeracy skills. A series of questions were asked of them, some more complicated than others.

Throwing a five-sided die

On the easier side, this question was included. “Imagine,” it began, “we are throwing a five-sided die 50 times. On average, out of these 50 throws, how many times would this five-sided die show an odd number?” Can you get the correct answer?

It’s 30. That’s because a five-sided die has five possible outcomes: 1, 2, 3, 4, or 5. Of those, there are three odd numbers. The probability of an odd number being rolled, therefore, is three in five. And three-fifths of 50 is 30.
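If you’d like to double-check that arithmetic, here’s a quick Python sketch. It’s purely illustrative and isn’t part of the study being described; it just reproduces the three-in-five calculation above.

```python
# Expected number of odd results from 50 throws of a five-sided die.
# Illustrative only; not part of the study described in the article.
from fractions import Fraction

p_odd = Fraction(3, 5)        # odd faces: 1, 3, 5 out of five equally likely faces
expected_odd = p_odd * 50     # expected count of odd results over 50 throws
print(expected_odd)           # prints 30
```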

Picking mushrooms

If you want an example of one of the more complicated questions, here’s the set-up, “In a forest, 20 percent of mushrooms are red, 50 percent brown, and 30 percent white. A red mushroom is poisonous with a probability of 20 percent. A mushroom that is not red is poisonous with a probability of 5 percent.”

So, bearing all that information in mind, participants were asked, “What is the probability that a poisonous mushroom in the forest is red?” The answer is 50 percent: red poisonous mushrooms account for 4 percent of all mushrooms (20 percent of the 20 percent that are red), and non-red poisonous mushrooms account for another 4 percent (5 percent of the remaining 80 percent), so exactly half of the poisonous mushrooms are red.
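For readers who want to see that worked through with Bayes’ rule, here’s a small Python sketch using only the numbers given in the question. It’s an illustrative check, not anything taken from the study itself.

```python
# Bayes' rule applied to the mushroom question; illustrative only.
from fractions import Fraction

p_red = Fraction(20, 100)                 # share of mushrooms that are red
p_not_red = Fraction(80, 100)             # share that are not red
p_poison_given_red = Fraction(20, 100)    # chance a red mushroom is poisonous
p_poison_given_not_red = Fraction(5, 100) # chance a non-red mushroom is poisonous

# Total share of poisonous mushrooms: 1/25 + 1/25 = 2/25 (i.e. 8 percent)
p_poison = p_red * p_poison_given_red + p_not_red * p_poison_given_not_red

# Of those poisonous mushrooms, what fraction are red?
p_red_given_poison = (p_red * p_poison_given_red) / p_poison
print(p_red_given_poison)                 # prints 1/2, i.e. 50 percent
```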

The rash cream

Subjects in this study were also shown a data table, which laid out the apparent effects of a skin cream on different people’s rashes. According to the data, about 75 percent of those who applied the cream claimed it helped to clear their rash up. The remaining 25 percent of people who applied the cream to their rash said it made it worse.

On the other hand, the data revealed that 84 percent of respondents who didn’t use the cream also noted that their rash improved — so the data would suggest that people were better off not using the cream. But many people tend to focus on the 75 percent that claimed the cream helped, and assume the data therefore points to the cream’s effectiveness, even though it actually doesn’t.
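To make that comparison explicit, here’s a tiny Python sketch using only the percentages quoted above. It’s an illustrative check, not the raw data from the actual study.

```python
# Comparing improvement rates between cream users and non-users.
# Uses only the percentages quoted above; illustrative, not the study's raw data.
improved_with_cream = 0.75      # share of cream users whose rash improved
improved_without_cream = 0.84   # share of non-users whose rash improved

difference = improved_with_cream - improved_without_cream
print(difference)               # about -0.09: cream users fared slightly worse on average
```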

A challenging task

Of this section of the study, in which participants were asked to consider the data about the rash cream, Dr. Ahn wrote, “Assessing these results correctly is a fairly challenging task, so it makes sense that the higher the participants’ scores were in the numeracy assessment, the more likely they were to get the right answers. And in fact that’s what happened.”

Interestingly, Dr. Ahn pointed out that participants’ political leanings — that is, whether they voted Democrat or Republican — weren’t relevant to their analysis of the data. That might seem like a point that’s come out of nowhere, but it was specifically relevant to another part of the study.

Political leanings

In this other section, the exact same numbers that were used in the rash cream data were applied to a much more politically divisive scenario. That is, the data this time was about whether or not gun control had an effect on the rate of crime.

As Dr. Ahn explained, “Two versions of this data were presented: one showed that gun control increased crime, supporting the view held by a majority of Republicans, and the other showed that gun control decreased crime, supporting the view more common among Democrats.”

No bias in those with low numeracy skills

This is where things got really interesting. Those participants who’d scored comparatively poorly on the numeracy tests tended to have trouble getting the right answers to the questions about the rash cream and gun control. That’s not terribly surprising, of course. And it also didn’t matter whether they voted Democrat or Republican.

Regardless of their political allegiances, if the participants struggled with numeracy in the first place, they were less likely to answer the questions about the rash cream and gun control correctly. But, notably, they didn’t exhibit any biases when assessing the data.

Higher numeracy skills equate to greater bias

That wasn’t at all the case when it came to the participants who exhibited greater numeracy skills. “Republicans with higher numeracy were more likely to get it right when the correct answer was that gun control increased crime,” Dr. Ahn explained.

“Democrats with higher numeracy were more likely to get it right when the correct answer was that gun control decreased crime. That is, people with stronger quantitative reasoning abilities used them only when the data supported their existing views.” (Our italics).

Everyone does it

Dr. Ahn was adamant that she wasn’t trying to claim that people with lower cognitive abilities don’t also exhibit their own biases. “I am not trying to say those lacking high levels of quantitative or analytic reasoning skills don’t make biased interpretations,” she wrote. “Of course they do.”

“It is highly unlikely that only ‘smart’ people make, for example, snap race-based judgments about whether someone is holding a gun or a cell phone. The point here is that so-called smart skills do not free people from irrational biases. Sometimes they can exacerbate the biases.”

The work of Daniel Kahneman

And other experts have been willing to make the still-stronger claim that smarter people really do tend to exhibit biases more than others. Daniel Kahneman, the Princeton University psychologist and Nobel Prize winner who passed away on March 27, 2024, was one such person.

Throughout his career, Dr. Kahneman had wanted to challenge assumptions about intelligent people being less prone to bias. To do so, he’d pose a series of questions to people and consider their responses.

The ball and the bat

An example of the sort of question Kahneman would ask of people was laid out in an article in The New Yorker. “A bat and ball cost $1.10,” the problem begins. “The bat costs $1 more than the ball. How much does the ball cost?”

The question seems easy: the ball costs ten cents, right? Most people say that, and they say it swiftly. But it’s wrong: if the ball cost ten cents, the bat would cost $1.10 and the pair would come to $1.20. The real answer is that the ball costs five cents, while the bat is $1.05. It’s obvious when you think about it properly.
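Here’s a quick check of that arithmetic in Python — an illustrative sketch, not anything from the original research.

```python
# Checking the bat-and-ball arithmetic; illustrative only.
ball = 0.05
bat = ball + 1.00                         # the bat costs exactly $1 more than the ball
print(round(ball + bat, 2))               # 1.1 -- together they cost $1.10

# The intuitive answer fails the same test:
wrong_ball = 0.10
wrong_bat = wrong_ball + 1.00
print(round(wrong_ball + wrong_bat, 2))   # 1.2 -- ten cents would make the total $1.20
```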

Challenging assumptions

For hundreds of years, countless social scientists, philosophers, and economists have worked off the assumption that humans are rational creatures. One hears the argument all the time, but the evidence suggests that quite the opposite is the case.

Dr. Kahneman, along with other experts such as Shane Frederick — who came up with the bat-and-ball question — and the late Amos Tversky, worked to make that fact plain. People are prone to irrationality.

Mental shortcuts

The argument that people are actually prone to irrationality says that, when faced with a complicated, uncertain scenario, individuals tend not to think through the information available to them very carefully. Nor, for that matter, do they seek out more information or statistics that might actually shed light on the situation.

Instead, they take a series of “mental shortcuts” that require less effort than thinking a situation through carefully. As in the bat-and-ball pricing exercise, people keen to avoid deep consideration quickly jump to a conclusion that seems obvious but is actually wrong.

The psychology of stupidity

Dr. Kahneman’s insight on this subject has been profoundly influential, so much so that The New Yorker claimed he “is now widely recognized as one of the most influential psychologists of the 20th century.” Still, that wasn’t always so. During the earlier stages of his career, his ideas were rejected.

In a 2011 conversation with The Guardian, Kahneman himself told a story about one “well-known American philosopher” sniffing at the sort of work he was doing. “I am not really interested,” this philosopher had supposedly said, “in the psychology of stupidity.”

An ironic position

The irony, of course, is that this unnamed snobby philosopher is the one who now looks stupid. A lot of research now backs up the thinking of Dr. Kahneman, including one study that has appeared in the Journal of Personality and Social Psychology.

This piece of work was headed up by Richard West of James Madison University and Keith Stanovich of the University of Toronto. Together, they laid out a convincing argument for why, contrary to widespread expectations, intelligence can often equate to increased bias.

Classic bias problems

There were 482 undergraduates taking part in Dr. West and Dr. Stanovich’s study, each of whom was asked a series of questions. These consisted of what The New Yorker termed “classic bias problems.” Let’s look at one to get an idea of what sort of problems were put to the respondents.

“In a lake, there is a patch of lily pads,” the problem begins. “Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake?”

A wrong answer

Do you know the answer? A lot of people are prone to taking a mental shortcut when faced with this problem: they divide 48 by 2 to get 24 and conclude that it takes 24 days for the lake to become half-covered. They’re wrong.

The real answer is that it takes 47 days. That might seem wild at first, but think about it carefully and the penny will drop: because the patch doubles in size every day, it must have covered half the lake exactly one day before it covered the whole thing.
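If it helps, here’s a tiny Python sketch that runs the doubling backwards from day 48. It’s illustrative only and not part of the study.

```python
# Working backwards from full coverage on day 48; illustrative only.
coverage = 1.0      # treat full coverage of the lake as 1.0
day = 48
while coverage > 0.5:
    coverage /= 2   # the patch doubles daily, so step one day back by halving
    day -= 1
print(day)          # prints 47: the lake was half-covered the day before it was full
```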

Anchoring bias

Dr. West also included a problem to test for “anchoring bias,” which is a concept that Dr. Kahneman and Dr. Tversky described in the ’70s. The New Yorker piece laid out the problem, noting, “Subjects were first asked if the tallest redwood tree in the world was more than X feet, with X ranging from 85 to 1,000 feet. Then the students were asked to estimate the height of the tallest redwood tree in the world.

“Students exposed to a small ‘anchor’ — like 85 feet — guessed, on average, that the tallest tree in the world was only 118 feet. Given an anchor of 1,000 feet, their estimates increased seven-fold.”

Disturbing findings

All of this is very interesting, but Dr. West, Dr. Stanovich, and other people they worked with wanted to learn more than this. It was all well and good to analyze the various biases that people were prone to, but that work had already been done many times before. They wanted to work out, specifically, how a person’s intelligence influenced those biases.

They consequently made sure to take note of participants’ intelligence, as measured by tests such as the SAT and the Need for Cognition Scale, which measures “the tendency for an individual to engage in and enjoy thinking.” Their findings, as The New Yorker noted, were “quite disturbing.”

Self-awareness isn’t helpful

The researchers observed that “people who were aware of their own biases were not better able to overcome them.” To the layman, that might seem like a really surprising finding: surely self-awareness would help counter cognitive bias? Well, apparently not.  

Dr. Kahneman himself once observed this fact in himself. In his 2011 book Thinking, Fast and Slow, he wrote, “My intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues.”

The bias blind spot

One of the more troublesome biases to which people are prone, according to Dr. West, is what he called the “bias blind spot,” which is when we automatically assume other people are more prone to making thinking errors than we are. The New Yorker piece explains, “This ‘meta-bias’ is rooted in our ability to spot systematic mistakes in the decisions of others — we excel at noticing the flaws of friends — and inability to spot those same mistakes in ourselves.”

“Although the bias blind spot itself isn’t a new concept… [Dr.] West’s latest paper demonstrates that it applies to every single bias under consideration, from anchoring to so-called ‘framing effects.’ In each instance, we readily forgive our own minds but look harshly upon the minds of other people.”

A not-so-good education?

And, it seems, intelligence makes people more prone to succumbing to the bias blind spot. According to Dr. West’s research, intelligent individuals — intelligent insofar as they score highly on the SAT — are a little more likely than others to fall into flawed, biased ways of thinking.

It’s worth noting that Dr. Kahneman and Dr. Frederick once observed that a “good” education doesn’t necessarily produce unbiased thinkers. In fact, they found that more than half of the M.I.T., Princeton, and Harvard students to whom they posed the bat-and-ball question got it wrong!

Evaluating others

The New Yorker piece offered some theories as to why this might be. “One provocative hypothesis,” it suggested, “is that the bias blind spot arises because of a mismatch between how we evaluate others and how we evaluate ourselves. When considering the irrational choices of a stranger, for instance, we are forced to rely on behavioral information; we see their biases from the outside, which allows us to glimpse their systematic thinking errors.”

“However, when assessing our own bad choices, we tend to engage in elaborate introspection. We scrutinize our motivations and search for relevant reasons; we lament our mistakes to therapists and ruminate on the beliefs that led us astray.”

Introspection compounds the error

According to this theory, there’s a major problem with that introspective manner of analysis. Our biases, which are the main drivers of irrational thinking, tend to be unconscious. When we attempt to reflect upon our own thoughts, feelings, and behaviors, then, we tend to miss those unconscious biases.

“In fact,” The New Yorker piece argues, “introspection can actually compound the error, blinding us to those primal processes responsible for many of our everyday failings. We spin eloquent stories, but these stories miss the point. The more we attempt to know ourselves, the less we actually understand.”

Believing in fairies

All of this research stands in opposition to assumptions that lots of people hold about the nature of intelligence and how it affects those who possess it. Smart people, it’s presumed, are able to recognize their biases and rise above them. We think they’re best able to assess information from a neutral position and, consequently, to work out the best solution to a problem.

That, of course, may not actually be the truth. There are countless examples of otherwise demonstrably smart people succumbing to biases in strange ways. Sir Arthur Conan Doyle, for example, believed in fairies. That may well have had something to do with the fact that he was grieving the loss of his son, who had died in the war.

A predisposition for conformity

People we might traditionally think of as intelligent — say, doctors, academics, and scientists — can often exhibit a predisposition for conformity. To be sure, that can be a great thing in certain instances. After all, would you want a doctor who didn’t play by the rules to treat you?

But in certain situations, it can mean such intellectuals tend to follow the crowd, without thinking critically about something. There’s an incentive for them to do so, as their careers often stand to benefit from going with the flow.

Only an intellectual…

Everyone has their biases, regardless of the ostensible level of their intelligence. Human beings just tend to understand the world in ways that align with the beliefs they already hold. It’s universal, but it really does seem like the situation is even more pronounced in the case of smart people.

The situation calls to mind a quotation that has been attributed variously to both author George Orwell and esteemed thinker Bertrand Russell. Regardless of its provenance, it seems fitting here. It runs, “There are some ideas so absurd that only an intellectual could believe them.”