Humanities LibreTexts

2.2: Overcoming Cognitive Biases and Engaging in Critical Reflection


  • Nathan Smith et al.


Learning Objectives

By the end of this section, you will be able to:

  • Label the conditions that make critical thinking possible.
  • Classify and describe cognitive biases.
  • Apply critical reflection strategies to resist cognitive biases.

To resist the potential pitfalls of cognitive biases, we have taken some time to recognize why we fall prey to them. Now we need to understand how to resist easy, automatic, and error-prone thinking in favor of more reflective, critical thinking.

Critical Reflection and Metacognition

To promote good critical thinking, put yourself in a frame of mind that allows critical reflection. Recall from the previous section that rational thinking requires effort and takes longer. However, it will likely result in more accurate thinking and decision-making. As a result, reflective thought can be a valuable tool in correcting cognitive biases. The critical aspect of critical reflection involves a willingness to be skeptical of your own beliefs, your gut reactions, and your intuitions. Additionally, the critical aspect engages in a more analytic approach to the problem or situation you are considering. You should assess the facts, consider the evidence, try to employ logic, and resist the quick, immediate, and likely conclusion you want to draw. By reflecting critically on your own thinking, you can become aware of the natural tendency for your mind to slide into mental shortcuts.

This process of critical reflection is often called metacognition in the literature of pedagogy and psychology. Metacognition means thinking about thinking and involves the kind of self-awareness that engages higher-order thinking skills. Cognition, or the way we typically engage with the world around us, is first-order thinking, while metacognition is higher-order thinking. From a metacognitive frame, we can critically assess our thought process, become skeptical of our gut reactions and intuitions, and reconsider our cognitive tendencies and biases.

To improve metacognition and critical reflection, we need to encourage the kind of self-aware, conscious, and effortful attention that may feel unnatural and may be tiring. Typical activities associated with metacognition include checking, planning, selecting, inferring, self-interrogating, interpreting an ongoing experience, and making judgments about what one does and does not know (Hacker, Dunlosky, and Graesser 1998). By practicing metacognitive behaviors, you are preparing yourself to engage in the kind of rational, abstract thought that will be required for philosophy.

Good study habits, including managing your workspace, giving yourself plenty of time, and working through a checklist, can promote metacognition. When you feel stressed out or pressed for time, you are more likely to make quick decisions that lead to error. Stress and lack of time also discourage critical reflection because they rob your brain of the resources necessary to engage in rational, attention-filled thought. By contrast, when you relax and give yourself time to think through problems, you will be clearer, more thoughtful, and less likely to rush to the first conclusion that leaps to mind. Similarly, background noise, distracting activity, and interruptions will prevent you from paying attention. You can use this checklist to try to encourage metacognition when you study:

  • Check your work.
  • Plan ahead.
  • Select the most useful material.
  • Infer from your past grades to focus on what you need to study.
  • Ask yourself how well you understand the concepts.
  • Check your weaknesses.
  • Assess whether you are following the arguments and claims you are working on.

Cognitive Biases

In this section, we will examine some of the most common cognitive biases so that you can be aware of traps in thought that can lead you astray. Cognitive biases are closely related to informal fallacies. Both fallacies and biases provide examples of the ways we make errors in reasoning.


See the chapter on logic and reasoning for an in-depth exploration of informal fallacies.

Watch the video to orient yourself before reading the text that follows.

Cognitive Biases 101, with Peter Bauman


Confirmation Bias

One of the most common cognitive biases is confirmation bias, which is the tendency to search for, interpret, favor, and recall information that confirms or supports your prior beliefs. Like all cognitive biases, confirmation bias serves an important function. For instance, one of the most reliable forms of confirmation bias is the belief in our shared reality. Suppose it is raining. When you first hear the patter of raindrops on your roof or window, you may think it is raining. You then look for additional signs to confirm your conclusion, and when you look out the window, you see rain falling and puddles of water accumulating. Most likely, you will not be looking for irrelevant or contradictory information. You will be looking for information that confirms your belief that it is raining. Thus, you can see how confirmation bias—based on the idea that the world does not change dramatically over time—is an important tool for navigating in our environment.

Unfortunately, as with most heuristics, we tend to apply this sort of thinking inappropriately. One example that has recently received a lot of attention is the way in which confirmation bias has increased political polarization. When searching for information on the internet about an event or topic, most people look for information that confirms their prior beliefs rather than what undercuts them. The pervasive presence of social media in our lives is exacerbating the effects of confirmation bias since the computer algorithms used by social media platforms steer people toward content that reinforces their current beliefs and predispositions. These multimedia tools are especially problematic when our beliefs are incorrect (for example, they contradict scientific knowledge) or antisocial (for example, they support violent or illegal behavior). Thus, social media and the internet have created a situation in which confirmation bias can be “turbocharged” in ways that are destructive for society.

Confirmation bias is a result of the brain’s limited ability to process information. Peter Wason (1960) conducted early experiments identifying this kind of bias. He asked subjects to identify the rule that applies to a sequence of numbers—for instance, 2, 4, 6. Subjects were told to generate examples to test their hypothesis. He found that once a subject settled on a particular hypothesis, they were much more likely to select examples that confirmed their hypothesis rather than negated it. As a result, they were unable to identify the real rule (any ascending sequence of numbers) and failed to “falsify” their initial assumptions. Falsification is an important tool in the scientist’s toolkit when they are testing hypotheses and is an effective way to avoid confirmation bias.
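The asymmetry Wason observed can be sketched in a few lines of Python. The hidden rule below is the real one from his study (any strictly ascending triple); the subject's "add 2" hypothesis and the particular test triples are illustrative assumptions, not taken from his paper:

```python
def fits_rule(triple):
    """Wason's actual hidden rule: any strictly ascending triple."""
    a, b, c = triple
    return a < b < c

# Suppose a subject sees 2, 4, 6 and hypothesizes "each number is 2 more
# than the last." Confirming tests satisfy both the hypothesis and the
# hidden rule, so they can never reveal that the hypothesis is too narrow.
confirming_tests = [(4, 6, 8), (10, 12, 14)]
print([fits_rule(t) for t in confirming_tests])  # [True, True] -- uninformative

# A falsifying test deliberately violates the hypothesis. If it still fits
# the hidden rule, the hypothesis must be wrong. Wason's subjects rarely
# tried tests like this.
print(fits_rule((1, 2, 3)))  # True -- so "add 2" cannot be the rule
```

The point of the sketch is that only the last kind of test carries information: a triple that confirms your hypothesis was already predicted by it, while a triple that breaks your hypothesis yet fits the rule forces you to revise.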

In philosophy, you will be presented with different arguments on issues, such as the nature of the mind or the best way to act in a given situation. You should take your time to reason through these issues carefully and consider alternative views. What you believe to be the case may be right, but you may also fall into the trap of confirmation bias, seeing confirming evidence as better and more convincing than evidence that calls your beliefs into question.

Anchoring Bias

Confirmation bias is closely related to another bias known as anchoring. Anchoring bias refers to our tendency to rely on initial values, prices, or quantities when estimating the actual value, price, or quantity of something. If you are presented with a quantity, even if that number is clearly arbitrary, you will have a hard time discounting it in your subsequent calculations; the initial value “anchors” subsequent estimates. For instance, Tversky and Kahneman (1974) reported an experiment in which subjects were asked to estimate the number of African nations in the United Nations. First, the experimenters spun a wheel of fortune in front of the subjects that produced a random number between 0 and 100. Let’s say the wheel landed on 79. Subjects were asked whether the number of nations was higher or lower than the random number. Subjects were then asked to estimate the real number of nations. Even though the initial anchoring value was random, people in the study found it difficult to deviate far from that number. For subjects receiving an initial value of 10, the median estimate of nations was 25, while for subjects receiving an initial value of 65, the median estimate was 45.

In the same paper, Tversky and Kahneman described the way that anchoring bias interferes with statistical reasoning. In a number of scenarios, subjects made irrational judgments about statistics because of the way the question was phrased (i.e., they were tricked when an anchor was inserted into the question). Instead of expending the cognitive energy needed to solve the statistical problem, subjects were much more likely to “go with their gut,” or think intuitively. That type of reasoning generates anchoring bias. When you do philosophy, you will be confronted with some formal and abstract problems that will challenge you to engage in thinking that feels difficult and unnatural. Resist the urge to latch on to the first thought that jumps into your head, and try to think the problem through with all the cognitive resources at your disposal.

Availability Heuristic

The availability heuristic refers to the tendency to evaluate new information based on the most recent or most easily recalled examples. The availability heuristic occurs when people take easily remembered instances as being more representative than they objectively are (i.e., based on statistical probabilities). In very simple situations, the availability of instances is a good guide to judgments. Suppose you are wondering whether you should plan for rain. It may make sense to anticipate rain if it has been raining a lot in the last few days since weather patterns tend to linger in most climates. More generally, scenarios that are well-known to us, dramatic, recent, or easy to imagine are more available for retrieval from memory. Therefore, if we easily remember an instance or scenario, we may incorrectly think that the chances are high that the scenario will be repeated. For instance, people in the United States estimate the probability of dying by violent crime or terrorism much more highly than they ought to. In fact, these are extremely rare occurrences compared to death by heart disease, cancer, or car accidents. But stories of violent crime and terrorism are prominent in the news media and fiction. Because these vivid stories are dramatic and easily recalled, we have a skewed view of how frequently violent crime occurs.

Tribalism

Another more loosely defined category of cognitive bias is the tendency for human beings to align themselves with groups with whom they share values and practices. The tendency toward tribalism is an evolutionary advantage for social creatures like human beings. By forming groups to share knowledge and distribute work, we are much more likely to survive. Not surprisingly, human beings with pro-social behaviors persist in the population at higher rates than human beings with antisocial tendencies. Pro-social behaviors, however, go beyond wanting to communicate and align ourselves with other human beings; we also tend to see outsiders as a threat. As a result, tribalistic tendencies both reinforce allegiances among in-group members and increase animosity toward out-group members.

Tribal thinking makes it hard for us to objectively evaluate information that either aligns with or contradicts the beliefs held by our group or tribe. This effect can be demonstrated even when in-group membership is not real or is based on some superficial feature of the person—for instance, the way they look or an article of clothing they are wearing. A related bias is called the bandwagon fallacy. The bandwagon fallacy can lead you to conclude that you ought to do something or believe something because many other people do or believe the same thing. While other people can provide guidance, they are not always reliable. Furthermore, just because many people believe something doesn’t make it true. Watch the video below to improve your “tribal literacy” and understand the dangers of this type of thinking.

The Dangers of Tribalism, Kevin deLaplante

Sunk Cost Fallacy

Sunk costs refer to the time, energy, money, or other costs that have been paid in the past. These costs are “sunk” because they cannot be recovered. The sunk cost fallacy is the tendency to attach a greater value to things in which you have already invested resources than those things are worth today. Human beings have a natural tendency to hang on to whatever they invest in and are loath to give something up even after it has been proven to be a liability. For example, a person may have sunk a lot of money into a business over time, and the business may clearly be failing. Nonetheless, the businessperson will be reluctant to close shop or sell the business because of the time, money, and emotional energy they have spent on the venture. This is the behavior of “throwing good money after bad”: continuing to invest irrationally in something that has lost its worth because of emotional attachment to the failed enterprise. People will engage in this kind of behavior in all kinds of situations and may continue a friendship, a job, or a marriage for the same reason—they don’t want to lose their investment even when they are clearly headed for failure and ought to cut their losses.

A similar type of faulty reasoning leads to the gambler’s fallacy, in which a person reasons that future chance events will be more likely if they have not happened recently. For instance, if I flip a coin many times in a row, I may get a string of heads. But even if I flip several heads in a row, that does not make it more likely I will flip tails on the next coin flip. Each coin flip is statistically independent, and there is an equal chance of turning up heads or tails. The gambler, like the reasoner from sunk costs, is tied to the past when they should be reasoning about the present and future.
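The independence claim is easy to check with a quick simulation (a sketch only; the fair coin, the number of flips, and the streak length of three are illustrative assumptions):

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# Simulate 100,000 fair coin flips (True = heads).
flips = [random.random() < 0.5 for _ in range(100_000)]

# Collect the outcome of the flip that follows each run of three heads.
next_after_streak = [
    flips[i + 3]
    for i in range(len(flips) - 3)
    if flips[i] and flips[i + 1] and flips[i + 2]
]

# The frequency of heads after a streak stays close to 0.5: past flips
# give the gambler no information about the next one.
rate = sum(next_after_streak) / len(next_after_streak)
print(f"P(heads after three heads) ~ {rate:.2f}")
```

However long the streak you condition on, the conditional frequency hovers around one half, which is exactly what statistical independence means.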

There are important social and evolutionary purposes for past-looking thinking. Sunk-cost thinking keeps parents engaged in the growth and development of their children after they are born. Sunk-cost thinking builds loyalty and affection among friends and family. More generally, a commitment to sunk costs encourages us to engage in long-term projects, and this type of thinking has the evolutionary purpose of fostering culture and community. Nevertheless, it is important to periodically reevaluate our investments in both people and things.

In recent ethical scholarship, there is some debate about how to assess the sunk costs of moral decisions. Consider the case of war. Just-war theory dictates that wars may be justified in cases where the harm imposed on the adversary is proportional to the good gained by the act of defense or deterrence. It may be that, at the start of the war, those costs seemed proportional. But after the war has dragged on for some time, it may seem that the objective cannot be obtained without a greater quantity of harm than had been initially imagined. Should the evaluation of whether a war is justified estimate the total amount of harm done or prospective harm that will be done going forward (Lazar 2018)? Such questions do not have easy answers.

Table 2.1 summarizes these common cognitive biases.

Table 2.1 Common Cognitive Biases

Think Like A Philosopher

As we have seen, cognitive biases are built into the way human beings process information. They are common to us all, and it takes self-awareness and effort to overcome the tendency to fall back on biases. Consider a time when you have fallen prey to one of the five cognitive biases described above. What were the circumstances? Recall your thought process. Were you aware at the time that your thinking was misguided? What were the consequences of succumbing to that cognitive bias?

Write a short paragraph describing how that cognitive bias allowed you to make a decision you now realize was irrational. Then write a second paragraph describing how, with the benefit of time and distance, you would have thought differently about the incident that triggered the bias. Use the tools of critical reflection and metacognition to improve your approach to this situation. What might have been the consequences of behaving differently? Finally, write a short conclusion describing what lesson you take from reflecting back on this experience. Does it help you understand yourself better? Will you be able to act differently in the future? What steps can you take to avoid cognitive biases in your thinking today?

Cognitive Bias: How We Are Wired to Misjudge

Charlotte Ruhl


Have you ever been so busy talking on the phone that you don’t notice the light has turned green and it is your turn to cross the street?

Have you ever shouted, “I knew that was going to happen!” after your favorite baseball team gave up a huge lead in the ninth inning and lost?

Or have you ever found yourself only reading news stories that further support your opinion?

These are just a few of the many instances of cognitive bias that we experience every day of our lives. But before we dive into these different biases, let’s backtrack first and define what bias is.


What is Cognitive Bias?

Cognitive bias is a systematic error in thinking, affecting how we process information, perceive others, and make decisions. It can lead to irrational thoughts or judgments and is often based on our perceptions, memories, or individual and societal beliefs.

Biases are unconscious and automatic processes designed to make decision-making quicker and more efficient. Cognitive biases can be caused by many things, such as heuristics (mental shortcuts), social pressures, and emotions.

Broadly speaking, bias is a tendency to lean in favor of or against a person, group, idea, or thing, usually in an unfair way. Biases are natural, a product of human nature, and they don’t simply exist in a vacuum or in our minds: they affect the way we make decisions and act.

In psychology, there are two main branches of biases: conscious and unconscious. Conscious or explicit bias is intentional — you are aware of your attitudes and the behaviors resulting from them (Lang, 2019).

Explicit bias can be good because it helps provide you with a sense of identity and can lead you to make good decisions (for example, being biased towards healthy foods).

However, these biases can often be dangerous when they take the form of conscious stereotyping.

On the other hand, unconscious bias, or cognitive bias, represents a set of unintentional biases — you are unaware of your attitudes and behaviors resulting from them (Lang, 2019).

Cognitive bias is often a result of your brain’s attempt to simplify information processing: we receive roughly 11 million bits of information per second but can only process about 40 bits of information per second (Orzan et al., 2012).

Therefore, we often rely on mental shortcuts (called heuristics) to help make sense of the world with relative speed. As such, these errors tend to arise from problems related to thinking: memory, attention, and other mental mistakes.

Cognitive biases can be beneficial because they do not require much mental effort and can allow you to make decisions relatively quickly, but like conscious biases, unconscious biases can also take the form of harmful prejudice that serves to hurt an individual or a group.

Although it may feel like there has been a recent rise of unconscious bias, especially in the context of police brutality and the Black Lives Matter movement, this is not a new phenomenon.

Thanks to Tversky and Kahneman (and several other psychologists who have paved the way), we now have an existing dictionary of our cognitive biases.

Again, these biases occur as an attempt to simplify the complex world and make information processing faster and easier. This section will dive into some of the most common forms of cognitive bias.


Confirmation Bias

Confirmation bias is the tendency to interpret new information as confirmation of your preexisting beliefs and opinions while giving disproportionately less consideration to alternative possibilities.

Real-World Examples

Since Wason’s 1960 experiment, real-world examples of confirmation bias have gained attention.

This bias often seeps into the research world when psychologists selectively interpret data or ignore unfavorable data to produce results that support their initial hypothesis.

Confirmation bias is also incredibly pervasive on the internet, particularly with social media. We tend to read online news articles that support our beliefs and fail to seek out sources that challenge them.

Various social media platforms, such as Facebook, help reinforce our confirmation bias by feeding us stories that we are likely to agree with – further pushing us down these echo chambers of political polarization.

Some examples of confirmation bias are especially harmful, specifically in the context of the law. For example, a detective may identify a suspect early in an investigation, seek out confirming evidence, and downplay falsifying evidence.


The confirmation bias dates back to 1960 when Peter Wason challenged participants to identify a rule applying to triples of numbers.

People were first told that the sequence 2, 4, 6 fits the rule, and they then had to generate triples of their own and were told whether each sequence fit the rule. The rule was simple: any ascending sequence.

Not only did participants have an unusually difficult time realizing this, instead devising overly complicated hypotheses, but they also generated only triples that confirmed their preexisting hypothesis (Wason, 1960).


But why does confirmation bias occur? It’s partially due to the effect of desire on our beliefs. In other words, certain desired conclusions (ones that support our beliefs) are more likely to be processed by the brain and labeled as true (Nickerson, 1998).

This motivational explanation is often coupled with a more cognitive theory.

The cognitive explanation argues that because our minds can only focus on one thing at a time, it is hard to parallel process (see information processing for more information) alternate hypotheses, so, as a result, we only process the information that aligns with our beliefs (Nickerson, 1998).

Another theory explains confirmation bias as a way of enhancing and protecting our self-esteem.

As with the self-serving bias (see more below), our minds choose to reinforce our preexisting ideas because being right helps preserve our sense of self-esteem, which is important for feeling secure in the world and maintaining positive relationships (Casad, 2019).

Although confirmation bias has obvious consequences, you can still work towards overcoming it by being open-minded and willing to look at situations from a different perspective than you might be used to (Luippold et al., 2015).

Even though this bias is unconscious, training your mind to become more flexible in its thought patterns will help mitigate the effects of this bias.

Hindsight Bias

Hindsight bias refers to the tendency to perceive past events as more predictable than they actually were (Roese & Vohs, 2012). There are cognitive and motivational explanations for why we ascribe so much certainty to knowing the outcome of an event only once the event is completed.

Hindsight Bias Example

When sports fans know the outcome of a game, they often question certain decisions coaches make that they otherwise would not have questioned or second-guessed.

And fans are also quick to remark that they knew their team was going to win or lose, but, of course, they only make this statement after their team actually did win or lose.

Although research studies have demonstrated that the hindsight bias isn’t necessarily mitigated by pure recognition of the bias (Pohl & Hell, 1996), you can still make a conscious effort to remind yourself that you can’t predict the future and motivate yourself to consider alternate explanations.

It’s important to do all we can to reduce this bias because when we are overly confident about our ability to predict outcomes, we might make future risky decisions that could have potentially dangerous outcomes.

Building on Tversky and Kahneman’s growing list of heuristics, researchers Baruch Fischhoff and Ruth Beyth-Marom (1975) were the first to directly investigate the hindsight bias in the empirical setting.

The team asked participants to judge the likelihood of several different outcomes of former U.S. president Richard Nixon’s visit to Beijing and Moscow.

After Nixon returned to the States, participants were asked to recall the likelihood of each outcome they had initially assigned.

Fischhoff and Beyth found that for events that actually occurred, participants greatly overestimated the initial likelihood they assigned to those events.

That same year, Fischhoff (1975) introduced a new method for testing the hindsight bias – one that researchers still use today.

Participants are given a short story with four possible outcomes, and they are told that one is true. When they are then asked to assign the likelihood of each specific outcome, they regularly assign a higher likelihood to whichever outcome they have been told is true, regardless of how likely it actually is.

But hindsight bias does not only exist in artificial settings. In 1993, Dorothee Dietrich and Matthew Olsen asked college students to predict how the U.S. Senate would vote on the confirmation of Supreme Court nominee Clarence Thomas.

Before the vote, 58% of participants predicted that he would be confirmed, but after his actual confirmation, 78% of students said that they thought he would be approved – a prime example of the hindsight bias. And this form of bias extends beyond the research world.

From the cognitive perspective, hindsight bias may result from distortions of memories of what we knew or believed to know before an event occurred (Inman, 2016).

It is easier to recall information that is consistent with our current knowledge, so our memories become warped in a way that agrees with what actually did happen.

Motivational explanations of the hindsight bias point to the fact that we are motivated to live in a predictable world (Inman, 2016).

When surprising outcomes arise, our expectations are violated, and we may experience negative reactions as a result. Thus, we rely on the hindsight bias to avoid these adverse responses to certain unanticipated events and reassure ourselves that we actually did know what was going to happen.

Self-Serving Bias

Self-serving bias is the tendency to take personal responsibility for positive outcomes and blame external factors for negative outcomes.

You would be right to ask how this is similar to the fundamental attribution error (Ross, 1977), which identifies our tendency to overemphasize internal factors for other people’s behavior while attributing external factors to our own.

The distinction is that the self-serving bias is concerned with valence, that is, how good or bad an event or situation is, and it applies only to events for which you are the actor.

In other words, if a driver cuts in front of you as the light turns green, the fundamental attribution error might cause you to think that they are a bad person and not consider the possibility that they were late for work.

On the other hand, the self-serving bias is exercised when you are the actor. In this example, you would be the driver cutting in front of the other car, which you would tell yourself is because you are late (an external attribution to a negative event) as opposed to it being because you are a bad person.

From sports to the workplace, self-serving bias is incredibly common. For example, athletes are quick to take responsibility for personal wins, attributing their successes to their hard work and mental toughness, but point to external factors, such as unfair calls or bad weather, when they lose (Allen et al., 2020).

In the workplace, people make internal attributions when they are hired for a job but external attributions when they are fired (Furnham, 1982). And in the office itself, workplace conflicts are given external attributions, while successes, whether a persuasive presentation or a promotion, are given internal explanations (Walther & Bazarova, 2007).

Additionally, self-serving bias is more prevalent in individualistic cultures, which place emphasis on self-esteem and individual goals, and it is less prevalent among individuals with depression (Mezulis et al., 2004), who are more likely to take responsibility for negative outcomes.

Overcoming this bias can be difficult because doing so comes at the expense of our self-esteem. Nevertheless, practicing self-compassion – treating yourself with kindness even when you fall short or fail – can help reduce the self-serving bias (Neff, 2003).

The leading explanation for the self-serving bias is that it is a way of protecting our self-esteem (similar to one of the explanations for the confirmation bias).

We are quick to take credit for positive outcomes and divert the blame for negative ones to boost and preserve our individual ego, which is necessary for confidence and healthy relationships with others (Heider, 1982).

Another theory argues that self-serving bias occurs when surprising events arise. When certain outcomes run counter to our expectations, we ascribe external factors, but when outcomes are in line with our expectations, we attribute internal factors (Miller & Ross, 1975).

An extension of this theory asserts that we are naturally optimistic, so negative outcomes come as a surprise and receive external attributions as a result.

Anchoring Bias

Anchoring bias is closely related to the decision-making process. It occurs when we rely too heavily on either pre-existing information or the first piece of information (the anchor) when making a decision.

For example, if you first see a T-shirt that costs $1,000 and then see a second one that costs $100, you're more likely to see the second shirt as cheap than you would if the first shirt you saw cost $120. Here, the price of the first shirt influences how you view the second.

Anchoring Bias Example

Sarah is looking to buy a used car. The first dealership she visits has a used sedan listed for $19,000. Sarah takes this initial listing price as an anchor and uses it to evaluate prices at other dealerships.

When she sees another similar used sedan priced at $18,000, that price seems like a good bargain compared to the $19,000 anchor price she saw first, even though the actual market value is closer to $16,000.

When Sarah finds a comparable used sedan priced at $15,500, she continues perceiving that price as cheap compared to her anchored reference price.

Ultimately, Sarah purchases the $18,000 sedan, overlooking that all of the prices seemed like bargains only in relation to the initial high anchor price.

The key elements that demonstrate anchoring bias here are:

  • Sarah establishes an initial reference price based on the first listing she sees ($19k)
  • She uses that initial price as her comparison/anchor for evaluating subsequent prices
  • This biases her perception of the market value of the cars she looks at after the initial anchor is set
  • She makes a purchase decision aligned with her anchored expectations rather than a more objective market value
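The arithmetic behind Sarah's comparisons can be made explicit in a few lines of Python. The prices are the ones from the example above; the "discount relative to the anchor" framing is an illustrative simplification of ours, not a formal model of anchoring:

```python
# Prices from the example above; the comparison framing is an
# illustrative simplification, not a formal model of anchoring.
anchor = 19_000        # first listing Sarah sees
market_value = 16_000  # approximate actual market value

for price in (18_000, 15_500):
    vs_anchor = anchor - price        # how much cheaper it looks than the anchor
    vs_market = market_value - price  # the real saving (negative = above market)
    print(f"${price:,}: ${vs_anchor:,} below the anchor, "
          f"${vs_market:,} relative to market value")
```

Run against these numbers, the $18,000 sedan looks $1,000 "cheaper" than the anchor while actually sitting $2,000 above market value, which is exactly the mistake the example describes.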

Multiple theories seek to explain the existence of this bias.

One theory, known as anchoring and adjustment, argues that once an anchor is established, people insufficiently adjust away from it to arrive at their final answer, and so their final guess or decision is closer to the anchor than it otherwise would have been (Tversky & Kahneman, 1992).

And when people experience a greater cognitive load (the amount of information the working memory can hold at any given time; for example, a difficult decision as opposed to an easy one), they are more susceptible to the effects of anchoring.

Another theory, selective accessibility, holds that although we assume the anchor is not a suitable answer (or, returning to the earlier example, a suitable price), when we evaluate the second stimulus (the second shirt) we look for ways in which it is similar to or different from the anchor, resulting in the anchoring effect (Mussweiler & Strack, 1999).

A final theory posits that providing an anchor changes someone’s attitudes to be more favorable to the anchor, which then biases future answers to have similar characteristics as the initial anchor.

Although there are many different theories for why we experience anchoring bias, they all agree that it affects our decisions in real ways (Wegener et al., 2001).

The first study that brought this bias to light was during one of Tversky and Kahneman’s (1974) initial experiments. They asked participants to compute the product of numbers 1-8 in five seconds, either as 1x2x3… or 8x7x6…

Participants did not have enough time to calculate the answer, so they had to estimate based on their first few calculations.

They found that those who computed the small multiplications first (i.e., 1x2x3…) gave a median estimate of 512, but those who computed the larger multiplications first gave a median estimate of 2,250 (although the actual answer is 40,320).

This demonstrates how the initial few calculations influenced the participant’s final answer.
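The arithmetic of that estimation task is easy to check. The sketch below recomputes the true product and restates the median estimates reported in the text:

```python
import math

# The true answer to 1 x 2 x ... x 8, which participants had to estimate.
true_product = math.prod(range(1, 9))
print(true_product)  # 40320

# Median estimates reported in the study (from the text above).
ascending_estimate = 512    # group shown 1 x 2 x 3 ...
descending_estimate = 2250  # group shown 8 x 7 x 6 ...

# Both groups anchored on their first few partial products and adjusted
# too little, so both medians fall far below the true value, with the
# low-anchor (ascending) group lowest.
assert ascending_estimate < descending_estimate < true_product
```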

Availability Bias

Availability bias (also commonly referred to as the availability heuristic) refers to the tendency to think that examples of things that readily come to mind are more common than is actually the case.

In other words, information that comes to mind faster influences the decisions we make about the future. And just like with the hindsight bias, this bias is related to an error of memory.

But instead of being a memory fabrication, it is an overemphasis on a certain memory.

In the workplace, if someone is being considered for a promotion but their boss recalls one bad thing that happened years ago but left a lasting impression, that one event might have an outsized influence on the final decision.

Another common example is buying lottery tickets because the lifestyle and benefits of winning are more readily available in mind (and the potential emotions associated with winning or seeing other people win) than the complex probability calculation of actually winning the lottery (Cherry, 2019).
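The "complex probability calculation" that vivid images of winning let us skip is itself only a few lines of code. The text names no particular lottery, so the 6-of-49 draw below is a hypothetical assumption for illustration:

```python
import math

# Jackpot odds for a hypothetical 6-of-49 lottery (our assumption; the
# text above names no specific lottery format).
possible_tickets = math.comb(49, 6)
print(possible_tickets)      # 13983816 equally likely combinations
print(1 / possible_tickets)  # roughly a 1-in-14-million chance per ticket
```

The availability heuristic makes the easily imagined jackpot loom far larger than this roughly one-in-fourteen-million figure.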

A final common example used to demonstrate the availability heuristic describes how seeing several television shows or news reports about shark attacks (or anything sensationalized by the news, such as serial killers or plane crashes) might make you think such incidents are relatively common, even though they are actually quite rare.

Regardless, this thinking might make you less inclined to go in the water the next time you go to the beach (Cherry, 2019).

As with most cognitive biases, the best way to overcome them is by recognizing the bias and being more cognizant of your thoughts and decisions.

And because we fall victim to this bias when our brain relies on quick mental shortcuts in order to save time, slowing down our thinking and decision-making process is a crucial step to mitigating the effects of the availability heuristic.

Researchers think this bias occurs because the brain is constantly trying to minimize the effort necessary to make decisions, and so we rely on certain memories – ones that we can recall more easily – instead of having to endure the complicated task of calculating statistical probabilities.

Two main types of memories are easier to recall: 1) those that more closely align with the way we see the world and 2) those that evoke more emotion and leave a more lasting impression.

This first type of memory was identified in 1973, when Tversky and Kahneman, our cognitive bias pioneers, conducted a study in which they asked participants whether more words begin with the letter K or more words have K as their third letter.

Although many more words have K as their third letter, 70% of participants said that more words begin with K, because such words are not only easier to recall but also align more closely with the way they see the world (we know words by their first letter far more readily than by their third).
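The counting behind the letter-K task can be sketched as below. The word list is our own illustrative sample, not the corpus Tversky and Kahneman used; it is chosen so that, as in English generally, "K-third" words outnumber "K-first" words:

```python
# Toy version of the letter-K task. The word list is an illustrative
# sample of ours, not the researchers' corpus.
words = ["kitchen", "keen", "kite", "king",
         "ask", "make", "lake", "take", "bike", "like", "joke", "cake"]

starts_with_k = sum(w[0] == "k" for w in words)
third_letter_k = sum(len(w) >= 3 and w[2] == "k" for w in words)
print(starts_with_k, third_letter_k)  # 4 8
```

Even in this small sample the third-letter words dominate, yet they are the harder ones to generate from memory, which is exactly the asymmetry the heuristic exploits.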

In terms of the second type of memory, the same duo ran an experiment in 1983, ten years later, in which half the participants were asked to estimate the likelihood that a massive flood would occur somewhere in North America, while the other half estimated the likelihood of a flood caused by an earthquake in California.

Although the latter is much less likely, participants rated it as much more probable because they could recall specific, emotionally charged events of earthquakes hitting California, largely due to the news coverage such events receive.

Together, these studies highlight how memories that are easier to recall greatly influence our judgments and perceptions about future events.

Inattentional Blindness

A final popular form of cognitive bias is inattentional blindness. This occurs when a person fails to notice a stimulus that is in plain sight because their attention is directed elsewhere.

For example, while driving a car, you might be so focused on the road ahead of you that you completely fail to notice a car swerve into your lane of traffic.

Because your attention is directed elsewhere, you aren’t able to react in time, potentially leading to a car accident. Experiencing inattentional blindness has its obvious consequences (as illustrated by this example), but, like all biases, it is not impossible to overcome.

Many theories seek to explain why we experience this form of cognitive bias. In reality, it is probably some combination of these explanations.

Conspicuity holds that certain sensory stimuli (such as bright colors) and cognitive stimuli (such as something familiar) are more likely to be processed, and so stimuli that don’t fit into one of these two categories might be missed.

The mental workload theory describes how when we focus a lot of our brain’s mental energy on one stimulus, we are using up our cognitive resources and won’t be able to process another stimulus simultaneously.

Similarly, some psychologists explain how we attend to different stimuli with varying levels of attentional capacity, which might affect our ability to process multiple stimuli simultaneously.

In other words, an experienced driver might be able to see the car swerve into the lane because they are using fewer mental resources to drive, whereas a beginner driver might be using more resources to focus on the road ahead and thus be unable to process the car swerving in.

A final explanation argues that because our attentional and processing resources are limited, our brain dedicates them to what fits into our schemas or our cognitive representations of the world (Cherry, 2020).

Thus, when an unexpected stimulus comes into our line of sight, we might not be able to process it on the conscious level. The following example illustrates how this might happen.

The most famous study to demonstrate the inattentional blindness phenomenon is the invisible gorilla study (Most et al., 2001). This experiment asked participants to watch a video of two groups passing a basketball and count how many times the white team passed the ball.

Participants are able to accurately report the number of passes, but what they fail to notice is a gorilla walking directly through the middle of the circle.

Because this would not be expected, and because our brain is using up its resources to count the number of passes, we completely fail to process something right before our eyes.

A real-world example of inattentional blindness occurred in 1995 when Boston police officer Kenny Conley was chasing a suspect and ran by a group of officers who were mistakenly holding down an undercover cop.

Conley was convicted of perjury and obstruction of justice because he supposedly saw the fight between the undercover cop and the other officers and lied about it to protect the officers, but he stood by his word that he really hadn’t seen it (due to inattentional blindness) and was ultimately exonerated (Pickel, 2015).

The key to overcoming inattentional blindness is to maximize your attention by avoiding distractions such as checking your phone. And it is also important to pay attention to what other people might not notice (if you are that driver, don’t always assume that others can see you).

By working on expanding your attention and minimizing unnecessary distractions that will use up your mental resources, you can work towards overcoming this bias.

Preventing Cognitive Bias

As we know, recognizing these biases is the first step to overcoming them. But there are other small strategies we can follow in order to train our unconscious mind to think in different ways.

From strengthening our memory and minimizing distractions to slowing down our decision-making and improving our reasoning skills, we can work towards overcoming these cognitive biases.

An individual can evaluate his or her own thought process, also known as metacognition (“thinking about thinking”), which provides an opportunity to combat bias (Flavell, 1979).

This multifactorial process involves (Croskerry, 2003):

  • acknowledging the limitations of memory,
  • seeking perspective while making decisions,
  • being able to self-critique, and
  • choosing strategies to prevent cognitive error.

Many strategies used to avoid bias that we describe are also known as cognitive forcing strategies, which are mental tools used to force unbiased decision-making.

The History of Cognitive Bias

The term cognitive bias was first coined in the 1970s by Israeli psychologists Amos Tversky and Daniel Kahneman, who used this phrase to describe people’s flawed thinking patterns in response to judgment and decision problems (Tversky & Kahneman, 1974).

Tversky and Kahneman’s research program, the heuristics and biases program, investigated how people make decisions given limited resources (for example, limited time to decide which food to eat or limited information to decide which house to buy).

As a result of these limited resources, people are forced to rely on heuristics or quick mental shortcuts to help make their decisions.

Tversky and Kahneman wanted to understand the biases associated with this judgment and decision-making process.

To do so, the two researchers relied on a research paradigm that presented participants with some type of reasoning problem with a computed normative answer (they used probability theory and statistics to compute the expected answer).

Participants’ responses were then compared with the predetermined solution to reveal the systematic deviations in the mind.

After running several experiments with countless reasoning problems, the researchers were able to identify numerous norm violations that result when our minds rely on these cognitive biases to make decisions and judgments (Wilke & Mata, 2012).

Key Takeaways

  • Cognitive biases are unconscious errors in thinking that arise from problems related to memory, attention, and other mental mistakes.
  • These biases result from our brain’s efforts to simplify the incredibly complex world in which we live.
  • Confirmation bias, hindsight bias, mere exposure effect, self-serving bias, base rate fallacy, anchoring bias, availability bias, the framing effect, inattentional blindness, and the ecological fallacy are some of the most common examples of cognitive bias. Another example is the false consensus effect.
  • Cognitive biases directly affect our safety, interactions with others, and how we make judgments and decisions in our daily lives.
  • Although these biases are unconscious, there are small steps we can take to train our minds to adopt a new pattern of thinking and mitigate the effects of these biases.

Allen, M. S., Robson, D. A., Martin, L. J., & Laborde, S. (2020). Systematic review and meta-analysis of self-serving attribution biases in the competitive context of organized sport. Personality and Social Psychology Bulletin, 46 (7), 1027-1043.

Casad, B. (2019). Confirmation bias . Retrieved from

Cherry, K. (2019). How the availability heuristic affects your decision-making . Retrieved from

Cherry, K. (2020). Inattentional blindness can cause you to miss things in front of you . Retrieved from

Dietrich, D., & Olson, M. (1993). A demonstration of hindsight bias using the Thomas confirmation vote. Psychological Reports, 72 (2), 377-378.

Fischhoff, B. (1975). Hindsight is not equal to foresight: The effect of outcome knowledge on judgment under uncertainty. Journal of Experimental Psychology: Human Perception and Performance, 1 (3), 288.

Fischhoff, B., & Beyth, R. (1975). I knew it would happen: Remembered probabilities of once—future things. Organizational Behavior and Human Performance, 13 (1), 1-16.

Furnham, A. (1982). Explanations for unemployment in Britain. European Journal of Social Psychology, 12 (4), 335-352.

Heider, F. (1982). The psychology of interpersonal relations . Psychology Press.

Inman, M. (2016). Hindsight bias . Retrieved from

Lang, R. (2019). What is the difference between conscious and unconscious bias? : Faqs. Retrieved from

Luippold, B., Perreault, S., & Wainberg, J. (2015). Auditor’s pitfall: Five ways to overcome confirmation bias . Retrieved from

Mezulis, A. H., Abramson, L. Y., Hyde, J. S., & Hankin, B. L. (2004). Is there a universal positivity bias in attributions? A meta-analytic review of individual, developmental, and cultural differences in the self-serving attributional bias. Psychological Bulletin, 130 (5), 711.

Miller, D. T., & Ross, M. (1975). Self-serving biases in the attribution of causality: Fact or fiction?. Psychological Bulletin, 82 (2), 213.

Most, S. B., Simons, D. J., Scholl, B. J., Jimenez, R., Clifford, E., & Chabris, C. F. (2001). How not to be seen: The contribution of similarity and selective ignoring to sustained inattentional blindness. Psychological Science, 12 (1), 9-17.

Mussweiler, T., & Strack, F. (1999). Hypothesis-consistent testing and semantic priming in the anchoring paradigm: A selective accessibility model. Journal of Experimental Social Psychology, 35 (2), 136-164.

Neff, K. (2003). Self-compassion: An alternative conceptualization of a healthy attitude toward oneself. Self and Identity, 2 (2), 85-101.

Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2 (2), 175-220.

Orzan, G., Zara, I. A., & Purcarea, V. L. (2012). Neuromarketing techniques in pharmaceutical drugs advertising. A discussion and agenda for future research. Journal of Medicine and Life, 5 (4), 428.

Pickel, K. L. (2015). Eyewitness memory. The handbook of attention , 485-502.

Pohl, R. F., & Hell, W. (1996). No reduction in hindsight bias after complete information and repeated testing. Organizational Behavior and Human Decision Processes, 67 (1), 49-58.

Roese, N. J., & Vohs, K. D. (2012). Hindsight bias. Perspectives on Psychological Science, 7 (5), 411-426.

Ross, L. (1977). The intuitive psychologist and his shortcomings: Distortions in the attribution process. In Advances in experimental social psychology (Vol. 10, pp. 173-220). Academic Press.

Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5 (2), 207-232.

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185 (4157), 1124-1131.

Tversky, A., & Kahneman, D. (1983). Extensional versus intuitive reasoning: The conjunction fallacy in probability judgment. Psychological Review , 90(4), 293.

Tversky, A., & Kahneman, D. (1992). Advances in prospect theory: Cumulative representation of uncertainty. Journal of Risk and Uncertainty, 5 (4), 297-323.

Walther, J. B., & Bazarova, N. N. (2007). Misattribution in virtual groups: The effects of member distribution on self-serving bias and partner blame. Human Communication Research, 33 (1), 1-26.

Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12 (3), 129-140.

Wegener, D. T., Petty, R. E., Detweiler-Bedell, B. T., & Jarvis, W. B. G. (2001). Implications of attitude change theories for numerical anchoring: Anchor plausibility and the limits of anchor effectiveness. Journal of Experimental Social Psychology, 37 (1), 62-69.

Wilke, A., & Mata, R. (2012). Cognitive bias. In Encyclopedia of human behavior (pp. 531-535). Academic Press.

Further Information

Test yourself for bias.

  • Project Implicit (IAT Test) From Harvard University
  • Implicit Association Test From the Social Psychology Network
  • Test Yourself for Hidden Bias From Teaching Tolerance
  • How the Concept of Implicit Bias Came Into Being With Dr. Mahzarin Banaji, Harvard University, author of Blindspot: Hidden Biases of Good People (5:28 minutes; includes transcript)
  • Understanding Your Racial Biases With John Dovidio, PhD, Yale University, from the American Psychological Association (11:09 minutes; includes transcript)
  • Talking Implicit Bias in Policing With Jack Glaser, Goldman School of Public Policy, University of California Berkeley (21:59 minutes)
  • Implicit Bias: A Factor in Health Communication With Dr. Winston Wong, Kaiser Permanente (19:58 minutes)
  • Bias, Black Lives and Academic Medicine Dr. David Ansell on Your Health Radio, August 1, 2015 (21:42 minutes)
  • Uncovering Hidden Biases Google talk with Dr. Mahzarin Banaji, Harvard University
  • Impact of Implicit Bias on the Justice System (9:14 minutes)
  • Students Speak Up: What Bias Means to Them (2:17 minutes)
  • Weight Bias in Health Care From Yale University (16:56 minutes)
  • Gender and Racial Bias in Facial Recognition Technology (4:43 minutes)

Journal Articles

  • An implicit bias primer Mitchell, G. (2018). An implicit bias primer. Virginia Journal of Social Policy & the Law , 25, 27–59.
  • Implicit Association Test at age 7: A methodological and conceptual review Nosek, B. A., Greenwald, A. G., & Banaji, M. R. (2007). The Implicit Association Test at age 7: A methodological and conceptual review. Automatic processes in social thinking and behavior, 4 , 265-292.
  • Implicit Racial/Ethnic Bias Among Health Care Professionals and Its Influence on Health Care Outcomes: A Systematic Review Hall, W. J., Chapman, M. V., Lee, K. M., Merino, Y. M., Thomas, T. W., Payne, B. K., … & Coyne-Beasley, T. (2015). Implicit racial/ethnic bias among health care professionals and its influence on health care outcomes: a systematic review. American journal of public health, 105 (12), e60-e76.
  • Reducing Racial Bias Among Health Care Providers: Lessons from Social-Cognitive Psychology Burgess, D., Van Ryn, M., Dovidio, J., & Saha, S. (2007). Reducing racial bias among health care providers: lessons from social-cognitive psychology. Journal of general internal medicine, 22 (6), 882-887.
  • Integrating implicit bias into counselor education Boysen, G. A. (2010). Integrating Implicit Bias Into Counselor Education. Counselor Education & Supervision, 49 (4), 210–227.
  • Cognitive Biases and Errors as Cause—and Journalistic Best Practices as Effect Christian, S. (2013). Cognitive Biases and Errors as Cause—and Journalistic Best Practices as Effect. Journal of Mass Media Ethics, 28 (3), 160–174.
  • Empathy intervention to reduce implicit bias in pre-service teachers Whitford, D. K., & Emerson, A. M. (2019). Empathy Intervention to Reduce Implicit Bias in Pre-Service Teachers. Psychological Reports, 122 (2), 670–688.



Bias and Critical Thinking

Note: The German version of this entry can be found here: Bias and Critical Thinking (German)

Note: This entry revolves more generally around Bias in science. For more thoughts on Bias and its relation to statistics, please refer to the entry on Bias in statistics .

In short: This entry discusses why science is never objective, and what we can really know.

  • 1 What is bias?
  • 2 Design criteria
  • 3 Bias in gathering data, analysing data and interpreting data
  • 4 Bias and philosophy
  • 5 Critical Theory and Bias
  • 6 Further Information

What is bias?

"The very concept of objective truth is fading out of the world." - George Orwell

A bias is “the action of supporting or opposing a particular person or thing in an unfair way, because of allowing personal opinions to influence your judgment” (Cambridge Dictionary). In other words, bias clouds our judgment, and often our actions, in the sense that we act wrongly. We are all biased, because we are individuals with individual experiences who are disconnected from other individuals and/or groups, or at least think we are.

Recognising bias in research is highly relevant, because bias exposes the myth of the objectivity of research and enables better recognition and reflection of our flaws and errors. In addition, understanding bias in science is relevant beyond the empirical, since bias can also highlight flaws in our perceptions and actions as humans. To this end, acknowledging bias means understanding the limitations of oneself. Prominent examples are gender bias and racial bias, which are often rooted in our societies and can be deeply buried in our subconscious.

To be critical researchers, it is our responsibility to learn about the diverse biases we hold, although it is beyond this text to explore all the subjective human biases we need to overcome. A brief note on the ethics of bias: many would argue that overcoming our biases requires the ability to learn and to question our privileges. Within research, we need to recognise that science has been severely and continuously biased against ethnic minorities, women, and many other groups. Institutional and systemic bias are part of the current reality of the system, and we need to do our utmost to change this - there is a need to debias science, and our own actions.

While it should not go unnoticed that institutions and systems have already changed, injustices and inequalities still exist. Most research is conducted in the Global North, posing a neo-colonial problem that we are far from solving. Much of academia is still far from having a diverse understanding of people, and systemic and institutional discrimination are part of our daily reality. We are at the beginning of a very long journey, and there is much to be done concerning bias in constructed institutions.

All this being said, let us now shift our attention to bias in empirical research. Here, we present three different perspectives in order to enable a more reflexive understanding of bias. The first is an understanding of how different forms of bias relate to the design criteria of scientific methods. The second is the question of which stage in the application of methods - data gathering, data analysis, and interpretation of results - is affected by which bias, and how. The third approach is to look at the three principal theories of Western philosophy - reason, social contract, and utilitarianism - and examine which of the three can be related to which bias. Many methods are influenced by bias, and recognising which bias affects which design criterion, research stage, and philosophical theory in the application of a method can help make empirical research more reflexive.


Design criteria

While qualitative research is often considered prone to many biases, it is also often more reflexive in recognising its limitations. Many qualitative methods are defined by a strong subjective component - i.e., that of the researcher - and clear documentation can thus help make an existing bias more transparent. Many quantitative approaches have a reflexive canon that focuses on the specific biases relevant to a given approach, such as sampling bias or reporting bias. These are often considered less than in qualitative methods, since quantitative methods are still - falsely - considered to be more objective. This is not true. While one could argue that the goal of reproducibility may lead to a better taming of bias, this is not necessarily so, as the replication crisis in psychology clearly shows. Both quantitative and qualitative methods are potentially strongly affected by several cognitive biases, as well as by bias in academia in general, which includes, for instance, funding bias or the preference for open-access articles. While none of this is surprising, it is still all the harder to solve.

Another general differentiation can be made between inductive and deductive approaches. Many deductive approaches are affected by bias associated with sampling, while inductive approaches are more associated with bias during interpretation. Deductive approaches are often built around designed experiments, whereas the strong point of inductive approaches is being less bound by methodological designs - which can also make bias more hidden and thus harder to detect. This is why qualitative approaches often place an emphasis on concise documentation.

The connection between spatial scales and bias is rather straightforward: an individual focus is related to cognitive bias, while system scales are more associated with prejudice, bias in academia, and statistical bias. The impact of temporal bias is less explored, but forecast bias is a prominent example when it comes to predictions of the future, and another error - applying our cultural views and values to past humans - has yet to be clearly named as a bias. What can clearly be said about both spatial and temporal scales is that we are often irrationally biased against very distant entities - in space or time - even more than we should be. We are, for instance, inclined to dismiss a distant future scenario, even though it may be roughly as likely to become reality as a nearer-term one. For example, almost everybody would rather win the lottery tomorrow than in 20 years, irrespective of their chances of living to see it happen, or of the longer time they could spend with the prize. Humans are peculiarly constructed beings, and we are notorious for acting irrationally. The same is true for spatial distance: we may care irrationally more for people who are close to us than for people who are very distant, even independent of shared experience (e.g., with friends) or shared history (e.g., with family). Again, this points to a bias that we can become aware of, but which first has to be named. No doubt current social developments will further increase our capacity to recognise our biases, as all these phenomena also affect scientists.

The following table categorizes different types of bias, as indicated in the Wikipedia entry on Bias, according to two levels of the Design Criteria of Methods.

Bias in gathering data, analysing data and interpreting data

The three steps of the application of a method are clearly worth investigating, as it allows us to dismantle at which stage we may inflict a bias into our application of a method. Gathering data is strongly associated with cognitive bias, yet also to statistical bias and partly even to some bias in academia. Bias associated to sampling can be linked to a subjective perspective as well as to systematic errors rooted in previous results. This can also affect the analysis of data, yet here one has to highlight that quantitative methods are less affected by a bias through analysis than qualitative methods. This is not a normative judgement, and can clearly be counter-measured by a sound documentation of the analytical steps. We should nevertheless not forget that there are even different assumptions about the steps of analysis in such an established field as statistics. Here, different schools of thought constantly clash regarding the optimal approach of analysis, sometimes even with different results. This exemplifies that methodological analysis can be quite normative, underlining the need for a critical perspective. This is also the case in qualitative methods, yet there it strongly depends on the specific methods, as these methods are more diverse. Concerning the interpretation of scientific results, the amount and diversity of biases is clearly the highest, or in other words, worst. While this is related to the cognition bias we have as individuals, it is also related to prejudices, bias in academia and statistical bias. Overall, we need to recognise that some methods are less associated to certain biases because they are more established concerning the norms of their application, while other methods are new and less tested by the academic community. When it comes to bias, there can be at least a weak effect that safety - although not diversity - concerning methods comes in numbers. 
More and diverse methods may offer new insights into biases, since one method may reveal a bias that another cannot. Methodological plurality may thus reduce bias. For a fully established method, the understanding of its biases is often greater, because it has been applied more often. This is especially, but not always, true for the analysis step, and in part also for some methodological designs concerned with sampling. Clear documentation is, however, key to making bias more visible across the three stages.

Bias and philosophy

The last and by far most complex point concerns the root theories associated with bias. Reason, social contract and utilitarianism are the three key theories of Western philosophy relevant for empiricism, and all biases can be associated with at least one of these three foundational theories. Many cognitive biases are linked to reason, or to unreasonable behaviour. Much of the bias related to prejudice and society can be linked to the wide field of the social contract. Lastly, some bias is clearly associated with utilitarianism. Surprisingly, utilitarianism is associated with a low amount of bias, yet it should be noted that the problem of causality within economic analysis is still up for debate. Much of economic management is rooted in correlative understandings, which are often mistaken for clear-cut causal relations. Psychology also clearly illustrates that investigating a bias is different from unconsciously introducing a bias into your research. Consciousness of bias is the basis for its recognition: if you are not aware of a bias, you cannot take it into account in your knowledge production. While it thus seems not directly helpful to associate empirical research and its biases with the three general foundational theories of philosophy - reason, social contract and utilitarianism - we should still take this into account, not least because it leads us to one of the most important developments of the 20th century: Critical Theory.

Critical Theory and Bias

Out of the growing empiricism of the Enlightenment grew a concern which we came to call Critical Theory. At the heart of Critical Theory is the aim of critiquing and changing society as a whole, in contrast to merely observing or explaining it. Originating with Marx, Critical Theory clearly distances itself from previous theories in philosophy - or those associated with the social - that only try to understand or explain. By embedding society in its historical context (Horkheimer) and by focussing on a continuous and interchanging critique (Benjamin), Critical Theory is a first and bold step towards a more holistic perspective in science. Remembering the Greeks and also some Eastern thinkers, one could say it is the first step back towards holistic thinking. From a methodological perspective, Critical Theory is radical because it seeks to distinguish itself not only from previously existing philosophy, but more importantly from the widely dominant empiricism and its societal as well as scientific consequences. A Critical Theory should thus be explanatory, practical and normative, and - what makes it more challenging - it needs to be all three of these things combined (Horkheimer). Through Habermas, Critical Theory became embedded in democracy, yet with a critical view of what we could understand as globalisation and its complex realities. The reflexive empowerment of the individual is as much a clear goal as one would expect, also because of the normative link to the political.

Critical Theory is thus a vital step towards a wider integration of diverse philosophies, but it is also essential from a methodological standpoint, since it allowed for the emergence of a true and holistic critique of everything empirical. While this may be seen as an attack, it can also be interpreted as a necessary step, since the arrogance of empiricism and its claim to truth can be interpreted as a deep danger, and not only to methods. Popper does not offer a true solution to positivism, even though he was very much hyped by many. His thought that the holy grail of knowledge can ultimately never be truly reached also generates certain problems. He can still be admired because he called for scientists to be radical, while acknowledging that most scientists are not. In addition, we could see it from a post-modernist perspective as a necessary step to prevent an influence of empiricism that might pose a threat to and by humankind itself, be it through nuclear destruction, the unachievable and feeble goal of a growth economy (my wording), the naive and technocratic hoax of the eco-modernists (also my wording), or any other paradigm that is short-sighted or naive. In other words, we look at the postmodern.

Critical Theory is now developing to connect to other facets of the discourse, and some may argue that its focus on the social sciences can be seen as critical in itself, or at least as a normative choice that is clearly anthropocentric, has a problematic relationship with the empirical, and has mixed relations with its diverse offspring, which includes gender research, the critique of globalisation, and many other normative domains that are increasingly explored today. Building on Popper's three worlds (the physical world, the mental world, and human knowledge), we should note another possibility: Critical Realism. Roy Bhaskar proposed three ontological domains (strata of knowledge): the real (everything there is), the actual (everything we can grasp), and the empirical (everything we can observe). During the last decades, humankind unlocked ever more strata of knowledge, hence much of the actual became empirical to us. We have to acknowledge that some strata of knowledge are hard to relate, or may even be unrelatable, which has consequences for our methodological understanding of the world. Some methods may unlock some strata of knowledge but not others. Some may be specific, some vague. And some may only unlock new strata through novel combinations. What is most relevant here, however, is that we may look for causal links, but we need to remain critical, since new strata of knowledge may render them obsolete. Consequently, there are no universal laws that we can strive for, but instead endless strata to explore.

Coming back to bias, Critical Theory seems like an antidote to bias, and some may argue Critical Realism even more so, as it combines criticality with a certain humility necessary when exploring the empirical and the causal. The explanatory characteristic allowed by Critical Realism might be good enough for the pragmatist, the practical may speak to the modern engagement of science with and for society, and the normative is aware of - well - all things normative, including the critical. Hence a door was opened to a new mode of science, focussing on the situatedness and locatedness of research within the world. This surely had a head start with Kant, who opened the globe to the world of methods. There is, however, a critical link in Habermas, who highlighted the duality of the rational individual on a small scale and the role of global societies as part of the economy (Habermas 1987). This underlines a crucial link to the original three foundational theories in philosophy, albeit in a dramatic and focused interpretation of modernity. Habermas himself was well aware of the tensions between these two approaches - the critical and the empirical - yet we owe it to Critical Theory and its continuations that a practical and reflexive knowledge production can be conducted within deeply normative systems such as modern democracies.

Linking this to the historical development of methods, we can thus clearly claim that Critical Theory (and Critical Realism) opened a new domain or mode of thinking, and its impact can be felt widely, well beyond the social sciences and the philosophy that it affected directly. However, coming back to bias, the answer of an almost universal rejection of empiricism will not be followed here. Instead, we need to return to the three foundational theories of philosophy and acknowledge that reason, social contract and utilitarianism are the foundation of the first empirical disciplines that are at their core normative (e.g. psychology, the social and political sciences, and economics). Since bias can be partly related to these three theories, and consequently to specific empirical disciplines, we need to recognise that there is an overarching methodological bias. This methodological bias has a signature rooted in specific design criteria, which are in turn related to specific disciplines. Consequently, this methodological bias is a disciplinary bias - even more so since methods may be shared among scientific disciplines, but most disciplines claim either priority or superiority when it comes to the ownership of a method.

The disciplinary bias of modern science thus creates a deeply normative methodological bias, which some disciplines may try to take into account while others clearly do not. In other words, the dogmatic selection of methods within disciplines has the potential to create deep flaws in empirical research, and we need to be aware of and reflexive about this. The largest bias concerning methods is the choice of methods per se. A critical perspective is thus relevant not only from a perspective of societal responsibility, but equally from a view on the empirical. Clear documentation and reproducibility of research are important but limited stepping stones in a critique of the methodological. They cannot replace a critical perspective, but only amend it. Empirical knowledge will only ever look at parts - or strata, according to Roy Bhaskar - of reality, yet philosophy can offer a generalisable perspective or theory, and Critical Theory, Critical Realism and other current developments in philosophy can be seen as a striving towards an integrated and holistic philosophy of science, which may ultimately link to an overarching theory of ethics (Parfit). If the empirical and the critical inform us, then a philosophy of science and ethics together may tell us how we may act based on our perceptions of reality.

Further Information

  • Some words on Critical Theory
  • A short entry on critical realism

The author of this entry is Henrik von Wehrden.




Critical Thinking

Developing the right mindset and skills.

By the Mind Tools Content Team

We make hundreds of decisions every day and, whether we realize it or not, we're all critical thinkers.

We use critical thinking each time we weigh up our options, prioritize our responsibilities, or think about the likely effects of our actions. It's a crucial skill that helps us to cut out misinformation and make wise decisions. The trouble is, we're not always very good at it!

In this article, we'll explore the key skills that you need to develop your critical thinking skills, and how to adopt a critical thinking mindset, so that you can make well-informed decisions.

What Is Critical Thinking?

Critical thinking is the discipline of rigorously and skillfully using information, experience, observation, and reasoning to guide your decisions, actions, and beliefs. You'll need to actively question every step of your thinking process to do it well.

Collecting, analyzing and evaluating information is an important skill in life, and a highly valued asset in the workplace. People who score highly in critical thinking assessments are also rated by their managers as having good problem-solving skills, creativity, strong decision-making skills, and good overall performance. [1]

Key Critical Thinking Skills

Critical thinkers possess a set of key characteristics which help them to question information and their own thinking. Focus on the following areas to develop your critical thinking skills:

Curiosity

Being willing and able to explore alternative approaches and experimental ideas is crucial. Can you think through "what if" scenarios, create plausible options, and test out your theories? If not, you'll tend to write off ideas and options too soon, so you may miss the best answer to your situation.

To nurture your curiosity, stay up to date with facts and trends. You'll overlook important information if you allow yourself to become "blinkered," so always be open to new information.

But don't stop there! Look for opposing views or evidence to challenge your information, and seek clarification when things are unclear. This will help you to reassess your beliefs and make a well-informed decision later. Read our article, Opening Closed Minds, for more ways to stay receptive.

Logical Thinking

You must be skilled at reasoning and extending logic to come up with plausible options or outcomes.

It's also important to emphasize logic over emotion. Emotion can be motivating, but it can also lead you to take hasty and unwise action, so control your emotions and be cautious in your judgments. Know when a conclusion is "fact" and when it is not. "Could-be-true" conclusions are based on assumptions and must be tested further. Read our article, Logical Fallacies, for help with this.

Use creative problem solving to balance cold logic. By thinking outside of the box you can identify new possible outcomes by using pieces of information that you already have.


Self-Awareness

Many of the decisions we make in life are subtly informed by our values and beliefs. These influences are called cognitive biases, and it can be difficult to identify them in ourselves because they're often subconscious.

Practicing self-awareness will allow you to reflect on the beliefs you have and the choices you make. You'll then be better equipped to challenge your own thinking and make improved, unbiased decisions.

One particularly useful tool for critical thinking is the Ladder of Inference. It allows you to test and validate your thinking process, rather than jumping to poorly supported conclusions.

Developing a Critical Thinking Mindset

Combine the above skills with the right mindset so that you can make better decisions and adopt more effective courses of action. You can develop your critical thinking mindset by following this process:

Gather Information

First, collect data, opinions and facts on the issue that you need to solve. Draw on what you already know, and turn to new sources of information to help inform your understanding. Consider what gaps there are in your knowledge and seek to fill them. And look for information that challenges your assumptions and beliefs.

Be sure to verify the authority and authenticity of your sources. Not everything you read is true! Use this checklist to ensure that your information is valid:

  • Are your information sources trustworthy? (For example, well-respected authors, trusted colleagues or peers, recognized industry publications, websites, blogs, etc.)
  • Is the information you have gathered up to date?
  • Has the information received any direct criticism?
  • Does the information have any errors or inaccuracies?
  • Is there any evidence to support or corroborate the information you have gathered?
  • Is the information you have gathered subjective or biased in any way? (For example, is it based on opinion, rather than fact? Is any of the information you have gathered designed to promote a particular service or organization?)

If any information appears to be irrelevant or invalid, don't include it in your decision making. But don't omit information just because you disagree with it, or your final decision will be flawed and biased.

Analyze the Information

Now observe the information you have gathered, and interpret it. What are the key findings and main takeaways? What does the evidence point to? Start to build one or two possible arguments based on what you have found.

You'll need to look for the details within the mass of information, so use your powers of observation to identify any patterns or similarities. You can then analyze and extend these trends to make sensible predictions about the future.

To help you to sift through the multiple ideas and theories, it can be useful to group and order items according to their characteristics. From here, you can compare and contrast the different items. And once you've determined how similar or different things are from one another, Paired Comparison Analysis can help you to analyze them.
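The pairwise comparison described above can be sketched in code. Below is a minimal Python sketch of Paired Comparison Analysis: every pair of options is judged, the preferred option in each pair receives a weight, and the weights are summed into a ranking. The `judge` callback, the 0-3 weighting scale, and the toy preference data are illustrative assumptions, not part of any specific tool.

```python
from itertools import combinations

def paired_comparison(options, judge):
    """Rank options by comparing every pair.

    `judge(a, b)` returns (winner, weight), where weight expresses
    how strongly the winner is preferred (e.g. 0 to 3).
    """
    scores = {opt: 0 for opt in options}
    for a, b in combinations(options, 2):
        winner, weight = judge(a, b)
        scores[winner] += weight
    # Highest total score first
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical judgments for three options A, B and C
prefs = {("A", "B"): ("A", 2), ("A", "C"): ("C", 1), ("B", "C"): ("C", 3)}
ranking = paired_comparison(["A", "B", "C"], lambda a, b: prefs[(a, b)])
# ranking → [("C", 4), ("A", 2), ("B", 0)]
```

Reducing each judgment to a single pair at a time is the point of the technique: it keeps every comparison small and explicit instead of weighing all options against each other at once.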

Evaluate the Arguments

The final step involves challenging the information and rationalizing its arguments.

Apply the laws of reason (induction, deduction, analogy) to judge an argument and determine its merits. To do this, it's essential that you can determine the significance and validity of an argument, to put it in the correct perspective. Take a look at our article, Rational Thinking, for more information about how to do this.

Once you have considered all of the arguments and options rationally, you can finally make an informed decision.

Afterward, take time to reflect on what you have learned and what you found challenging. Step back from the detail of your decision or problem, and look at the bigger picture. Record what you've learned from your observations and experience.

Critical thinking involves rigorously and skillfully using information, experience, observation, and reasoning to guide your decisions, actions, and beliefs. It's a useful skill in the workplace and in life.

You'll need to be curious and creative to explore alternative possibilities, but rational to apply logic, and self-aware to identify when your beliefs could affect your decisions or actions.

You can demonstrate a high level of critical thinking by validating your information, analyzing its meaning, and finally evaluating the argument.

Critical Thinking Infographic

See Critical Thinking represented in our infographic: An Elementary Guide to Critical Thinking.



Kendall College of Art & Design

Critical Thinking & Evaluating Information


What is Bias?


Biases also play a role in how you approach all information.

Two forms of bias are of particular importance given today's information-laden landscape: implicit bias and confirmation bias.

Implicit Bias & Confirmation Bias

Implicit / Unconscious Bias 

"Original definition (neutral) - Any personal preference, attitude, or expectation that unconsciously affects a person's outlook or behaviour.

Current definition (negative) - Unconscious favouritism towards or prejudice against people of a particular race, gender, or group that influences one's actions or perceptions; an instance of this."

"unconscious bias, n." OED Online, Oxford University Press, December 2020.

"Thoughts and feelings are “implicit” if we are unaware of them or mistaken about their nature. We have a bias when, rather than being neutral, we have a preference for (or aversion to) a person or group of people. Thus, we use the term “implicit bias” to describe when we have attitudes towards people or associate stereotypes with them without our conscious knowledge."

Confirmation Bias – "Originating in the field of psychology; the tendency to seek or favour new information which supports one’s existing theories or beliefs, while avoiding or rejecting that which disrupts them." 

The definition was added to the Oxford English Dictionary in 2019.

"confirmation, n." OED Online, Oxford University Press, December 2020.

Simply put, confirmation bias is the tendency to seek out and/or interpret new information as confirmation of one's existing beliefs or theories, and to exclude contradictory or opposing information or points of view.

Put Bias in Check!


Now that you are aware of bias - your own personal biases and the bias that can be found in sources of information - you can put it in check. Approach information objectively and neutrally, and evaluate it critically. Numerous tools included in this course can help you do this, like the critical thinking cheat sheet in the previous module.

  • Last Updated: Sep 9, 2021 12:09 PM


Signs of Different Types of Biases and How to Overcome Each of Them

These biases can unknowingly impact your thoughts and behaviors.

Wendy Rose Gould is a lifestyle reporter with over a decade of experience covering health and wellness topics.


Dr. Sabrina Romanoff, PsyD, is a licensed clinical psychologist and a professor at Yeshiva University’s clinical psychology doctoral program.



Bias refers to a tendency or preference towards a certain group, idea, or concept that influences our judgments and decisions.

Our experiences, culture, social norms, and personal beliefs often shape these biases. The way we act on them can be either conscious or unconscious and can lead to prejudiced or discriminatory behaviors.

“Bias can play a significant role in day-to-day interactions and relationships, often influencing our thoughts, attitudes, and behaviors toward others,” says David Yadush, LPC, a licensed professional counselor at BetterHelp . “This can result in misinterpreting or overlooking facts and can change how we perceive people or events in our lives.”

Along with affecting our everyday interactions, being unaware of biases—or falling prey to them even when we know they exist—can hinder personal growth .

In this article, we’re outlining common types of biases, and discussing the signs of each type and ways to overcome them.

Why It's Important to Assess Your Biases

In order to recognize and work through bias, it’s important for us to challenge our assumptions and the subconscious stereotypes we make on a daily basis. This can be done by seeking out diverse perspectives, enjoying new experiences, and advocating for equal opportunity and treatment for everyone.

Of course, it also helps to understand the types of biases we’re apt to fall prey to so we can recognize and correct them in real-time.

How to Work Through Confirmation Bias

Confirmation bias is the tendency to seek out information that reaffirms our existing beliefs. In doing so, we tend to ignore information that contradicts our beliefs , which can lead us toward untruths.

Signs of confirmation bias may include:

  • Seeking information that confirms our beliefs
  • Ignoring or dismissing information that contradicts our beliefs
  • Failing to consider alternative opinions

“This bias can be harmful as it may prevent individuals from considering alternative viewpoints and may lead to closed-mindedness,” warns Yadush.

How to Overcome Confirmation Bias

“To recognize and work through confirmation bias, individuals should actively seek out diverse perspectives and information, consider alternative viewpoints, and engage in critical thinking and self-reflection,” says Yadush.

How to Work Through Attribution Bias

Attribution bias is a cognitive distortion where we view the behavior of others as driven by internal motivation - such as morals and character - while considering our own behavior as affected by external factors, such as circumstances and environment.

Signs of attribution bias may include:

  • Consistently blaming others for problems or failures
  • Being overly critical of others
  • Excusing our own mistakes without reflection

“Simply speaking, one tends to give themselves a break for their own mistakes or shortcomings as unavoidable but will blame others for similar mistakes or shortcomings as intentional,” explains Karri Francisco, LMFT, director of family programming at APN .

She says that this is intellectually dangerous because it leads to unfair judgments of others. It can also make it harder to learn from our own mistakes since this bias prevents us from taking responsibility for our actions.

How to Overcome Attribution Bias

Francisco says that practicing empathy and perspective-taking can help you move away from falling prey to attribution bias.

How to Work Through Conformity Bias

Conformity bias is when we simply agree - or conform - with the opinions and behaviors of others in a group setting, even when it's against our own personal beliefs or knowledge.

Signs of conformity bias may include:

  • Vocally agreeing with others even when you inwardly disagree
  • Not sharing your own thoughts and feelings out of fear of being “ousted” or judged in a group setting
  • Going along with a group that’s acting irresponsibly or cruelly when you know inwardly the behavior is wrong

Yadush says, “This is typically an unconscious process that we go through in an attempt to avoid social rejection or gain status. This bias can be harmful as it may prevent individuals from expressing their true thoughts and opinions and may lead to groupthink, where the desire for consensus overrides critical thinking.”

How to Overcome Conformity Bias

To recognize and work through conformity bias, focus on reflecting on your own beliefs and values. At the same time, you can engage in critical thinking and seek diverse perspectives and opinions from others.

If you’re in a leadership position , you can also reduce conformity bias by encouraging and rewarding diverse opinions. 

How to Work Through Beauty Bias

Beauty bias is a subconscious or conscious propensity to treat conventionally beautiful people better or worse than those who aren't as attractive.

Signs of beauty bias include:

  • You judge others on their appearance
  • You make assumptions about a conventionally attractive person’s capabilities
  • You treat others better or worse based on their appearance

The Halo Effect

The halo effect describes the phenomenon in which people assume that because a person has one favorable attribute, it must mean something favorable about them as a whole.

For example, if you think someone is attractive, you may assume that they are nicer or smarter than someone you deem less attractive.

For example, you might give favorable treatment to a beautiful person, or view them as more funny or interesting; studies show that people have a tendency to do this without even thinking.

That said, you might also treat an unattractive person less favorably or make harsh judgments about them without getting to know them.

How to Overcome Beauty Bias

Francisco says, “The potential harm can lead to discrimination against those who do not present within conventional beauty standards. Are you making assumptions about a person's abilities or character based on their physical appearance, such as assuming that someone attractive is also intelligent or competent?”

She adds that in order to recognize and work through any bias, we must become aware of our own and challenge them as they occur.

One approach to challenging beauty bias is consciously focusing on a person's qualities and abilities when evaluating them.

How to Work Through Gender Bias

Gender bias refers to the tendency to hold stereotypical or discriminatory attitudes towards people based solely on their gender. This not only affects our ability to socialize in meaningful ways, but can also lead to unequal opportunities and treatment for others.

Signs of gender bias may include:

  • Making assumptions or judgments based on gender
  • Using gender-specific language
  • Treating individuals differently based on their gender

How to Overcome Gender Bias

According to Yadush, "To recognize and work through gender bias, individuals should challenge their assumptions and stereotypes and use gender-neutral language.”

Yadush adds that it’s also important to listen to and believe individuals about their experiences around gender bias and discrimination.

How to Work Through Ageism

Similarly, ageism is the tendency to make judgments or assumptions about another person simply because of their age.

This tends to negatively impact people who are either young or old, as we subconsciously hold stereotypes about their capabilities or the “known characteristics” of their generation.

Signs of ageism may include:

  • Judging an individual's ability or intelligence based on age
  • Not interacting with someone because they’re a different age
  • Being rude or dismissive of others due to their age

Ageism and Its Impact on Mental Health

Yadush says that ageism has been shown to have serious effects on the mental health , physical health, and overall quality of life in the older adult population. It can hinder their ability to socialize, find employment, or make meaningful friendships.

For young people, it can also impact their ability to be taken seriously in professional settings. This is also referred to as "youngism."

How to Overcome Ageism

“To help combat ageism, seek out mentorship from individuals of all ages and be willing to learn from those with different lived experiences,” Yadush suggests. “When you do recognize ageism in the workplace or community, speak out and be an advocate as others may not have the opportunity or support to do so.”

The contrast effect tends to sneak up on us. It’s a cognitive bias where the comparison of two things influences your perception of both.

Other signs of the contrast effect include:

  • Comparing one person to another
  • Failing to focus on objective criteria when making decisions
  • Not considering the context of your evaluations


How the Contrast Effect May Play Out in Everyday Life

Here are some examples of what the contrast effect may look like in the real world:

  • If you see someone casually dressed standing next to someone who looks unkempt, the casual attire may appear more professional by comparison. The judgment seems trivial, but it shows how strongly context shapes perception.
  • Similarly, if a candidate is interviewed for a job immediately after a particularly impressive candidate, they may be judged more harshly than they would have been if they had been interviewed alone, distorting the evaluator’s perception.

“The contrast effect highlights how our perceptions are not solely based on objective measurements but can be influenced by the context in which we experience them,” explains Francisco. “This can lead to inaccurate perceptions and judgments of individuals being evaluated in comparison to another.”

How to Overcome the Contrast Effect

When making decisions, try to be as objective as possible. If comparisons are unavoidable, take breaks between evaluations to clear your mind of lingering impressions, and focus on objective criteria rather than subjective impressions.

We're all prone to cognitive distortions. Sometimes we're on the receiving end, while other times we're the ones making quick judgments. Reflecting on where these biases may exist in your daily life is the first step in understanding and overcoming them.

Merriam-Webster Dictionary. Halo effect.

Batres C, Shiramizu V. Examining the “attractiveness halo effect” across cultures. Curr Psychol. Published online August 25, 2022. doi:10.1007/s12144-022-03575-0

Francioli SP, North MS. Youngism: The content, causes, and consequences of prejudices toward younger adults. J Exp Psychol Gen. 2021;150(12):2591-2612. doi:10.1037/xge0001064

By Wendy Rose Gould, a lifestyle reporter with over a decade of experience covering health and wellness topics.


Reviewed by Psychology Today Staff

A bias is a tendency, inclination, or prejudice toward or against something or someone. Some biases are positive and helpful—like choosing to only eat foods that are considered healthy or staying away from someone who has knowingly caused harm. But biases are often based on stereotypes, rather than actual knowledge of an individual or circumstance. Whether positive or negative, such cognitive shortcuts can result in prejudgments that lead to rash decisions or discriminatory practices.

  • Bias and Stereotyping
  • Biases and Cognitive Errors


Bias is often characterized as stereotypes about people based on the group to which they belong and/or based on an immutable physical characteristic they possess, such as their gender, ethnicity, or sexual orientation. This type of bias can have harmful real-world outcomes. People may or may not be aware that they hold these biases.

The phenomenon of implicit bias refers to societal input that escapes conscious detection. Paying attention to helpful biases—while keeping negative, prejudicial, or accidental biases in check—requires a delicate balance between self-protection and empathy for others.

Bias is a natural inclination for or against an idea, object, group, or individual. It is often learned and is highly dependent on variables like a person’s socioeconomic status, race, ethnicity, and educational background. At the individual level, bias can negatively impact someone’s personal and professional relationships; at a societal level, it can lead to the unfair persecution of a group, as in the Holocaust and slavery.

Starting at a young age, people will discriminate between those who are like them, their “ingroup,” and those who are not like them, their “outgroup.” On the plus side, they can gain a sense of identity and safety. However, taken to the extreme, this categorization can foster an “us-versus-them” mentality and lead to harmful prejudice.

People are naturally biased—they like certain things and dislike others, often without being fully conscious of their prejudice. Bias is acquired at a young age, often as a result of one’s upbringing. This unconscious bias becomes problematic when it causes an individual or a group to treat others poorly as a result of their gender, ethnicity, race, or other factors. 

Generally, no. Everyone has some degree of bias. It’s human nature to assign judgment based on first impressions. Also, most people have a lifetime of conditioning by schools, religious institutions, their families of origin, and the media. However, by reflecting critically on judgments and being aware of blind spots, individuals can avoid stereotyping and acting on harmful prejudice.

Telling people to “suppress prejudice” or racism often has the opposite effect. When people are trained to notice prejudiced or racist thoughts without trying to push them away, they are able to make a deliberate choice about how they behave towards others as a result. This can lead to less discrimination and reduced bias over time.


One category of biases, known as cognitive biases, consists of repeated patterns of thinking that can lead to inaccurate or unreasonable conclusions. Cognitive biases may help people make quicker decisions, but those decisions aren’t always accurate. Common causes include flawed memory, scarce attention, natural limits on the brain’s ability to process information, emotional input, social pressures, and even aging. When assessing research, or even one’s own thoughts and behaviors, it’s important to be aware of cognitive biases and attempt to counter their effects whenever possible.

When you are the actor, you are more likely to see your actions as the result of external and situational factors, whereas when you are observing other people, you are more likely to perceive their actions as driven by internal factors (like overall disposition). This can lead to magical thinking and a lack of self-awareness.

People tend to jump at the first available piece of information and unconsciously use it to “anchor” their decision-making process, even when the information is incorrect or prejudiced. This can lead to skewed judgment and poor decision-making, especially when they don’t take the time to reason through their options.

Attribution bias occurs when someone tries to attribute reasons or motivations to the actions of others without concrete evidence to support such assumptions.

Confirmation bias refers to the brain’s tendency to search for and focus on information that supports what someone already believes, while ignoring facts that go against those beliefs, despite their relevance.

People with hindsight bias believe they should have anticipated certain outcomes, which might only be obvious now with the benefit of more knowledge and perspective. They may forget that at the time of the event, much of the information needed simply wasn’t available. They may also make unfair assumptions that other people share their experiences and expect them to come to the same conclusions.

In the Dunning-Kruger Effect, people lack the self-awareness to accurately assess their skills. They often wind up overestimating their knowledge or ability. For example, it’s not uncommon to think you’re smarter, kinder, or better at managing others than the average person.

People are more likely to attribute someone else’s actions to their personality rather than taking into account the situation they are facing. However, they rarely make this Fundamental Attribution Error when analyzing their own behavior.

The Halo Effect occurs when your positive first impression of someone colors your overall perception of them. For example, if you are struck by how beautiful someone is, you might assume they have other positive traits, like being wise, smart, or brave. A negative first impression, on the other hand, can lead you to assume the worst about a person, resulting in a “Reverse Halo” or “Horns Effect.”

People like to win, but they hate losing more. So they tend to pay more attention to negative outcomes and weigh them more heavily than positive ones when considering a decision. This negativity bias explains why we focus more on upsetting events, and why the news seems so dire most of the time.

People tend to overestimate the likelihood of positive outcomes when they are in a good mood. Conversely, when they are feeling down, they are more likely to expect negative outcomes. In both instances, powerful emotions are driving irrational thinking.

Have you ever heard, “Don’t throw good money after bad”? That expression is based on the Sunk Cost Fallacy. Basically, when someone is aware of the time, effort, and emotional cost that’s already gone into an endeavor, they can find it difficult to change their mind or quit a longtime goal, even when it’s the healthiest choice for them.



