
Recognizing Bias: A Problem Solving and Critical Thinking Skills Guide

Learn how to identify and address bias in decision making with our guide to recognizing bias in problem solving and critical thinking.

In today's world, it is becoming increasingly important to recognize bias and how it can affect our decision-making. Bias can cloud our judgement, lead us to make decisions that are not in our best interests, and limit our ability to solve problems effectively. In this guide, we will explore the concept of recognizing bias and how it can be used as a tool for developing critical thinking and problem-solving skills. We will discuss the various types of biases, why recognizing them is important, and how to identify and counteract them.

Confirmation bias, a cognitive bias in which we favor information that confirms what we already believe, can lead to unfair judgments or decisions. Other common types of bias include cultural bias, which is the tendency to favor one's own culture or group, and political bias, which is the tendency to favor one's own political party or beliefs. In order to identify and address bias in oneself and others, it is important to be aware of potential sources of bias, including personal opinions, values, and preconceived notions. Being mindful of these potential sources of bias can help us become more aware of our own biases and recognize them in others.

It is also important to be open-minded and willing to consider alternative perspectives, and it is helpful to challenge our own assumptions and beliefs by questioning them and seeking out evidence that supports or refutes them.

Implications of Not Recognizing or Addressing Bias

The potential implications of not recognizing or addressing bias are significant. If left unchecked, biases can lead to unfair decisions or judgments, as well as inaccurate conclusions. This can have serious consequences for individuals and organizations alike.

Strategies for Identifying and Addressing Bias

Recognizing bias in oneself and others is an important part of making informed decisions. There are several strategies that can be used to identify and address bias. One of the most effective strategies is to take a step back and look at the situation objectively. This involves examining the facts and assumptions that are being used to make decisions.

It can also involve assessing the potential impact of decisions on multiple stakeholders. By removing personal biases from the equation, it is possible to make more informed decisions. Another important strategy for identifying and addressing bias is to question the sources of information. It is important to consider the credibility of sources, as well as any potential biases that may be present.

Fact-checking sources and considering multiple perspectives can help identify any potential biases in the information being used. In addition, it is important to remain aware of our own biases. We all have preconceived notions about certain topics that can affect our decision-making process. By being mindful of our biases, we can avoid making decisions that are influenced by them. Finally, it is important to be open to other perspectives and willing to engage in meaningful dialogue with others.

Types of Bias

Bias can be an unconscious preference that influences decision making and can lead to adverse outcomes. It is important to recognize bias because it can have a negative impact on our ability to make sound decisions and engage in problem solving and critical thinking. Bias can manifest itself in various ways, from subtle mental shortcuts to overt prejudices. Types of bias include confirmation bias, where we seek out information that confirms our existing beliefs; availability bias, where we base decisions on the information that is most readily available; and representativeness bias, where we assume that two events or objects are related because they share similar characteristics. Other forms of bias include the halo effect, where a single positive quality or trait can influence the perception of an entire person, and stereotyping, which is the tendency to make judgments about individuals based on their perceived membership in a certain group. Recognizing these biases in ourselves and others allows us to make informed decisions and engage in problem solving and critical thinking.

Sources of Bias

Bias can have a profound effect on decisions, leading to outcomes that are not based on facts or evidence. Personal opinions and values can lead to biased decision-making. They can be shaped by past experiences, cultural background, and other personal factors. For example, someone's opinion about a certain topic may be based on what they have previously heard or read. Similarly, preconceived notions can also lead to biased conclusions. Cultural norms can also play a role in creating bias.

For instance, people may be more likely to believe information from a source they trust or respect, even if it is not based on fact. Similarly, people may be more likely to make decisions that conform to the expectations of their culture or society. In addition, people can also be influenced by their own prejudices or stereotypes. This type of bias can lead to unfair treatment of certain individuals or groups of people. Finally, it is important to be aware of the potential for confirmation bias, where people will seek out information that confirms their existing beliefs and disregard any contradictory evidence. By recognizing and understanding these sources of bias, people can make more informed decisions and engage in more effective problem solving and critical thinking.

In conclusion, recognizing and addressing bias is an essential part of problem solving and critical thinking. Bias can come from many sources, including our own beliefs, cultural norms, and past experiences. Knowing the types of bias and strategies for identifying and addressing them can help us make informed decisions and better engage in critical thinking. Taking time to reflect on our own biases is also important for making unbiased decisions.

Ultimately, recognizing and addressing bias will improve our problem-solving and critical thinking skills.


Humanities LibreTexts

2.2: Overcoming Cognitive Biases and Engaging in Critical Reflection

Nathan Smith et al.

Learning Objectives

By the end of this section, you will be able to:

  • Label the conditions that make critical thinking possible.
  • Classify and describe cognitive biases.
  • Apply critical reflection strategies to resist cognitive biases.

To resist the potential pitfalls of cognitive biases, we have taken some time to recognize why we fall prey to them. Now we need to understand how to resist easy, automatic, and error-prone thinking in favor of more reflective, critical thinking.

Critical Reflection and Metacognition

To promote good critical thinking, put yourself in a frame of mind that allows critical reflection. Recall from the previous section that rational thinking requires effort and takes longer. However, it will likely result in more accurate thinking and decision-making. As a result, reflective thought can be a valuable tool in correcting cognitive biases. The critical aspect of critical reflection involves a willingness to be skeptical of your own beliefs, your gut reactions, and your intuitions. Additionally, the critical aspect engages in a more analytic approach to the problem or situation you are considering. You should assess the facts, consider the evidence, try to employ logic, and resist the quick, immediate, and likely conclusion you want to draw. By reflecting critically on your own thinking, you can become aware of the natural tendency for your mind to slide into mental shortcuts.

This process of critical reflection is often called metacognition in the literature of pedagogy and psychology. Metacognition means thinking about thinking and involves the kind of self-awareness that engages higher-order thinking skills. Cognition, or the way we typically engage with the world around us, is first-order thinking, while metacognition is higher-order thinking. From a metacognitive frame, we can critically assess our thought process, become skeptical of our gut reactions and intuitions, and reconsider our cognitive tendencies and biases.

To improve metacognition and critical reflection, we need to encourage the kind of self-aware, conscious, and effortful attention that may feel unnatural and may be tiring. Typical activities associated with metacognition include checking, planning, selecting, inferring, self-interrogating, interpreting an ongoing experience, and making judgments about what one does and does not know (Hacker, Dunlosky, and Graesser 1998). By practicing metacognitive behaviors, you are preparing yourself to engage in the kind of rational, abstract thought that will be required for philosophy.

Good study habits, including managing your workspace, giving yourself plenty of time, and working through a checklist, can promote metacognition. When you feel stressed out or pressed for time, you are more likely to make quick decisions that lead to error. Stress and lack of time also discourage critical reflection because they rob your brain of the resources necessary to engage in rational, attention-filled thought. By contrast, when you relax and give yourself time to think through problems, you will be clearer, more thoughtful, and less likely to rush to the first conclusion that leaps to mind. Similarly, background noise, distracting activity, and interruptions will prevent you from paying attention. You can use this checklist to try to encourage metacognition when you study:

  • Check your work.
  • Plan ahead.
  • Select the most useful material.
  • Infer from your past grades to focus on what you need to study.
  • Ask yourself how well you understand the concepts.
  • Check your weaknesses.
  • Assess whether you are following the arguments and claims you are working on.

Cognitive Biases

In this section, we will examine some of the most common cognitive biases so that you can be aware of traps in thought that can lead you astray. Cognitive biases are closely related to informal fallacies. Both fallacies and biases provide examples of the ways we make errors in reasoning.

CONNECTIONS

See the chapter on logic and reasoning for an in-depth exploration of informal fallacies.


Confirmation Bias

One of the most common cognitive biases is confirmation bias, which is the tendency to search for, interpret, favor, and recall information that confirms or supports your prior beliefs. Like all cognitive biases, confirmation bias serves an important function. For instance, one of the most reliable forms of confirmation bias is the belief in our shared reality. Suppose it is raining. When you first hear the patter of raindrops on your roof or window, you may think it is raining. You then look for additional signs to confirm your conclusion, and when you look out the window, you see rain falling and puddles of water accumulating. Most likely, you will not be looking for irrelevant or contradictory information. You will be looking for information that confirms your belief that it is raining. Thus, you can see how confirmation bias—based on the idea that the world does not change dramatically over time—is an important tool for navigating in our environment.

Unfortunately, as with most heuristics, we tend to apply this sort of thinking inappropriately. One example that has recently received a lot of attention is the way in which confirmation bias has increased political polarization. When searching for information on the internet about an event or topic, most people look for information that confirms their prior beliefs rather than what undercuts them. The pervasive presence of social media in our lives is exacerbating the effects of confirmation bias since the computer algorithms used by social media platforms steer people toward content that reinforces their current beliefs and predispositions. These multimedia tools are especially problematic when our beliefs are incorrect (for example, they contradict scientific knowledge) or antisocial (for example, they support violent or illegal behavior). Thus, social media and the internet have created a situation in which confirmation bias can be “turbocharged” in ways that are destructive for society.

Confirmation bias is a result of the brain’s limited ability to process information. Peter Wason (1960) conducted early experiments identifying this kind of bias. He asked subjects to identify the rule that applies to a sequence of numbers—for instance, 2, 4, 8. Subjects were told to generate examples to test their hypothesis. What he found is that once a subject settled on a particular hypothesis, they were much more likely to select examples that confirmed their hypothesis rather than negated it. As a result, they were unable to identify the real rule (any ascending sequence of numbers) and failed to “falsify” their initial assumptions. Falsification is an important tool in the scientist’s toolkit when they are testing hypotheses and is an effective way to avoid confirmation bias.
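
To make Wason's finding concrete, the sketch below (Python) contrasts confirmation-style testing with falsification-style testing. The hidden rule, the subject's guessed rule, and the specific test triples are illustrative assumptions rather than Wason's actual materials.

    # Hidden rule used by the experimenter: any strictly ascending triple.
    def true_rule(triple):
        a, b, c = triple
        return a < b < c

    # A subject shown (2, 4, 8) might guess that each number doubles the last.
    def doubling_hypothesis(triple):
        a, b, c = triple
        return b == 2 * a and c == 2 * b

    # Confirmation-style testing: only propose triples that fit the guess.
    for t in [(3, 6, 12), (5, 10, 20), (1, 2, 4)]:
        print(doubling_hypothesis(t), true_rule(t))    # True True every time

    # Falsification-style testing: propose a triple the guess says should fail.
    probe = (1, 2, 3)
    print(doubling_hypothesis(probe), true_rule(probe))    # False True

Every confirming test passes under both rules, so it never exposes the error; only the probe that the guessed rule rejects reveals that the real rule is broader, which is the falsifying move Wason's subjects tended to neglect.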

In philosophy, you will be presented with different arguments on issues, such as the nature of the mind or the best way to act in a given situation. You should take your time to reason through these issues carefully and consider alternative views. What you believe to be the case may be right, but you may also fall into the trap of confirmation bias, seeing confirming evidence as better and more convincing than evidence that calls your beliefs into question.

Anchoring Bias

Confirmation bias is closely related to another bias known as anchoring. Anchoring bias refers to our tendency to rely on initial values, prices, or quantities when estimating the actual value, price, or quantity of something. If you are presented with a quantity, even if that number is clearly arbitrary, you will have a hard time discounting it in your subsequent calculations; the initial value “anchors” subsequent estimates. For instance, Tversky and Kahneman (1974) reported an experiment in which subjects were asked to estimate the number of African nations in the United Nations. First, the experimenters spun a wheel of fortune in front of the subjects that produced a random number between 0 and 100. Let’s say the wheel landed on 79. Subjects were asked whether the number of nations was higher or lower than the random number. Subjects were then asked to estimate the real number of nations. Even though the initial anchoring value was random, people in the study found it difficult to deviate far from that number. For subjects receiving an initial value of 10, the median estimate of nations was 25, while for subjects receiving an initial value of 65, the median estimate was 45.

In the same paper, Tversky and Kahneman described the way that anchoring bias interferes with statistical reasoning. In a number of scenarios, subjects made irrational judgments about statistics because of the way the question was phrased (i.e., they were tricked when an anchor was inserted into the question). Instead of expending the cognitive energy needed to solve the statistical problem, subjects were much more likely to “go with their gut,” or think intuitively. That type of reasoning generates anchoring bias. When you do philosophy, you will be confronted with some formal and abstract problems that will challenge you to engage in thinking that feels difficult and unnatural. Resist the urge to latch on to the first thought that jumps into your head, and try to think the problem through with all the cognitive resources at your disposal.

Availability Heuristic

The availability heuristic refers to the tendency to evaluate new information based on the most recent or most easily recalled examples. The availability heuristic occurs when people take easily remembered instances as being more representative than they objectively are (i.e., based on statistical probabilities). In very simple situations, the availability of instances is a good guide to judgments. Suppose you are wondering whether you should plan for rain. It may make sense to anticipate rain if it has been raining a lot in the last few days since weather patterns tend to linger in most climates. More generally, scenarios that are well-known to us, dramatic, recent, or easy to imagine are more available for retrieval from memory. Therefore, if we easily remember an instance or scenario, we may incorrectly think that the chances are high that the scenario will be repeated. For instance, people in the United States estimate the probability of dying by violent crime or terrorism much more highly than they ought to. In fact, these are extremely rare occurrences compared to death by heart disease, cancer, or car accidents. But stories of violent crime and terrorism are prominent in the news media and fiction. Because these vivid stories are dramatic and easily recalled, we have a skewed view of how frequently violent crime occurs.

Another more loosely defined category of cognitive bias is the tendency for human beings to align themselves with groups with whom they share values and practices. The tendency toward tribalism is an evolutionary advantage for social creatures like human beings. By forming groups to share knowledge and distribute work, we are much more likely to survive. Not surprisingly, human beings with pro-social behaviors persist in the population at higher rates than human beings with antisocial tendencies. Pro-social behaviors, however, go beyond wanting to communicate and align ourselves with other human beings; we also tend to see outsiders as a threat. As a result, tribalistic tendencies both reinforce allegiances among in-group members and increase animosity toward out-group members.

Tribal thinking makes it hard for us to objectively evaluate information that either aligns with or contradicts the beliefs held by our group or tribe. This effect can be demonstrated even when in-group membership is not real or is based on some superficial feature of the person—for instance, the way they look or an article of clothing they are wearing. A related bias is called the bandwagon fallacy. The bandwagon fallacy can lead you to conclude that you ought to do something or believe something because many other people do or believe the same thing. While other people can provide guidance, they are not always reliable. Furthermore, just because many people believe something doesn’t make it true.


Sunk Cost Fallacy

Sunk costs refer to the time, energy, money, or other costs that have been paid in the past. These costs are “sunk” because they cannot be recovered. The sunk cost fallacy is thinking that attaches a value to things in which you have already invested resources that is greater than the value those things have today. Human beings have a natural tendency to hang on to whatever they invest in and are loath to give something up even after it has been proven to be a liability. For example, a person may have sunk a lot of money into a business over time, and the business may clearly be failing. Nonetheless, the businessperson will be reluctant to close shop or sell the business because of the time, money, and emotional energy they have spent on the venture. This is the behavior of “throwing good money after bad” by continuing to irrationally invest in something that has lost its worth because of emotional attachment to the failed enterprise. People will engage in this kind of behavior in all kinds of situations and may continue a friendship, a job, or a marriage for the same reason—they don’t want to lose their investment even when they are clearly headed for failure and ought to cut their losses.
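
As a minimal worked example of the forward-looking comparison the sunk cost fallacy ignores, the figures below are invented for illustration: since money already spent is lost on either branch, only prospective costs and benefits should drive the decision.

    # Hypothetical figures for a struggling venture (illustration only).
    sunk_cost = 50_000                 # already spent; lost whatever we decide
    future_cost = 20_000               # additional spend required to continue
    expected_future_revenue = 12_000

    # Forward-looking comparison: sunk_cost appears in neither branch.
    payoff_if_continue = expected_future_revenue - future_cost   # -8,000
    payoff_if_stop = 0

    decision = "continue" if payoff_if_continue > payoff_if_stop else "stop"
    print(decision)   # "stop": counting the sunk 50,000 as a reason to keep
                      # going is exactly the fallacy described above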

A similar type of faulty reasoning leads to the gambler’s fallacy, in which a person reasons that future chance events will be more likely if they have not happened recently. For instance, if I flip a coin many times in a row, I may get a string of heads. But even if I flip several heads in a row, that does not make it more likely I will flip tails on the next coin flip. Each coin flip is statistically independent, and there is an equal chance of turning up heads or tails. The gambler, like the reasoner from sunk costs, is tied to the past when they should be reasoning about the present and future.
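
The independence claim is easy to check by simulation. The short Python sketch below (with an arbitrarily chosen random seed) estimates how often heads comes up immediately after a run of three heads; the result sits near one half, the same as for any other flip.

    import random

    random.seed(0)   # arbitrary seed so the sketch is reproducible
    flips = [random.choice("HT") for _ in range(1_000_000)]

    # Collect every flip that immediately follows three heads in a row.
    after_streak = [flips[i] for i in range(3, len(flips))
                    if flips[i - 3:i] == ["H", "H", "H"]]

    print(round(after_streak.count("H") / len(after_streak), 3))
    # ~0.5: a streak of heads does not make tails "due" on the next flip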

There are important social and evolutionary purposes for past-looking thinking. Sunk-cost thinking keeps parents engaged in the growth and development of their children after they are born. Sunk-cost thinking builds loyalty and affection among friends and family. More generally, a commitment to sunk costs encourages us to engage in long-term projects, and this type of thinking has the evolutionary purpose of fostering culture and community. Nevertheless, it is important to periodically reevaluate our investments in both people and things.

In recent ethical scholarship, there is some debate about how to assess the sunk costs of moral decisions. Consider the case of war. Just-war theory dictates that wars may be justified in cases where the harm imposed on the adversary is proportional to the good gained by the act of defense or deterrence. It may be that, at the start of the war, those costs seemed proportional. But after the war has dragged on for some time, it may seem that the objective cannot be obtained without a greater quantity of harm than had been initially imagined. Should the evaluation of whether a war is justified estimate the total amount of harm done or prospective harm that will be done going forward (Lazar 2018)? Such questions do not have easy answers.

Table 2.1 summarizes these common cognitive biases.

Table 2.1 Common Cognitive Biases

  • Confirmation bias: the tendency to search for, interpret, favor, and recall information that supports prior beliefs.
  • Anchoring bias: the tendency to rely on an initial value, price, or quantity when estimating an actual value, price, or quantity.
  • Availability heuristic: the tendency to evaluate new information based on the most recent or most easily recalled examples.
  • Tribalism and the bandwagon fallacy: the tendency to align with one’s group and to believe or do something because many others do.
  • Sunk cost fallacy: the tendency to value something in which resources have already been invested more than it is worth today; the related gambler’s fallacy treats past chance events as if they changed future odds.

Think Like A Philosopher

As we have seen, cognitive biases are built into the way human beings process information. They are common to us all, and it takes self-awareness and effort to overcome the tendency to fall back on biases. Consider a time when you have fallen prey to one of the five cognitive biases described above. What were the circumstances? Recall your thought process. Were you aware at the time that your thinking was misguided? What were the consequences of succumbing to that cognitive bias?

Write a short paragraph describing how that cognitive bias allowed you to make a decision you now realize was irrational. Then write a second paragraph describing how, with the benefit of time and distance, you would have thought differently about the incident that triggered the bias. Use the tools of critical reflection and metacognition to improve your approach to this situation. What might have been the consequences of behaving differently? Finally, write a short conclusion describing what lesson you take from reflecting back on this experience. Does it help you understand yourself better? Will you be able to act differently in the future? What steps can you take to avoid cognitive biases in your thinking today?

Critical thinking

We’ve already established that information can be biased. Now it’s time to look at our own bias.

Studies have shown that we are more likely to accept information when it fits into our existing worldview, a phenomenon known as confirmation or myside bias (for examples see Kappes et al., 2020; McCrudden & Barnes, 2016; Pilditch & Custers, 2018). Wittebols (2019) defines it as a “tendency to be psychologically invested in the familiar and what we believe and less receptive to information that contradicts what we believe” (p. 211). Quite simply, we may reject information that doesn’t support our existing thinking.

This can manifest in a number of ways, with Hahn and Harris (2014) suggesting four main behaviours:

  • Searching only for information that supports our held beliefs
  • Failing to critically evaluate information that supports our held beliefs - accepting it at face value - while explaining away or being overly critical of information that might contradict them
  • Becoming set in our thinking, once an opinion has been formed, and deliberately ignoring any new information on the topic
  • A tendency to be overconfident with the validity of our held beliefs.

Peters (2020) also suggests that we’re more likely to remember information that supports our way of thinking, further cementing our bias. Taken together, the research suggests that bias has a huge impact on the way we think.

Filter bubbles and echo chambers

The theory of filter bubbles emerged in 2011, proposed by the Internet activist Eli Pariser. He defined a filter bubble as “your own personal unique world of information that you live in online” (Pariser, 2011, 4:21). At the time that Pariser proposed the filter bubble theory, he focused on the impact of algorithms, connected with social media platforms and search engines, which prioritised content and personalised results based on the individual’s past online activity, suggesting “the Internet is showing us what it thinks we want to see, but not necessarily what we should see” (Pariser, 2011, 3:47; watch his TED talk if you’d like to know more).
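
As a toy illustration of the kind of mechanism Pariser describes, the sketch below ranks candidate items by how often their topic appears in a user's click history. The topics and history are invented for the example, and real platform algorithms are far more elaborate, but the reinforcing loop is the same: what you clicked before rises to the top, and unfamiliar viewpoints sink.

    from collections import Counter

    # Hypothetical click history: topics this user has engaged with before.
    interest = Counter(["politics_a", "politics_a", "cats", "politics_a"])

    # Candidate items, tagged by topic (invented examples).
    candidates = ["politics_a", "politics_b", "cats", "science", "politics_a"]

    # Toy personalisation: rank by prior engagement with each topic.
    ranked = sorted(candidates, key=lambda topic: interest[topic], reverse=True)
    print(ranked)
    # ['politics_a', 'politics_a', 'cats', 'politics_b', 'science']
    # Content matching past behaviour crowds out opposing or unfamiliar topics.

Breaking out of the loop requires deliberately surfacing sources the ranking above would otherwise keep at the bottom.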

Our understanding of filter bubbles has now expanded to recognise that individuals also select and create their own filter bubbles. This happens when you seek out like-minded individuals or sources and follow friends or people you admire on social media: people with whom you are likely to share common beliefs, points of view, and interests. Barack Obama (2017) addressed the concept of filter bubbles in his presidential farewell address:

For too many of us it’s become safer to retreat into our own bubbles, whether in our neighbourhoods, or on college campuses, or places of worship, or especially our social media feeds, surrounded by people who look like us and share the same political outlook and never challenge our assumptions… Increasingly we become so secure in our bubbles that we start accepting only information, whether it’s true or not, that fits our opinions, instead of basing our opinions on the evidence that is out there. (Obama, 2017, 22:57).

Filter bubbles are not unique to the social media age. Previously, the term echo chamber was used to describe the same phenomenon in the news media where different channels exist, catering to different points of view. Within an echo chamber, people are able to seek out information that supports their existing beliefs, without encountering information that might challenge, contradict or oppose.

Other forms of bias

There are many different ways in which bias can affect the way you think and how you process new information.

Critical Thinking

Critical thinking is a widely accepted educational goal. Its definition is contested, but the competing definitions can be understood as differing conceptions of the same basic concept: careful thinking directed to a goal. Conceptions differ with respect to the scope of such thinking, the type of goal, the criteria and norms for thinking carefully, and the thinking components on which they focus. Its adoption as an educational goal has been recommended on the basis of respect for students’ autonomy and preparing students for success in life and for democratic citizenship. “Critical thinkers” have the dispositions and abilities that lead them to think critically when appropriate. The abilities can be identified directly; the dispositions indirectly, by considering what factors contribute to or impede exercise of the abilities. Standardized tests have been developed to assess the degree to which a person possesses such dispositions and abilities. Educational intervention has been shown experimentally to improve them, particularly when it includes dialogue, anchored instruction, and mentoring. Controversies have arisen over the generalizability of critical thinking across domains, over alleged bias in critical thinking theories and instruction, and over the relationship of critical thinking to other types of thinking.

1. History

Use of the term ‘critical thinking’ to describe an educational goal goes back to the American philosopher John Dewey (1910), who more commonly called it ‘reflective thinking’. He defined it as

active, persistent and careful consideration of any belief or supposed form of knowledge in the light of the grounds that support it, and the further conclusions to which it tends. (Dewey 1910: 6; 1933: 9)

and identified a habit of such consideration with a scientific attitude of mind. His lengthy quotations of Francis Bacon, John Locke, and John Stuart Mill indicate that he was not the first person to propose development of a scientific attitude of mind as an educational goal.

In the 1930s, many of the schools that participated in the Eight-Year Study of the Progressive Education Association (Aikin 1942) adopted critical thinking as an educational goal, for whose achievement the study’s Evaluation Staff developed tests (Smith, Tyler, & Evaluation Staff 1942). Glaser (1941) showed experimentally that it was possible to improve the critical thinking of high school students. Bloom’s influential taxonomy of cognitive educational objectives (Bloom et al. 1956) incorporated critical thinking abilities. Ennis (1962) proposed 12 aspects of critical thinking as a basis for research on the teaching and evaluation of critical thinking ability.

Since 1980, an annual international conference in California on critical thinking and educational reform has attracted tens of thousands of educators from all levels of education and from many parts of the world. Also since 1980, the state university system in California has required all undergraduate students to take a critical thinking course. Since 1983, the Association for Informal Logic and Critical Thinking has sponsored sessions in conjunction with the divisional meetings of the American Philosophical Association (APA). In 1987, the APA’s Committee on Pre-College Philosophy commissioned a consensus statement on critical thinking for purposes of educational assessment and instruction (Facione 1990a). Researchers have developed standardized tests of critical thinking abilities and dispositions; for details, see the Supplement on Assessment. Educational jurisdictions around the world now include critical thinking in guidelines for curriculum and assessment.

For details on this history, see the Supplement on History.

2. Examples and Non-Examples

Before considering the definition of critical thinking, it will be helpful to have in mind some examples of critical thinking, as well as some examples of kinds of thinking that would apparently not count as critical thinking.

Dewey (1910: 68–71; 1933: 91–94) takes as paradigms of reflective thinking three class papers of students in which they describe their thinking. The examples range from the everyday to the scientific.

Transit : “The other day, when I was down town on 16th Street, a clock caught my eye. I saw that the hands pointed to 12:20. This suggested that I had an engagement at 124th Street, at one o’clock. I reasoned that as it had taken me an hour to come down on a surface car, I should probably be twenty minutes late if I returned the same way. I might save twenty minutes by a subway express. But was there a station near? If not, I might lose more than twenty minutes in looking for one. Then I thought of the elevated, and I saw there was such a line within two blocks. But where was the station? If it were several blocks above or below the street I was on, I should lose time instead of gaining it. My mind went back to the subway express as quicker than the elevated; furthermore, I remembered that it went nearer than the elevated to the part of 124th Street I wished to reach, so that time would be saved at the end of the journey. I concluded in favor of the subway, and reached my destination by one o’clock.” (Dewey 1910: 68–69; 1933: 91–92)

Ferryboat : “Projecting nearly horizontally from the upper deck of the ferryboat on which I daily cross the river is a long white pole, having a gilded ball at its tip. It suggested a flagpole when I first saw it; its color, shape, and gilded ball agreed with this idea, and these reasons seemed to justify me in this belief. But soon difficulties presented themselves. The pole was nearly horizontal, an unusual position for a flagpole; in the next place, there was no pulley, ring, or cord by which to attach a flag; finally, there were elsewhere on the boat two vertical staffs from which flags were occasionally flown. It seemed probable that the pole was not there for flag-flying.

“I then tried to imagine all possible purposes of the pole, and to consider for which of these it was best suited: (a) Possibly it was an ornament. But as all the ferryboats and even the tugboats carried poles, this hypothesis was rejected. (b) Possibly it was the terminal of a wireless telegraph. But the same considerations made this improbable. Besides, the more natural place for such a terminal would be the highest part of the boat, on top of the pilot house. (c) Its purpose might be to point out the direction in which the boat is moving.

“In support of this conclusion, I discovered that the pole was lower than the pilot house, so that the steersman could easily see it. Moreover, the tip was enough higher than the base, so that, from the pilot’s position, it must appear to project far out in front of the boat. Moreover, the pilot being near the front of the boat, he would need some such guide as to its direction. Tugboats would also need poles for such a purpose. This hypothesis was so much more probable than the others that I accepted it. I formed the conclusion that the pole was set up for the purpose of showing the pilot the direction in which the boat pointed, to enable him to steer correctly.” (Dewey 1910: 69–70; 1933: 92–93)

Bubbles : “In washing tumblers in hot soapsuds and placing them mouth downward on a plate, bubbles appeared on the outside of the mouth of the tumblers and then went inside. Why? The presence of bubbles suggests air, which I note must come from inside the tumbler. I see that the soapy water on the plate prevents escape of the air save as it may be caught in bubbles. But why should air leave the tumbler? There was no substance entering to force it out. It must have expanded. It expands by increase of heat, or by decrease of pressure, or both. Could the air have become heated after the tumbler was taken from the hot suds? Clearly not the air that was already entangled in the water. If heated air was the cause, cold air must have entered in transferring the tumblers from the suds to the plate. I test to see if this supposition is true by taking several more tumblers out. Some I shake so as to make sure of entrapping cold air in them. Some I take out holding mouth downward in order to prevent cold air from entering. Bubbles appear on the outside of every one of the former and on none of the latter. I must be right in my inference. Air from the outside must have been expanded by the heat of the tumbler, which explains the appearance of the bubbles on the outside. But why do they then go inside? Cold contracts. The tumbler cooled and also the air inside it. Tension was removed, and hence bubbles appeared inside. To be sure of this, I test by placing a cup of ice on the tumbler while the bubbles are still forming outside. They soon reverse” (Dewey 1910: 70–71; 1933: 93–94).

Dewey (1910, 1933) sprinkles his book with other examples of critical thinking. We will refer to the following.

Weather: A man on a walk notices that it has suddenly become cool, thinks that it is probably going to rain, looks up and sees a dark cloud obscuring the sun, and quickens his steps (1910: 6–10; 1933: 9–13).

Disorder: A man finds his rooms on his return to them in disorder with his belongings thrown about, thinks at first of burglary as an explanation, then thinks of mischievous children as being an alternative explanation, then looks to see whether valuables are missing, and discovers that they are (1910: 82–83; 1933: 166–168).

Typhoid: A physician diagnosing a patient whose conspicuous symptoms suggest typhoid avoids drawing a conclusion until more data are gathered by questioning the patient and by making tests (1910: 85–86; 1933: 170).

Blur: A moving blur catches our eye in the distance, we ask ourselves whether it is a cloud of whirling dust or a tree moving its branches or a man signaling to us, we think of other traits that should be found on each of those possibilities, and we look and see if those traits are found (1910: 102, 108; 1933: 121, 133).

Suction pump : In thinking about the suction pump, the scientist first notes that it will draw water only to a maximum height of 33 feet at sea level and to a lesser maximum height at higher elevations, selects for attention the differing atmospheric pressure at these elevations, sets up experiments in which the air is removed from a vessel containing water (when suction no longer works) and in which the weight of air at various levels is calculated, compares the results of reasoning about the height to which a given weight of air will allow a suction pump to raise water with the observed maximum height at different elevations, and finally assimilates the suction pump to such apparently different phenomena as the siphon and the rising of a balloon (1910: 150–153; 1933: 195–198).

Diamond: A passenger in a car driving in a diamond lane reserved for vehicles with at least one passenger notices that the diamond marks on the pavement are far apart in some places and close together in others. Why? The driver suggests that the reason may be that the diamond marks are not needed where there is a solid double line separating the diamond lane from the adjoining lane, but are needed when there is a dotted single line permitting crossing into the diamond lane. Further observation confirms that the diamonds are close together when a dotted line separates the diamond lane from its neighbour, but otherwise far apart.

Rash: A woman suddenly develops a very itchy red rash on her throat and upper chest. She recently noticed a mark on the back of her right hand, but was not sure whether the mark was a rash or a scrape. She lies down in bed and thinks about what might be causing the rash and what to do about it. About two weeks before, she began taking blood pressure medication that contained a sulfa drug, and the pharmacist had warned her, in view of a previous allergic reaction to a medication containing a sulfa drug, to be on the alert for an allergic reaction; however, she had been taking the medication for two weeks with no such effect. The day before, she began using a new cream on her neck and upper chest; against the new cream as the cause was the mark on the back of her hand, which had not been exposed to the cream. She began taking probiotics about a month before. She also recently started new eye drops, but she supposed that manufacturers of eye drops would be careful not to include allergy-causing components in the medication. The rash might be a heat rash, since she recently was sweating profusely from her upper body. Since she is about to go away on a short vacation, where she would not have access to her usual physician, she decides to keep taking the probiotics and using the new eye drops but to discontinue the blood pressure medication and to switch back to the old cream for her neck and upper chest. She forms a plan to consult her regular physician on her return about the blood pressure medication.

Candidate: Although Dewey included no examples of thinking directed at appraising the arguments of others, such thinking has come to be considered a kind of critical thinking. We find an example of such thinking in the performance task on the Collegiate Learning Assessment (CLA+), which its sponsoring organization describes as

a performance-based assessment that provides a measure of an institution’s contribution to the development of critical-thinking and written communication skills of its students. (Council for Aid to Education 2017)

A sample task posted on its website requires the test-taker to write a report for public distribution evaluating a fictional candidate’s policy proposals and their supporting arguments, using supplied background documents, with a recommendation on whether to endorse the candidate.

Immediate acceptance of an idea that suggests itself as a solution to a problem (e.g., a possible explanation of an event or phenomenon, an action that seems likely to produce a desired result) is “uncritical thinking, the minimum of reflection” (Dewey 1910: 13). On-going suspension of judgment in the light of doubt about a possible solution is not critical thinking (Dewey 1910: 108). Critique driven by a dogmatically held political or religious ideology is not critical thinking; thus Paulo Freire (1968 [1970]) is using the term (e.g., at 1970: 71, 81, 100, 146) in a more politically freighted sense that includes not only reflection but also revolutionary action against oppression. Derivation of a conclusion from given data using an algorithm is not critical thinking.

3. The Definition of Critical Thinking

What is critical thinking? There are many definitions. Ennis (2016) lists 14 philosophically oriented scholarly definitions and three dictionary definitions. Following Rawls (1971), who distinguished his conception of justice from a utilitarian conception but regarded them as rival conceptions of the same concept, Ennis maintains that the 17 definitions are different conceptions of the same concept. Rawls articulated the shared concept of justice as

a characteristic set of principles for assigning basic rights and duties and for determining… the proper distribution of the benefits and burdens of social cooperation. (Rawls 1971: 5)

Bailin et al. (1999b) claim that, if one considers what sorts of thinking an educator would take not to be critical thinking and what sorts to be critical thinking, one can conclude that educators typically understand critical thinking to have at least three features.

  • It is done for the purpose of making up one’s mind about what to believe or do.
  • The person engaging in the thinking is trying to fulfill standards of adequacy and accuracy appropriate to the thinking.
  • The thinking fulfills the relevant standards to some threshold level.

One could sum up the core concept that involves these three features by saying that critical thinking is careful goal-directed thinking. This core concept seems to apply to all the examples of critical thinking described in the previous section. As for the non-examples, their exclusion depends on construing careful thinking as excluding jumping immediately to conclusions, suspending judgment no matter how strong the evidence, reasoning from an unquestioned ideological or religious perspective, and routinely using an algorithm to answer a question.

If the core of critical thinking is careful goal-directed thinking, conceptions of it can vary according to its presumed scope, its presumed goal, one’s criteria and threshold for being careful, and the thinking component on which one focuses. As to its scope, some conceptions (e.g., Dewey 1910, 1933) restrict it to constructive thinking on the basis of one’s own observations and experiments, others (e.g., Ennis 1962; Fisher & Scriven 1997; Johnson 1992) to appraisal of the products of such thinking. Ennis (1991) and Bailin et al. (1999b) take it to cover both construction and appraisal. As to its goal, some conceptions restrict it to forming a judgment (Dewey 1910, 1933; Lipman 1987; Facione 1990a). Others allow for actions as well as beliefs as the end point of a process of critical thinking (Ennis 1991; Bailin et al. 1999b). As to the criteria and threshold for being careful, definitions vary in the term used to indicate that critical thinking satisfies certain norms: “intellectually disciplined” (Scriven & Paul 1987), “reasonable” (Ennis 1991), “skillful” (Lipman 1987), “skilled” (Fisher & Scriven 1997), “careful” (Bailin & Battersby 2009). Some definitions specify these norms, referring variously to “consideration of any belief or supposed form of knowledge in the light of the grounds that support it and the further conclusions to which it tends” (Dewey 1910, 1933); “the methods of logical inquiry and reasoning” (Glaser 1941); “conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication” (Scriven & Paul 1987); the requirement that “it is sensitive to context, relies on criteria, and is self-correcting” (Lipman 1987); “evidential, conceptual, methodological, criteriological, or contextual considerations” (Facione 1990a); and “plus-minus considerations of the product in terms of appropriate standards (or criteria)” (Johnson 1992). Stanovich and Stanovich (2010) propose to ground the concept of critical thinking in the concept of rationality, which they understand as combining epistemic rationality (fitting one’s beliefs to the world) and instrumental rationality (optimizing goal fulfillment); a critical thinker, in their view, is someone with “a propensity to override suboptimal responses from the autonomous mind” (2010: 227). These variant specifications of norms for critical thinking are not necessarily incompatible with one another, and in any case presuppose the core notion of thinking carefully. As to the thinking component singled out, some definitions focus on suspension of judgment during the thinking (Dewey 1910; McPeck 1981), others on inquiry while judgment is suspended (Bailin & Battersby 2009, 2021), others on the resulting judgment (Facione 1990a), and still others on responsiveness to reasons (Siegel 1988). Kuhn (2019) takes critical thinking to be more a dialogic practice of advancing and responding to arguments than an individual ability.

In educational contexts, a definition of critical thinking is a “programmatic definition” (Scheffler 1960: 19). It expresses a practical program for achieving an educational goal. For this purpose, a one-sentence formulaic definition is much less useful than articulation of a critical thinking process, with criteria and standards for the kinds of thinking that the process may involve. The real educational goal is recognition, adoption and implementation by students of those criteria and standards. That adoption and implementation in turn consists in acquiring the knowledge, abilities and dispositions of a critical thinker.

Conceptions of critical thinking generally do not include moral integrity as part of the concept. Dewey, for example, took critical thinking to be the ultimate intellectual goal of education, but distinguished it from the development of social cooperation among school children, which he took to be the central moral goal. Ennis (1996, 2011) added to his previous list of critical thinking dispositions a group of dispositions to care about the dignity and worth of every person, which he described as a “correlative” (1996) disposition without which critical thinking would be less valuable and perhaps harmful. An educational program that aimed at developing critical thinking but not the correlative disposition to care about the dignity and worth of every person, he asserted, “would be deficient and perhaps dangerous” (Ennis 1996: 172).

4. Its Value

Dewey thought that education for reflective thinking would be of value to both the individual and society; recognition in educational practice of the kinship to the scientific attitude of children’s native curiosity, fertile imagination and love of experimental inquiry “would make for individual happiness and the reduction of social waste” (Dewey 1910: iii). Schools participating in the Eight-Year Study took development of the habit of reflective thinking and skill in solving problems as a means to leading young people to understand, appreciate and live the democratic way of life characteristic of the United States (Aikin 1942: 17–18, 81). Harvey Siegel (1988: 55–61) has offered four considerations in support of adopting critical thinking as an educational ideal. (1) Respect for persons requires that schools and teachers honour students’ demands for reasons and explanations, deal with students honestly, and recognize the need to confront students’ independent judgment; these requirements concern the manner in which teachers treat students. (2) Education has the task of preparing children to be successful adults, a task that requires development of their self-sufficiency. (3) Education should initiate children into the rational traditions in such fields as history, science and mathematics. (4) Education should prepare children to become democratic citizens, which requires reasoned procedures and critical talents and attitudes. To supplement these considerations, Siegel (1988: 62–90) responds to two objections: the ideology objection that adoption of any educational ideal requires a prior ideological commitment and the indoctrination objection that cultivation of critical thinking cannot escape being a form of indoctrination.

Despite the diversity of our 11 examples, one can recognize a common pattern. Dewey analyzed it as consisting of five phases:

  • suggestions, in which the mind leaps forward to a possible solution;
  • an intellectualization of the difficulty or perplexity into a problem to be solved, a question for which the answer must be sought;
  • the use of one suggestion after another as a leading idea, or hypothesis, to initiate and guide observation and other operations in collection of factual material;
  • the mental elaboration of the idea or supposition as an idea or supposition (reasoning, in the sense in which reasoning is a part, not the whole, of inference); and
  • testing the hypothesis by overt or imaginative action. (Dewey 1933: 106–107; italics in original)

The process of reflective thinking consisting of these phases would be preceded by a perplexed, troubled or confused situation and followed by a cleared-up, unified, resolved situation (Dewey 1933: 106). The term ‘phases’ replaced the term ‘steps’ (Dewey 1910: 72), thus removing the earlier suggestion of an invariant sequence. Variants of the above analysis appeared in (Dewey 1916: 177) and (Dewey 1938: 101–119).

The variant formulations indicate the difficulty of giving a single logical analysis of such a varied process. The process of critical thinking may have a spiral pattern, with the problem being redefined in the light of obstacles to solving it as originally formulated. For example, the person in Transit might have concluded that getting to the appointment at the scheduled time was impossible and have reformulated the problem as that of rescheduling the appointment for a mutually convenient time. Further, defining a problem does not always follow after or lead immediately to an idea of a suggested solution. Nor should it do so, as Dewey himself recognized in describing the physician in Typhoid as avoiding any strong preference for this or that conclusion before getting further information (Dewey 1910: 85; 1933: 170). People with a hypothesis in mind, even one to which they have a very weak commitment, have a so-called “confirmation bias” (Nickerson 1998): they are likely to pay attention to evidence that confirms the hypothesis and to ignore evidence that counts against it or for some competing hypothesis. Detectives, intelligence agencies, and investigators of airplane accidents are well advised to gather relevant evidence systematically and to postpone even tentative adoption of an explanatory hypothesis until the collected evidence rules out with the appropriate degree of certainty all but one explanation. Dewey’s analysis of the critical thinking process can be faulted as well for requiring acceptance or rejection of a possible solution to a defined problem, with no allowance for deciding in the light of the available evidence to suspend judgment. Further, given the great variety of kinds of problems for which reflection is appropriate, there is likely to be variation in its component events. Perhaps the best way to conceptualize the critical thinking process is as a checklist whose component events can occur in a variety of orders, selectively, and more than once. These component events might include (1) noticing a difficulty, (2) defining the problem, (3) dividing the problem into manageable sub-problems, (4) formulating a variety of possible solutions to the problem or sub-problem, (5) determining what evidence is relevant to deciding among possible solutions to the problem or sub-problem, (6) devising a plan of systematic observation or experiment that will uncover the relevant evidence, (7) carrying out the plan of systematic observation or experimentation, (8) noting the results of the systematic observation or experiment, (9) gathering relevant testimony and information from others, (10) judging the credibility of testimony and information gathered from others, (11) drawing conclusions from gathered evidence and accepted testimony, and (12) accepting a solution that the evidence adequately supports (cf. Hitchcock 2017: 485).
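
To make the checklist conception concrete, here is a minimal sketch (an illustration added for this guide, not anything proposed in the sources cited): it treats the component events as data and records an episode of reflection as a sequence drawn from them, so that nothing constrains their order, selection, or repetition, and a spiral episode in which the problem is redefined is just another admissible sequence.

```python
# A purely illustrative model of the checklist conception: the component
# events are data, and an episode of critical thinking is any sequence
# drawn from them -- in any order, selectively, and possibly repeatedly.
CHECKLIST = [
    "notice a difficulty",
    "define the problem",
    "divide the problem into sub-problems",
    "formulate possible solutions",
    "determine what evidence is relevant",
    "devise a plan of observation or experiment",
    "carry out the plan",
    "note the results",
    "gather testimony and information from others",
    "judge the credibility of testimony and information",
    "draw conclusions from evidence and accepted testimony",
    "accept an adequately supported solution",
]

def check(episode):
    """Verify that every event in a recorded episode is on the checklist;
    order, selection, and repetition are deliberately left unconstrained."""
    unknown = [event for event in episode if event not in CHECKLIST]
    return "ok" if not unknown else f"unknown events: {unknown}"

# A spiral episode: the problem is redefined after an obstacle emerges.
episode = [
    "notice a difficulty",
    "define the problem",
    "formulate possible solutions",
    "define the problem",            # redefined in light of an obstacle
    "formulate possible solutions",
    "determine what evidence is relevant",
    "draw conclusions from evidence and accepted testimony",
    "accept an adequately supported solution",
]
print(check(episode))  # -> ok
```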

Checklist conceptions of the process of critical thinking are open to the objection that they are too mechanical and procedural to fit the multi-dimensional and emotionally charged issues for which critical thinking is urgently needed (Paul 1984). For such issues, a more dialectical process is advocated, in which competing relevant world views are identified, their implications explored, and some sort of creative synthesis attempted.

If one considers the critical thinking process illustrated by the 11 examples, one can identify distinct kinds of mental acts and mental states that form part of it. To distinguish, label and briefly characterize these components is a useful preliminary to identifying abilities, skills, dispositions, attitudes, habits and the like that contribute causally to thinking critically. Identifying such abilities and habits is in turn a useful preliminary to setting educational goals. Setting the goals is in its turn a useful preliminary to designing strategies for helping learners to achieve the goals and to designing ways of measuring the extent to which learners have done so. Such measures provide both feedback to learners on their achievement and a basis for experimental research on the effectiveness of various strategies for educating people to think critically. Let us begin, then, by distinguishing the kinds of mental acts and mental events that can occur in a critical thinking process.

  • Observing: One notices something in one’s immediate environment (sudden cooling of temperature in Weather, bubbles forming outside a glass and then going inside in Bubbles, a moving blur in the distance in Blur, a rash in Rash). Or one notes the results of an experiment or systematic observation (valuables missing in Disorder, no suction without air pressure in Suction pump).
  • Feeling: One feels puzzled or uncertain about something (how to get to an appointment on time in Transit, why the diamonds vary in spacing in Diamond). One wants to resolve this perplexity. One feels satisfaction once one has worked out an answer (to take the subway express in Transit, diamonds closer when needed as a warning in Diamond).
  • Wondering: One formulates a question to be addressed (why bubbles form outside a tumbler taken from hot water in Bubbles, how suction pumps work in Suction pump, what caused the rash in Rash).
  • Imagining: One thinks of possible answers (bus or subway or elevated in Transit, flagpole or ornament or wireless communication aid or direction indicator in Ferryboat, allergic reaction or heat rash in Rash).
  • Inferring: One works out what would be the case if a possible answer were assumed (valuables missing if there has been a burglary in Disorder, earlier start to the rash if it is an allergic reaction to a sulfa drug in Rash). Or one draws a conclusion once sufficient relevant evidence is gathered (take the subway in Transit, burglary in Disorder, discontinue blood pressure medication and new cream in Rash).
  • Knowledge: One uses stored knowledge of the subject-matter to generate possible answers or to infer what would be expected on the assumption of a particular answer (knowledge of a city’s public transit system in Transit, of the requirements for a flagpole in Ferryboat, of Boyle’s law in Bubbles, of allergic reactions in Rash).
  • Experimenting: One designs and carries out an experiment or a systematic observation to find out whether the results deduced from a possible answer will occur (looking at the location of the flagpole in relation to the pilot’s position in Ferryboat, putting an ice cube on top of a tumbler taken from hot water in Bubbles, measuring the height to which a suction pump will draw water at different elevations in Suction pump, noticing the spacing of diamonds when movement to or from a diamond lane is allowed in Diamond).
  • Consulting: One finds a source of information, gets the information from the source, and makes a judgment on whether to accept it. None of our 11 examples include searching for sources of information. In this respect they are unrepresentative, since most people nowadays have almost instant access to information relevant to answering any question, including many of those illustrated by the examples. However, Candidate includes the activities of extracting information from sources and evaluating its credibility.
  • Identifying and analyzing arguments: One notices an argument and works out its structure and content as a preliminary to evaluating its strength. This activity is central to Candidate. It is an important part of a critical thinking process in which one surveys arguments for various positions on an issue.
  • Judging: One makes a judgment on the basis of accumulated evidence and reasoning, such as the judgment in Ferryboat that the purpose of the pole is to provide direction to the pilot.
  • Deciding: One makes a decision on what to do or on what policy to adopt, as in the decision in Transit to take the subway.

By definition, a person who does something voluntarily is both willing and able to do that thing at that time. Both the willingness and the ability contribute causally to the person’s action, in the sense that the voluntary action would not occur if either (or both) of these were lacking. For example, suppose that one is standing with one’s arms at one’s sides and one voluntarily lifts one’s right arm to an extended horizontal position. One would not do so if one were unable to lift one’s arm, if for example one’s right side was paralyzed as the result of a stroke. Nor would one do so if one were unwilling to lift one’s arm, if for example one were participating in a street demonstration at which a white supremacist was urging the crowd to lift their right arm in a Nazi salute and one were unwilling to express support in this way for the racist Nazi ideology. The same analysis applies to a voluntary mental process of thinking critically. It requires both willingness and ability to think critically, including willingness and ability to perform each of the mental acts that compose the process and to coordinate those acts in a sequence that is directed at resolving the initiating perplexity.

Consider willingness first. We can identify causal contributors to willingness to think critically by considering factors that would cause a person who was able to think critically about an issue nevertheless not to do so (Hamby 2014). For each factor, the opposite condition thus contributes causally to willingness to think critically on a particular occasion. For example, people who habitually jump to conclusions without considering alternatives will not think critically about issues that arise, even if they have the required abilities. The contrary condition of willingness to suspend judgment is thus a causal contributor to thinking critically.

Now consider ability. In contrast to the ability to move one’s arm, which can be completely absent because a stroke has left the arm paralyzed, the ability to think critically is a developed ability, whose absence is not a complete absence of ability to think but absence of ability to think well. We can identify the ability to think well directly, in terms of the norms and standards for good thinking. In general, to be able to do well the thinking activities that can be components of a critical thinking process, one needs to know the concepts and principles that characterize their good performance, to recognize in particular cases that the concepts and principles apply, and to apply them. The knowledge, recognition and application may be procedural rather than declarative. It may be domain-specific rather than widely applicable, and in either case may need subject-matter knowledge, sometimes of a deep kind.

Reflections of the sort illustrated by the previous two paragraphs have led scholars to identify the knowledge, abilities and dispositions of a “critical thinker”, i.e., someone who thinks critically whenever it is appropriate to do so. We turn now to these three types of causal contributors to thinking critically. We start with dispositions, since arguably these are the most powerful contributors to being a critical thinker, can be fostered at an early stage of a child’s development, and are susceptible to general improvement (Glaser 1941: 175).

8. Critical Thinking Dispositions

Educational researchers use the term ‘dispositions’ broadly for the habits of mind and attitudes that contribute causally to being a critical thinker. Some writers (e.g., Paul & Elder 2006; Hamby 2014; Bailin & Battersby 2016a) propose to use the term ‘virtues’ for this dimension of a critical thinker. The virtues in question, although they are virtues of character, concern the person’s ways of thinking rather than the person’s ways of behaving towards others. They are not moral virtues but intellectual virtues, of the sort articulated by Zagzebski (1996) and discussed by Turri, Alfano, and Greco (2017).

On a realistic conception, thinking dispositions or intellectual virtues are real properties of thinkers. They are general tendencies, propensities, or inclinations to think in particular ways in particular circumstances, and can be genuinely explanatory (Siegel 1999). Sceptics argue that there is no evidence for a specific mental basis for the habits of mind that contribute to thinking critically, and that it is pedagogically misleading to posit such a basis (Bailin et al. 1999a). Whatever their status, critical thinking dispositions need motivation for their initial formation in a child—motivation that may be external or internal. As children develop, the force of habit will gradually become important in sustaining the disposition (Nieto & Valenzuela 2012). Mere force of habit, however, is unlikely to sustain critical thinking dispositions. Critical thinkers must value and enjoy using their knowledge and abilities to think things through for themselves. They must be committed to, and lovers of, inquiry.

A person may have a critical thinking disposition with respect to only some kinds of issues. For example, one could be open-minded about scientific issues but not about religious issues. Similarly, one could be confident in one’s ability to reason about the theological implications of the existence of evil in the world but not in one’s ability to reason about the best design for a guided ballistic missile.

Facione (1990a: 25) divides “affective dispositions” of critical thinking into approaches to life and living in general and approaches to specific issues, questions or problems. Adapting this distinction, one can usefully divide critical thinking dispositions into initiating dispositions (those that contribute causally to starting to think critically about an issue) and internal dispositions (those that contribute causally to doing a good job of thinking critically once one has started). The two categories are not mutually exclusive. For example, open-mindedness, in the sense of willingness to consider alternative points of view to one’s own, is both an initiating and an internal disposition.

Using the strategy of considering factors that would block people with the ability to think critically from doing so, we can identify as initiating dispositions for thinking critically attentiveness, a habit of inquiry, self-confidence, courage, open-mindedness, willingness to suspend judgment, trust in reason, wanting evidence for one’s beliefs, and seeking the truth. We consider briefly what each of these dispositions amounts to, in each case citing sources that acknowledge them.

  • Attentiveness: One will not think critically if one fails to recognize an issue that needs to be thought through. For example, the pedestrian in Weather would not have looked up if he had not noticed that the air was suddenly cooler. To be a critical thinker, then, one needs to be habitually attentive to one’s surroundings, noticing not only what one senses but also sources of perplexity in messages received and in one’s own beliefs and attitudes (Facione 1990a: 25; Facione, Facione, & Giancarlo 2001).
  • Habit of inquiry: Inquiry is effortful, and one needs an internal push to engage in it. For example, the student in Bubbles could easily have stopped at idle wondering about the cause of the bubbles rather than reasoning to a hypothesis, then designing and executing an experiment to test it. Thus willingness to think critically needs mental energy and initiative. What can supply that energy? Love of inquiry, or perhaps just a habit of inquiry. Hamby (2015) has argued that willingness to inquire is the central critical thinking virtue, one that encompasses all the others. It is recognized as a critical thinking disposition by Dewey (1910: 29; 1933: 35), Glaser (1941: 5), Ennis (1987: 12; 1991: 8), Facione (1990a: 25), Bailin et al. (1999b: 294), Halpern (1998: 452), and Facione, Facione, & Giancarlo (2001).
  • Self-confidence: Lack of confidence in one’s abilities can block critical thinking. For example, if the woman in Rash lacked confidence in her ability to figure things out for herself, she might just have assumed that the rash on her chest was the allergic reaction to her medication against which the pharmacist had warned her. Thus willingness to think critically requires confidence in one’s ability to inquire (Facione 1990a: 25; Facione, Facione, & Giancarlo 2001).
  • Courage: Fear of thinking for oneself can stop one from doing it. Thus willingness to think critically requires intellectual courage (Paul & Elder 2006: 16).
  • Open-mindedness: A dogmatic attitude will impede thinking critically. For example, a person who adheres rigidly to a “pro-choice” position on the issue of the legal status of induced abortion is likely to be unwilling to consider seriously the issue of when in its development an unborn child acquires a moral right to life. Thus willingness to think critically requires open-mindedness, in the sense of a willingness to examine questions to which one already accepts an answer but which further evidence or reasoning might cause one to answer differently (Dewey 1933; Facione 1990a; Ennis 1991; Bailin et al. 1999b; Halpern 1998; Facione, Facione, & Giancarlo 2001). Paul (1981) emphasizes open-mindedness about alternative world-views, and recommends a dialectical approach to integrating such views as central to what he calls “strong sense” critical thinking. In three studies, Haran, Ritov, & Mellers (2013) found that actively open-minded thinking, including “the tendency to weigh new evidence against a favored belief, to spend sufficient time on a problem before giving up, and to consider carefully the opinions of others in forming one’s own”, led study participants to acquire information and thus to make accurate estimations.
  • Willingness to suspend judgment: Premature closure on an initial solution will block critical thinking. Thus willingness to think critically requires a willingness to suspend judgment while alternatives are explored (Facione 1990a; Ennis 1991; Halpern 1998).
  • Trust in reason: Since distrust in the processes of reasoned inquiry will dissuade one from engaging in it, trust in them is an initiating critical thinking disposition (Facione 1990a: 25; Bailin et al. 1999b: 294; Facione, Facione, & Giancarlo 2001; Paul & Elder 2006). In reaction to an allegedly exclusive emphasis on reason in critical thinking theory and pedagogy, Thayer-Bacon (2000) argues that intuition, imagination, and emotion have important roles to play in an adequate conception of critical thinking that she calls “constructive thinking”. From her point of view, critical thinking requires trust not only in reason but also in intuition, imagination, and emotion.
  • Seeking the truth: If one does not care about the truth but is content to stick with one’s initial bias on an issue, then one will not think critically about it. Seeking the truth is thus an initiating critical thinking disposition (Bailin et al. 1999b: 294; Facione, Facione, & Giancarlo 2001). A disposition to seek the truth is implicit in more specific critical thinking dispositions, such as trying to be well-informed, considering seriously points of view other than one’s own, looking for alternatives, suspending judgment when the evidence is insufficient, and adopting a position when the evidence supporting it is sufficient.

Some of the initiating dispositions, such as open-mindedness and willingness to suspend judgment, are also internal critical thinking dispositions, in the sense of mental habits or attitudes that contribute causally to doing a good job of critical thinking once one starts the process. But there are many other internal critical thinking dispositions. Some of them are parasitic on one’s conception of good thinking. For example, it is constitutive of good thinking about an issue to formulate the issue clearly and to maintain focus on it. For this purpose, one needs not only the corresponding ability but also the corresponding disposition. Ennis (1991: 8) describes it as the disposition “to determine and maintain focus on the conclusion or question”, Facione (1990a: 25) as “clarity in stating the question or concern”. Other internal dispositions are motivators to continue or adjust the critical thinking process, such as willingness to persist in a complex task and willingness to abandon nonproductive strategies in an attempt to self-correct (Halpern 1998: 452). For a list of identified internal critical thinking dispositions, see the Supplement on Internal Critical Thinking Dispositions .

Some theorists postulate skills, i.e., acquired abilities, as operative in critical thinking. It is not obvious, however, that a good mental act is the exercise of a generic acquired skill. Inferring an expected time of arrival, as in Transit , has some generic components but also uses non-generic subject-matter knowledge. Bailin et al. (1999a) argue against viewing critical thinking skills as generic and discrete, on the ground that skilled performance at a critical thinking task cannot be separated from knowledge of concepts and from domain-specific principles of good thinking. Talk of skills, they concede, is unproblematic if it means merely that a person with critical thinking skills is capable of intelligent performance.

Despite such scepticism, theorists of critical thinking have listed as general contributors to critical thinking what they variously call abilities (Glaser 1941; Ennis 1962, 1991), skills (Facione 1990a; Halpern 1998) or competencies (Fisher & Scriven 1997). Amalgamating these lists would produce a confusing and chaotic cornucopia of more than 50 possible educational objectives, with only partial overlap among them. It makes sense instead to try to understand the reasons for the multiplicity and diversity, and to make a selection according to one’s own reasons for singling out abilities to be developed in a critical thinking curriculum. Two reasons for diversity among lists of critical thinking abilities are the underlying conception of critical thinking and the envisaged educational level. Appraisal-only conceptions, for example, involve a different suite of abilities than constructive-only conceptions. Some lists, such as those in (Glaser 1941), are put forward as educational objectives for secondary school students, whereas others are proposed as objectives for college students (e.g., Facione 1990a).

The abilities described in the remaining paragraphs of this section emerge from reflection on the general abilities needed to do well the thinking activities identified in section 6 as components of the critical thinking process described in section 5 . The derivation of each collection of abilities is accompanied by citation of sources that list such abilities and of standardized tests that claim to test them.

Observational abilities: Careful and accurate observation sometimes requires specialist expertise and practice, as in the case of observing birds and observing accident scenes. However, there are general abilities of noticing what one’s senses are picking up from one’s environment and of being able to articulate clearly and accurately to oneself and others what one has observed. It helps in exercising them to be able to recognize and take into account factors that make one’s observation less trustworthy, such as prior framing of the situation, inadequate time, deficient senses, poor observation conditions, and the like. It helps as well to be skilled at taking steps to make one’s observation more trustworthy, such as moving closer to get a better look, measuring something three times and taking the average, and checking what one thinks one is observing with someone else who is in a good position to observe it. It also helps to be skilled at recognizing respects in which one’s report of one’s observation involves inference rather than direct observation, so that one can then consider whether the inference is justified. These abilities come into play as well when one thinks about whether and with what degree of confidence to accept an observation report, for example in the study of history or in a criminal investigation or in assessing news reports. Observational abilities show up in some lists of critical thinking abilities (Ennis 1962: 90; Facione 1990a: 16; Ennis 1991: 9). There are items testing a person’s ability to judge the credibility of observation reports in the Cornell Critical Thinking Tests, Levels X and Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). Norris and King (1983, 1985, 1990a, 1990b) developed a test of the ability to appraise observation reports.

Emotional abilities : The emotions that drive a critical thinking process are perplexity or puzzlement, a wish to resolve it, and satisfaction at achieving the desired resolution. Children experience these emotions at an early age, without being trained to do so. Education that takes critical thinking as a goal needs only to channel these emotions and to make sure not to stifle them. Collaborative critical thinking benefits from ability to recognize one’s own and others’ emotional commitments and reactions.

Questioning abilities : A critical thinking process needs transformation of an inchoate sense of perplexity into a clear question. Formulating a question well requires not building in questionable assumptions, not prejudging the issue, and using language that in context is unambiguous and precise enough (Ennis 1962: 97; 1991: 9).

Imaginative abilities : Thinking directed at finding the correct causal explanation of a general phenomenon or particular event requires an ability to imagine possible explanations. Thinking about what policy or plan of action to adopt requires generation of options and consideration of possible consequences of each option. Domain knowledge is required for such creative activity, but a general ability to imagine alternatives is helpful and can be nurtured so as to become easier, quicker, more extensive, and deeper (Dewey 1910: 34–39; 1933: 40–47). Facione (1990a) and Halpern (1998) include the ability to imagine alternatives as a critical thinking ability.

Inferential abilities : The ability to draw conclusions from given information, and to recognize with what degree of certainty one’s own or others’ conclusions follow, is universally recognized as a general critical thinking ability. All 11 examples in section 2 of this article include inferences, some from hypotheses or options (as in Transit , Ferryboat and Disorder ), others from something observed (as in Weather and Rash ). None of these inferences is formally valid. Rather, they are licensed by general, sometimes qualified substantive rules of inference (Toulmin 1958) that rest on domain knowledge—that a bus trip takes about the same time in each direction, that the terminal of a wireless telegraph would be located on the highest possible place, that sudden cooling is often followed by rain, that an allergic reaction to a sulfa drug generally shows up soon after one starts taking it. It is a matter of controversy to what extent the specialized ability to deduce conclusions from premisses using formal rules of inference is needed for critical thinking. Dewey (1933) locates logical forms in setting out the products of reflection rather than in the process of reflection. Ennis (1981a), on the other hand, maintains that a liberally-educated person should have the following abilities: to translate natural-language statements into statements using the standard logical operators, to use appropriately the language of necessary and sufficient conditions, to deal with argument forms and arguments containing symbols, to determine whether in virtue of an argument’s form its conclusion follows necessarily from its premisses, to reason with logically complex propositions, and to apply the rules and procedures of deductive logic. Inferential abilities are recognized as critical thinking abilities by Glaser (1941: 6), Facione (1990a: 9), Ennis (1991: 9), Fisher & Scriven (1997: 99, 111), and Halpern (1998: 452). Items testing inferential abilities constitute two of the five subtests of the Watson Glaser Critical Thinking Appraisal (Watson & Glaser 1980a, 1980b, 1994), two of the four sections in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), three of the seven sections in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), 11 of the 34 items on Forms A and B of the California Critical Thinking Skills Test (Facione 1990b, 1992), and a high but variable proportion of the 25 selected-response questions in the Collegiate Learning Assessment (Council for Aid to Education 2017).
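
The narrowly formal part of these inferential abilities, determining whether in virtue of an argument’s form its conclusion follows necessarily from its premisses, can be illustrated with a small truth-table check. The sketch below is only an illustration of that single notion; the function name and the encoding of premisses as Boolean functions are choices made here for the example, not anything drawn from the cited sources or tests.

```python
from itertools import product

def formally_valid(premises, conclusion, variables):
    """Return True if the conclusion is true in every row of the truth
    table in which all the premises are true (i.e., no counterexample)."""
    for values in product([True, False], repeat=len(variables)):
        row = dict(zip(variables, values))
        if all(p(row) for p in premises) and not conclusion(row):
            return False  # found a row that is a counterexample
    return True

# Modus ponens: if p then q; p; therefore q  (formally valid)
print(formally_valid([lambda r: (not r["p"]) or r["q"], lambda r: r["p"]],
                     lambda r: r["q"], ["p", "q"]))   # True

# Affirming the consequent: if p then q; q; therefore p  (not valid)
print(formally_valid([lambda r: (not r["p"]) or r["q"], lambda r: r["q"]],
                     lambda r: r["p"], ["p", "q"]))   # False
```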

Experimenting abilities : Knowing how to design and execute an experiment is important not just in scientific research but also in everyday life, as in Rash . Dewey devoted a whole chapter of his How We Think (1910: 145–156; 1933: 190–202) to the superiority of experimentation over observation in advancing knowledge. Experimenting abilities come into play at one remove in appraising reports of scientific studies. Skill in designing and executing experiments includes the acknowledged abilities to appraise evidence (Glaser 1941: 6), to carry out experiments and to apply appropriate statistical inference techniques (Facione 1990a: 9), to judge inductions to an explanatory hypothesis (Ennis 1991: 9), and to recognize the need for an adequately large sample size (Halpern 1998). The Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) includes four items (out of 52) on experimental design. The Collegiate Learning Assessment (Council for Aid to Education 2017) makes room for appraisal of study design in both its performance task and its selected-response questions.

Consulting abilities : Skill at consulting sources of information comes into play when one seeks information to help resolve a problem, as in Candidate . Ability to find and appraise information includes ability to gather and marshal pertinent information (Glaser 1941: 6), to judge whether a statement made by an alleged authority is acceptable (Ennis 1962: 84), to plan a search for desired information (Facione 1990a: 9), and to judge the credibility of a source (Ennis 1991: 9). Ability to judge the credibility of statements is tested by 24 items (out of 76) in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) and by four items (out of 52) in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). The College Learning Assessment’s performance task requires evaluation of whether information in documents is credible or unreliable (Council for Aid to Education 2017).

Argument analysis abilities : The ability to identify and analyze arguments contributes to the process of surveying arguments on an issue in order to form one’s own reasoned judgment, as in Candidate . The ability to detect and analyze arguments is recognized as a critical thinking skill by Facione (1990a: 7–8), Ennis (1991: 9) and Halpern (1998). Five items (out of 34) on the California Critical Thinking Skills Test (Facione 1990b, 1992) test skill at argument analysis. The College Learning Assessment (Council for Aid to Education 2017) incorporates argument analysis in its selected-response tests of critical reading and evaluation and of critiquing an argument.

Judging skills and deciding skills : Skill at judging and deciding is skill at recognizing what judgment or decision the available evidence and argument supports, and with what degree of confidence. It is thus a component of the inferential skills already discussed.

Lists and tests of critical thinking abilities often include two more abilities: identifying assumptions and constructing and evaluating definitions.

In addition to dispositions and abilities, critical thinking needs knowledge: of critical thinking concepts, of critical thinking principles, and of the subject-matter of the thinking.

We can derive a short list of concepts whose understanding contributes to critical thinking from the critical thinking abilities described in the preceding section. Observational abilities require an understanding of the difference between observation and inference. Questioning abilities require an understanding of the concepts of ambiguity and vagueness. Inferential abilities require an understanding of the difference between conclusive and defeasible inference (traditionally, between deduction and induction), as well as of the difference between necessary and sufficient conditions. Experimenting abilities require an understanding of the concepts of hypothesis, null hypothesis, assumption and prediction, as well as of the concept of statistical significance and of its difference from importance. They also require an understanding of the difference between an experiment and an observational study, and in particular of the difference between a randomized controlled trial, a prospective correlational study and a retrospective (case-control) study. Argument analysis abilities require an understanding of the concepts of argument, premiss, assumption, conclusion and counter-consideration. Additional critical thinking concepts are proposed by Bailin et al. (1999b: 293), Fisher & Scriven (1997: 105–106), Black (2012), and Blair (2021).
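
One of these concepts, the difference between statistical significance and importance, lends itself to a small numerical illustration. The sketch below uses invented data and an assumed helper function (the z-approximation and the group parameters are choices made for the example, not anything from the sources cited): with a very large sample, a practically trivial difference comes out as statistically significant, while the standardized effect size (Cohen’s d) remains far below what Cohen (1988) treats as even a small effect.

```python
import math
import random

random.seed(0)

def p_and_effect(sample_a, sample_b):
    """Two-sided z-test p-value and Cohen's d for two samples
    (a rough sketch; with samples this large the z approximation is fine)."""
    na, nb = len(sample_a), len(sample_b)
    ma, mb = sum(sample_a) / na, sum(sample_b) / nb
    va = sum((x - ma) ** 2 for x in sample_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in sample_b) / (nb - 1)
    z = (ma - mb) / math.sqrt(va / na + vb / nb)
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return p, (ma - mb) / pooled_sd   # p-value, Cohen's d

# Two invented groups whose true means differ by only 0.05 standard deviations.
group_a = [random.gauss(100.0, 15.0) for _ in range(50000)]
group_b = [random.gauss(100.75, 15.0) for _ in range(50000)]

p, d = p_and_effect(group_b, group_a)
print(f"p = {p:.4f}, Cohen's d = {d:.2f}")
# With samples this large the tiny difference is statistically significant
# (p far below 0.05), yet d is only about 0.05 -- negligible in practical terms.
```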

According to Glaser (1941: 25), ability to think critically requires knowledge of the methods of logical inquiry and reasoning. If we review the list of abilities in the preceding section, however, we can see that some of them can be acquired and exercised merely through practice, possibly guided in an educational setting, followed by feedback. Searching intelligently for a causal explanation of some phenomenon or event requires that one consider a full range of possible causal contributors, but it seems more important that one implements this principle in one’s practice than that one is able to articulate it. What is important is “operational knowledge” of the standards and principles of good thinking (Bailin et al. 1999b: 291–293). But the development of such critical thinking abilities as designing an experiment or constructing an operational definition can benefit from learning their underlying theory. Further, explicit knowledge of quirks of human thinking seems useful as a cautionary guide. Human memory is not just fallible about details, as people learn from their own experiences of misremembering, but is so malleable that a detailed, clear and vivid recollection of an event can be a total fabrication (Loftus 2017). People seek or interpret evidence in ways that are partial to their existing beliefs and expectations, often unconscious of their “confirmation bias” (Nickerson 1998). Not only are people subject to this and other cognitive biases (Kahneman 2011), of which they are typically unaware, but it may be counter-productive for one to make oneself aware of them and try consciously to counteract them or to counteract social biases such as racial or sexual stereotypes (Kenyon & Beaulac 2014). It is helpful to be aware of these facts and of the superior effectiveness of blocking the operation of biases—for example, by making an immediate record of one’s observations, refraining from forming a preliminary explanatory hypothesis, blind refereeing, double-blind randomized trials, and blind grading of students’ work. It is also helpful to be aware of the prevalence of “noise” (unwanted unsystematic variability of judgments), of how to detect noise (through a noise audit), and of how to reduce noise: make accuracy the goal, think statistically, break a process of arriving at a judgment into independent tasks, resist premature intuitions, in a group get independent judgments first, favour comparative judgments and scales (Kahneman, Sibony, & Sunstein 2021). It is helpful as well to be aware of the concept of “bounded rationality” in decision-making and of the related distinction between “satisficing” and optimizing (Simon 1956; Gigerenzer 2001).
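
The noise audit mentioned in this paragraph can also be given a toy numerical form. The sketch below uses entirely invented names and numbers and is not a procedure taken from Kahneman, Sibony, & Sunstein (2021); it simply treats the spread of several assessors’ judgments of the same case as a rough measure of noise, and then averages judgments that were made independently, one simple way of combining the independent judgments the text recommends obtaining first.

```python
from statistics import mean, pstdev

# Hypothetical data: three assessors independently rate the same four cases
# on a 0-100 scale.  All values are invented for illustration.
judgments = {
    "case 1": [62, 45, 71],
    "case 2": [30, 28, 35],
    "case 3": [80, 55, 90],
    "case 4": [50, 52, 47],
}

# Noise audit: the spread of judgments of the *same* case is unwanted
# variability (noise), whatever the unknown correct answer may be.
for case, scores in judgments.items():
    print(f"{case}: mean = {mean(scores):.1f}, spread = {pstdev(scores):.1f}")

# After collecting independent judgments, aggregate them (here by averaging).
aggregated = {case: round(mean(scores), 1) for case, scores in judgments.items()}
print(aggregated)
```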

Critical thinking about an issue requires substantive knowledge of the domain to which the issue belongs. Critical thinking abilities are not a magic elixir that can be applied to any issue whatever by somebody who has no knowledge of the facts relevant to exploring that issue. For example, the student in Bubbles needed to know that gases do not penetrate solid objects like a glass, that air expands when heated, that the volume of an enclosed gas varies directly with its temperature and inversely with its pressure, and that hot objects will spontaneously cool down to the ambient temperature of their surroundings unless kept hot by insulation or a source of heat. Critical thinkers thus need a rich fund of subject-matter knowledge relevant to the variety of situations they encounter. This fact is recognized in the inclusion among critical thinking dispositions of a concern to become and remain generally well informed.

Experimental educational interventions, with control groups, have shown that education can improve critical thinking skills and dispositions, as measured by standardized tests. For information about these tests, see the Supplement on Assessment .

What educational methods are most effective at developing the dispositions, abilities and knowledge of a critical thinker? In a comprehensive meta-analysis of experimental and quasi-experimental studies of strategies for teaching students to think critically, Abrami et al. (2015) found that dialogue, anchored instruction, and mentoring each increased the effectiveness of the educational intervention, and that they were most effective when combined. They also found that in these studies a combination of separate instruction in critical thinking with subject-matter instruction in which students are encouraged to think critically was more effective than either by itself. However, the difference was not statistically significant; that is, it might have arisen by chance.

Most of these studies lack the longitudinal follow-up required to determine whether the observed differential improvements in critical thinking abilities or dispositions continue over time, for example until high school or college graduation. For details on studies of methods of developing critical thinking skills and dispositions, see the Supplement on Educational Methods .

12. Controversies

Scholars have denied the generalizability of critical thinking abilities across subject domains, have alleged bias in critical thinking theory and pedagogy, and have investigated the relationship of critical thinking to other kinds of thinking.

McPeck (1981) attacked the thinking skills movement of the 1970s, including the critical thinking movement. He argued that there are no general thinking skills, since thinking is always thinking about some subject-matter. It is futile, he claimed, for schools and colleges to teach thinking as if it were a separate subject. Rather, teachers should lead their pupils to become autonomous thinkers by teaching school subjects in a way that brings out their cognitive structure and that encourages and rewards discussion and argument. As some of his critics (e.g., Paul 1985; Siegel 1985) pointed out, McPeck’s central argument needs elaboration, since it has obvious counter-examples in writing and speaking, for which (up to a certain level of complexity) there are teachable general abilities even though they are always about some subject-matter. To make his argument convincing, McPeck needs to explain how thinking differs from writing and speaking in a way that does not permit useful abstraction of its components from the subject-matters with which it deals. He has not done so. Nevertheless, his position that the dispositions and abilities of a critical thinker are best developed in the context of subject-matter instruction is shared by many theorists of critical thinking, including Dewey (1910, 1933), Glaser (1941), Passmore (1980), Weinstein (1990), Bailin et al. (1999b), and Willingham (2019).

McPeck’s challenge prompted reflection on the extent to which critical thinking is subject-specific. McPeck argued for a strong subject-specificity thesis, according to which it is a conceptual truth that all critical thinking abilities are specific to a subject. (He did not however extend his subject-specificity thesis to critical thinking dispositions. In particular, he took the disposition to suspend judgment in situations of cognitive dissonance to be a general disposition.) Conceptual subject-specificity is subject to obvious counter-examples, such as the general ability to recognize confusion of necessary and sufficient conditions. A more modest thesis, also endorsed by McPeck, is epistemological subject-specificity, according to which the norms of good thinking vary from one field to another. Epistemological subject-specificity clearly holds to a certain extent; for example, the principles in accordance with which one solves a differential equation are quite different from the principles in accordance with which one determines whether a painting is a genuine Picasso. But the thesis suffers, as Ennis (1989) points out, from vagueness of the concept of a field or subject and from the obvious existence of inter-field principles, however broadly the concept of a field is construed. For example, the principles of hypothetico-deductive reasoning hold for all the varied fields in which such reasoning occurs. A third kind of subject-specificity is empirical subject-specificity, according to which as a matter of empirically observable fact a person with the abilities and dispositions of a critical thinker in one area of investigation will not necessarily have them in another area of investigation.

The thesis of empirical subject-specificity raises the general problem of transfer. If critical thinking abilities and dispositions have to be developed independently in each school subject, how are they of any use in dealing with the problems of everyday life and the political and social issues of contemporary society, most of which do not fit into the framework of a traditional school subject? Proponents of empirical subject-specificity tend to argue that transfer is more likely to occur if there is critical thinking instruction in a variety of domains, with explicit attention to dispositions and abilities that cut across domains. But evidence for this claim is scanty. There is a need for well-designed empirical studies that investigate the conditions that make transfer more likely.

It is common ground in debates about the generality or subject-specificity of critical thinking dispositions and abilities that critical thinking about any topic requires background knowledge about the topic. For example, the most sophisticated understanding of the principles of hypothetico-deductive reasoning is of no help unless accompanied by some knowledge of what might be plausible explanations of some phenomenon under investigation.

Critics have objected to bias in the theory, pedagogy and practice of critical thinking. Commentators (e.g., Alston 1995; Ennis 1998) have noted that anyone who takes a position has a bias in the neutral sense of being inclined in one direction rather than others. The critics, however, are objecting to bias in the pejorative sense of an unjustified favoring of certain ways of knowing over others, frequently alleging that the unjustly favoured ways are those of a dominant sex or culture (Bailin 1995). These ways favour:

  • reinforcement of egocentric and sociocentric biases over dialectical engagement with opposing world-views (Paul 1981, 1984; Warren 1998)
  • distancing from the object of inquiry over closeness to it (Martin 1992; Thayer-Bacon 1992)
  • indifference to the situation of others over care for them (Martin 1992)
  • orientation to thought over orientation to action (Martin 1992)
  • being reasonable over caring to understand people’s ideas (Thayer-Bacon 1993)
  • being neutral and objective over being embodied and situated (Thayer-Bacon 1995a)
  • doubting over believing (Thayer-Bacon 1995b)
  • reason over emotion, imagination and intuition (Thayer-Bacon 2000)
  • solitary thinking over collaborative thinking (Thayer-Bacon 2000)
  • written and spoken assignments over other forms of expression (Alston 2001)
  • attention to written and spoken communications over attention to human problems (Alston 2001)
  • winning debates in the public sphere over making and understanding meaning (Alston 2001)

A common thread in this smorgasbord of accusations is dissatisfaction with focusing on the logical analysis and evaluation of reasoning and arguments. While these authors acknowledge that such analysis and evaluation is part of critical thinking and should be part of its conceptualization and pedagogy, they insist that it is only a part. Paul (1981), for example, bemoans the tendency of atomistic teaching of methods of analyzing and evaluating arguments to turn students into more able sophists, adept at finding fault with positions and arguments with which they disagree but even more entrenched in the egocentric and sociocentric biases with which they began. Martin (1992) and Thayer-Bacon (1992) cite with approval the self-reported intimacy with their subject-matter of leading researchers in biology and medicine, an intimacy that conflicts with the distancing allegedly recommended in standard conceptions and pedagogy of critical thinking. Thayer-Bacon (2000) contrasts the embodied and socially embedded learning of her elementary school students in a Montessori school, who used their imagination, intuition and emotions as well as their reason, with conceptions of critical thinking as

thinking that is used to critique arguments, offer justifications, and make judgments about what are the good reasons, or the right answers. (Thayer-Bacon 2000: 127–128)

Alston (2001) reports that her students in a women’s studies class were able to see the flaws in the Cinderella myth that pervades much romantic fiction but in their own romantic relationships still acted as if all failures were the woman’s fault and still accepted the notions of love at first sight and living happily ever after. Students, she writes, should

be able to connect their intellectual critique to a more affective, somatic, and ethical account of making risky choices that have sexist, racist, classist, familial, sexual, or other consequences for themselves and those both near and far… critical thinking that reads arguments, texts, or practices merely on the surface without connections to feeling/desiring/doing or action lacks an ethical depth that should infuse the difference between mere cognitive activity and something we want to call critical thinking. (Alston 2001: 34)

Some critics portray such biases as unfair to women. Thayer-Bacon (1992), for example, has charged modern critical thinking theory with being sexist, on the ground that it separates the self from the object and causes one to lose touch with one’s inner voice, and thus stigmatizes women, who (she asserts) link self to object and listen to their inner voice. Her charge does not imply that women as a group are on average less able than men to analyze and evaluate arguments. Facione (1990c) found no difference by sex in performance on his California Critical Thinking Skills Test. Kuhn (1991: 280–281) found no difference by sex in either the disposition or the competence to engage in argumentative thinking.

The critics propose a variety of remedies for the biases that they allege. In general, they do not propose to eliminate or downplay critical thinking as an educational goal. Rather, they propose to conceptualize critical thinking differently and to change its pedagogy accordingly. Their pedagogical proposals arise logically from their objections. They can be summarized as follows:

  • Focus on argument networks with dialectical exchanges reflecting contesting points of view rather than on atomic arguments, so as to develop “strong sense” critical thinking that transcends egocentric and sociocentric biases (Paul 1981, 1984).
  • Foster closeness to the subject-matter and feeling connected to others in order to inform a humane democracy (Martin 1992).
  • Develop “constructive thinking” as a social activity in a community of physically embodied and socially embedded inquirers with personal voices who value not only reason but also imagination, intuition and emotion (Thayer-Bacon 2000).
  • In developing critical thinking in school subjects, treat as important neither skills nor dispositions but opening worlds of meaning (Alston 2001).
  • Attend to the development of critical thinking dispositions as well as skills, and adopt the “critical pedagogy” practised and advocated by Freire (1968 [1970]) and hooks (1994) (Dalgleish, Girard, & Davies 2017).

A common thread in these proposals is treatment of critical thinking as a social, interactive, personally engaged activity like that of a quilting bee or a barn-raising (Thayer-Bacon 2000) rather than as an individual, solitary, distanced activity symbolized by Rodin’s The Thinker. One can get a vivid description of education with the former type of goal from the writings of bell hooks (1994, 2010). Critical thinking for her is open-minded dialectical exchange across opposing standpoints and from multiple perspectives, a conception similar to Paul’s “strong sense” critical thinking (Paul 1981). She abandons the structure of domination in the traditional classroom. In an introductory course on black women writers, for example, she assigns students to write an autobiographical paragraph about an early racial memory, then to read it aloud as the others listen, thus affirming the uniqueness and value of each voice and creating a communal awareness of the diversity of the group’s experiences (hooks 1994: 84). Her “engaged pedagogy” is thus similar to the “freedom under guidance” implemented in John Dewey’s Laboratory School of Chicago in the late 1890s and early 1900s. It incorporates the dialogue, anchored instruction, and mentoring that Abrami et al. (2015) found to be most effective in improving critical thinking skills and dispositions.

What is the relationship of critical thinking to problem solving, decision-making, higher-order thinking, creative thinking, and other recognized types of thinking? One’s answer to this question obviously depends on how one defines the terms used in the question. If critical thinking is conceived broadly to cover any careful thinking about any topic for any purpose, then problem solving and decision making will be kinds of critical thinking, if they are done carefully. Historically, ‘critical thinking’ and ‘problem solving’ were two names for the same thing. If critical thinking is conceived more narrowly as consisting solely of appraisal of intellectual products, then it will be disjoint with problem solving and decision making, which are constructive.

Bloom’s taxonomy of educational objectives used the phrase “intellectual abilities and skills” for what had been labeled “critical thinking” by some, “reflective thinking” by Dewey and others, and “problem solving” by still others (Bloom et al. 1956: 38). Thus, the so-called “higher-order thinking skills” at the taxonomy’s top levels of analysis, synthesis and evaluation are just critical thinking skills, although they do not come with general criteria for their assessment (Ennis 1981b). The revised version of Bloom’s taxonomy (Anderson et al. 2001) likewise treats critical thinking as cutting across those types of cognitive process that involve more than remembering (Anderson et al. 2001: 269–270). For details, see the Supplement on History .

As to creative thinking, it overlaps with critical thinking (Bailin 1987, 1988). Thinking about the explanation of some phenomenon or event, as in Ferryboat , requires creative imagination in constructing plausible explanatory hypotheses. Likewise, thinking about a policy question, as in Candidate , requires creativity in coming up with options. Conversely, creativity in any field needs to be balanced by critical appraisal of the draft painting or novel or mathematical theory.

  • Abrami, Philip C., Robert M. Bernard, Eugene Borokhovski, David I. Waddington, C. Anne Wade, and Tonje Person, 2015, “Strategies for Teaching Students to Think Critically: A Meta-analysis”, Review of Educational Research , 85(2): 275–314. doi:10.3102/0034654314551063
  • Aikin, Wilford M., 1942, The Story of the Eight-year Study, with Conclusions and Recommendations , Volume I of Adventure in American Education , New York and London: Harper & Brothers. [ Aikin 1942 available online ]
  • Alston, Kal, 1995, “Begging the Question: Is Critical Thinking Biased?”, Educational Theory , 45(2): 225–233. doi:10.1111/j.1741-5446.1995.00225.x
  • –––, 2001, “Re/Thinking Critical Thinking: The Seductions of Everyday Life”, Studies in Philosophy and Education , 20(1): 27–40. doi:10.1023/A:1005247128053
  • American Educational Research Association, 2014, Standards for Educational and Psychological Testing / American Educational Research Association, American Psychological Association, National Council on Measurement in Education , Washington, DC: American Educational Research Association.
  • Anderson, Lorin W., David R. Krathwohl, Peter W. Airiasian, Kathleen A. Cruikshank, Richard E. Mayer, Paul R. Pintrich, James Raths, and Merlin C. Wittrock, 2001, A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives , New York: Longman, complete edition.
  • Bailin, Sharon, 1987, “Critical and Creative Thinking”, Informal Logic , 9(1): 23–30. [ Bailin 1987 available online ]
  • –––, 1988, Achieving Extraordinary Ends: An Essay on Creativity , Dordrecht: Kluwer. doi:10.1007/978-94-009-2780-3
  • –––, 1995, “Is Critical Thinking Biased? Clarifications and Implications”, Educational Theory , 45(2): 191–197. doi:10.1111/j.1741-5446.1995.00191.x
  • Bailin, Sharon and Mark Battersby, 2009, “Inquiry: A Dialectical Approach to Teaching Critical Thinking”, in Juho Ritola (ed.), Argument Cultures: Proceedings of OSSA 09 , CD-ROM (pp. 1–10), Windsor, ON: OSSA. [ Bailin & Battersby 2009 available online ]
  • –––, 2016a, “Fostering the Virtues of Inquiry”, Topoi , 35(2): 367–374. doi:10.1007/s11245-015-9307-6
  • –––, 2016b, Reason in the Balance: An Inquiry Approach to Critical Thinking , Indianapolis: Hackett, 2nd edition.
  • –––, 2021, “Inquiry: Teaching for Reasoned Judgment”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 31–46. doi: 10.1163/9789004444591_003
  • Bailin, Sharon, Roland Case, Jerrold R. Coombs, and Leroi B. Daniels, 1999a, “Common Misconceptions of Critical Thinking”, Journal of Curriculum Studies , 31(3): 269–283. doi:10.1080/002202799183124
  • –––, 1999b, “Conceptualizing Critical Thinking”, Journal of Curriculum Studies , 31(3): 285–302. doi:10.1080/002202799183133
  • Blair, J. Anthony, 2021, Studies in Critical Thinking , Windsor, ON: Windsor Studies in Argumentation, 2nd edition. [Available online at https://windsor.scholarsportal.info/omp/index.php/wsia/catalog/book/106]
  • Berman, Alan M., Seth J. Schwartz, William M. Kurtines, and Steven L. Berman, 2001, “The Process of Exploration in Identity Formation: The Role of Style and Competence”, Journal of Adolescence , 24(4): 513–528. doi:10.1006/jado.2001.0386
  • Black, Beth (ed.), 2012, An A to Z of Critical Thinking , London: Continuum International Publishing Group.
  • Bloom, Benjamin Samuel, Max D. Engelhart, Edward J. Furst, Walter H. Hill, and David R. Krathwohl, 1956, Taxonomy of Educational Objectives. Handbook I: Cognitive Domain , New York: David McKay.
  • Boardman, Frank, Nancy M. Cavender, and Howard Kahane, 2018, Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life , Boston: Cengage, 13th edition.
  • Browne, M. Neil and Stuart M. Keeley, 2018, Asking the Right Questions: A Guide to Critical Thinking , Hoboken, NJ: Pearson, 12th edition.
  • Center for Assessment & Improvement of Learning, 2017, Critical Thinking Assessment Test , Cookeville, TN: Tennessee Technological University.
  • Cleghorn, Paul, 2021, “Critical Thinking in the Elementary School: Practical Guidance for Building a Culture of Thinking”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 150–167. doi: 10.1163/9789004444591_010
  • Cohen, Jacob, 1988, Statistical Power Analysis for the Behavioral Sciences , Hillsdale, NJ: Lawrence Erlbaum Associates, 2nd edition.
  • College Board, 1983, Academic Preparation for College. What Students Need to Know and Be Able to Do , New York: College Entrance Examination Board, ERIC document ED232517.
  • Commission on the Relation of School and College of the Progressive Education Association, 1943, Thirty Schools Tell Their Story , Volume V of Adventure in American Education , New York and London: Harper & Brothers.
  • Council for Aid to Education, 2017, CLA+ Student Guide . Available at http://cae.org/images/uploads/pdf/CLA_Student_Guide_Institution.pdf ; last accessed 2022 07 16.
  • Dalgleish, Adam, Patrick Girard, and Maree Davies, 2017, “Critical Thinking, Bias and Feminist Philosophy: Building a Better Framework through Collaboration”, Informal Logic , 37(4): 351–369. [ Dalgleish et al. available online ]
  • Dewey, John, 1910, How We Think , Boston: D.C. Heath. [ Dewey 1910 available online ]
  • –––, 1916, Democracy and Education: An Introduction to the Philosophy of Education , New York: Macmillan.
  • –––, 1933, How We Think: A Restatement of the Relation of Reflective Thinking to the Educative Process , Lexington, MA: D.C. Heath.
  • –––, 1936, “The Theory of the Chicago Experiment”, Appendix II of Mayhew & Edwards 1936: 463–477.
  • –––, 1938, Logic: The Theory of Inquiry , New York: Henry Holt and Company.
  • Dominguez, Caroline (coord.), 2018a, A European Collection of the Critical Thinking Skills and Dispositions Needed in Different Professional Fields for the 21st Century , Vila Real, Portugal: UTAD. Available at http://bit.ly/CRITHINKEDUO1 ; last accessed 2022 07 16.
  • ––– (coord.), 2018b, A European Review on Critical Thinking Educational Practices in Higher Education Institutions , Vila Real: UTAD. Available at http://bit.ly/CRITHINKEDUO2 ; last accessed 2022 07 16.
  • ––– (coord.), 2018c, The CRITHINKEDU European Course on Critical Thinking Education for University Teachers: From Conception to Delivery , Vila Real: UTAD. Available at http:/bit.ly/CRITHINKEDU03; last accessed 2022 07 16.
  • Dominguez Caroline and Rita Payan-Carreira (eds.), 2019, Promoting Critical Thinking in European Higher Education Institutions: Towards an Educational Protocol , Vila Real: UTAD. Available at http:/bit.ly/CRITHINKEDU04; last accessed 2022 07 16.
  • Ennis, Robert H., 1958, “An Appraisal of the Watson-Glaser Critical Thinking Appraisal”, The Journal of Educational Research , 52(4): 155–158. doi:10.1080/00220671.1958.10882558
  • –––, 1962, “A Concept of Critical Thinking: A Proposed Basis for Research on the Teaching and Evaluation of Critical Thinking Ability”, Harvard Educational Review , 32(1): 81–111.
  • –––, 1981a, “A Conception of Deductive Logical Competence”, Teaching Philosophy , 4(3/4): 337–385. doi:10.5840/teachphil198143/429
  • –––, 1981b, “Eight Fallacies in Bloom’s Taxonomy”, in C. J. B. Macmillan (ed.), Philosophy of Education 1980: Proceedings of the Thirty-seventh Annual Meeting of the Philosophy of Education Society , Bloomington, IL: Philosophy of Education Society, pp. 269–273.
  • –––, 1984, “Problems in Testing Informal Logic, Critical Thinking, Reasoning Ability”, Informal Logic , 6(1): 3–9. [ Ennis 1984 available online ]
  • –––, 1987, “A Taxonomy of Critical Thinking Dispositions and Abilities”, in Joan Boykoff Baron and Robert J. Sternberg (eds.), Teaching Thinking Skills: Theory and Practice , New York: W. H. Freeman, pp. 9–26.
  • –––, 1989, “Critical Thinking and Subject Specificity: Clarification and Needed Research”, Educational Researcher , 18(3): 4–10. doi:10.3102/0013189X018003004
  • –––, 1991, “Critical Thinking: A Streamlined Conception”, Teaching Philosophy , 14(1): 5–24. doi:10.5840/teachphil19911412
  • –––, 1996, “Critical Thinking Dispositions: Their Nature and Assessability”, Informal Logic , 18(2–3): 165–182. [ Ennis 1996 available online ]
  • –––, 1998, “Is Critical Thinking Culturally Biased?”, Teaching Philosophy , 21(1): 15–33. doi:10.5840/teachphil19982113
  • –––, 2011, “Critical Thinking: Reflection and Perspective Part I”, Inquiry: Critical Thinking across the Disciplines , 26(1): 4–18. doi:10.5840/inquiryctnews20112613
  • –––, 2013, “Critical Thinking across the Curriculum: The Wisdom CTAC Program”, Inquiry: Critical Thinking across the Disciplines , 28(2): 25–45. doi:10.5840/inquiryct20132828
  • –––, 2016, “Definition: A Three-Dimensional Analysis with Bearing on Key Concepts”, in Patrick Bondy and Laura Benacquista (eds.), Argumentation, Objectivity, and Bias: Proceedings of the 11th International Conference of the Ontario Society for the Study of Argumentation (OSSA), 18–21 May 2016 , Windsor, ON: OSSA, pp. 1–19. Available at http://scholar.uwindsor.ca/ossaarchive/OSSA11/papersandcommentaries/105 ; last accessed 2022 07 16.
  • –––, 2018, “Critical Thinking Across the Curriculum: A Vision”, Topoi , 37(1): 165–184. doi:10.1007/s11245-016-9401-4
  • Ennis, Robert H., and Jason Millman, 1971, Manual for Cornell Critical Thinking Test, Level X, and Cornell Critical Thinking Test, Level Z , Urbana, IL: Critical Thinking Project, University of Illinois.
  • Ennis, Robert H., Jason Millman, and Thomas Norbert Tomko, 1985, Cornell Critical Thinking Tests Level X & Level Z: Manual , Pacific Grove, CA: Midwest Publication, 3rd edition.
  • –––, 2005, Cornell Critical Thinking Tests Level X & Level Z: Manual , Seaside, CA: Critical Thinking Company, 5th edition.
  • Ennis, Robert H. and Eric Weir, 1985, The Ennis-Weir Critical Thinking Essay Test: Test, Manual, Criteria, Scoring Sheet: An Instrument for Teaching and Testing , Pacific Grove, CA: Midwest Publications.
  • Facione, Peter A., 1990a, Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction , Research Findings and Recommendations Prepared for the Committee on Pre-College Philosophy of the American Philosophical Association, ERIC Document ED315423.
  • –––, 1990b, California Critical Thinking Skills Test, CCTST – Form A , Millbrae, CA: The California Academic Press.
  • –––, 1990c, The California Critical Thinking Skills Test--College Level. Technical Report #3. Gender, Ethnicity, Major, CT Self-Esteem, and the CCTST , ERIC Document ED326584.
  • –––, 1992, California Critical Thinking Skills Test: CCTST – Form B, Millbrae, CA: The California Academic Press.
  • –––, 2000, “The Disposition Toward Critical Thinking: Its Character, Measurement, and Relationship to Critical Thinking Skill”, Informal Logic , 20(1): 61–84. [ Facione 2000 available online ]
  • Facione, Peter A. and Noreen C. Facione, 1992, CCTDI: A Disposition Inventory , Millbrae, CA: The California Academic Press.
  • Facione, Peter A., Noreen C. Facione, and Carol Ann F. Giancarlo, 2001, California Critical Thinking Disposition Inventory: CCTDI: Inventory Manual , Millbrae, CA: The California Academic Press.
  • Facione, Peter A., Carol A. Sánchez, and Noreen C. Facione, 1994, Are College Students Disposed to Think? , Millbrae, CA: The California Academic Press. ERIC Document ED368311.
  • Fisher, Alec, and Michael Scriven, 1997, Critical Thinking: Its Definition and Assessment , Norwich: Centre for Research in Critical Thinking, University of East Anglia.
  • Freire, Paulo, 1968 [1970], Pedagogia do Oprimido . Translated as Pedagogy of the Oppressed , Myra Bergman Ramos (trans.), New York: Continuum, 1970.
  • Gigerenzer, Gerd, 2001, “The Adaptive Toolbox”, in Gerd Gigerenzer and Reinhard Selten (eds.), Bounded Rationality: The Adaptive Toolbox , Cambridge, MA: MIT Press, pp. 37–50.
  • Glaser, Edward Maynard, 1941, An Experiment in the Development of Critical Thinking , New York: Bureau of Publications, Teachers College, Columbia University.
  • Groarke, Leo A. and Christopher W. Tindale, 2012, Good Reasoning Matters! A Constructive Approach to Critical Thinking , Don Mills, ON: Oxford University Press, 5th edition.
  • Halpern, Diane F., 1998, “Teaching Critical Thinking for Transfer Across Domains: Disposition, Skills, Structure Training, and Metacognitive Monitoring”, American Psychologist , 53(4): 449–455. doi:10.1037/0003-066X.53.4.449
  • –––, 2016, Manual: Halpern Critical Thinking Assessment , Mödling, Austria: Schuhfried. Available at https://pdfcoffee.com/hcta-test-manual-pdf-free.html; last accessed 2022 07 16.
  • Hamby, Benjamin, 2014, The Virtues of Critical Thinkers , Doctoral dissertation, Philosophy, McMaster University. [ Hamby 2014 available online ]
  • –––, 2015, “Willingness to Inquire: The Cardinal Critical Thinking Virtue”, in Martin Davies and Ronald Barnett (eds.), The Palgrave Handbook of Critical Thinking in Higher Education , New York: Palgrave Macmillan, pp. 77–87.
  • Haran, Uriel, Ilana Ritov, and Barbara A. Mellers, 2013, “The Role of Actively Open-minded Thinking in Information Acquisition, Accuracy, and Calibration”, Judgment and Decision Making , 8(3): 188–201.
  • Hatcher, Donald and Kevin Possin, 2021, “Commentary: Thinking Critically about Critical Thinking Assessment”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 298–322. doi: 10.1163/9789004444591_017
  • Haynes, Ada, Elizabeth Lisic, Kevin Harris, Katie Leming, Kyle Shanks, and Barry Stein, 2015, “Using the Critical Thinking Assessment Test (CAT) as a Model for Designing Within-Course Assessments: Changing How Faculty Assess Student Learning”, Inquiry: Critical Thinking Across the Disciplines , 30(3): 38–48. doi:10.5840/inquiryct201530316
  • Haynes, Ada and Barry Stein, 2021, “Observations from a Long-Term Effort to Assess and Improve Critical Thinking”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 231–254. doi: 10.1163/9789004444591_014
  • Hiner, Amanda L., 2021, “Equipping Students for Success in College and Beyond: Placing Critical Thinking Instruction at the Heart of a General Education Program”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 188–208. doi: 10.1163/9789004444591_012
  • Hitchcock, David, 2017, “Critical Thinking as an Educational Ideal”, in his On Reasoning and Argument: Essays in Informal Logic and on Critical Thinking , Dordrecht: Springer, pp. 477–497. doi:10.1007/978-3-319-53562-3_30
  • –––, 2021, “Seven Philosophical Implications of Critical Thinking: Themes, Variations, Implications”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 9–30. doi: 10.1163/9789004444591_002
  • hooks, bell, 1994, Teaching to Transgress: Education as the Practice of Freedom , New York and London: Routledge.
  • –––, 2010, Teaching Critical Thinking: Practical Wisdom , New York and London: Routledge.
  • Johnson, Ralph H., 1992, “The Problem of Defining Critical Thinking”, in Stephen P. Norris (ed.), The Generalizability of Critical Thinking , New York: Teachers College Press, pp. 38–53.
  • Kahane, Howard, 1971, Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life , Belmont, CA: Wadsworth.
  • Kahneman, Daniel, 2011, Thinking, Fast and Slow , New York: Farrar, Straus and Giroux.
  • Kahneman, Daniel, Olivier Sibony, & Cass R. Sunstein, 2021, Noise: A Flaw in Human Judgment , New York: Little, Brown Spark.
  • Kenyon, Tim, and Guillaume Beaulac, 2014, “Critical Thinking Education and Debasing”, Informal Logic , 34(4): 341–363. [ Kenyon & Beaulac 2014 available online ]
  • Krathwohl, David R., Benjamin S. Bloom, and Bertram B. Masia, 1964, Taxonomy of Educational Objectives, Handbook II: Affective Domain , New York: David McKay.
  • Kuhn, Deanna, 1991, The Skills of Argument , New York: Cambridge University Press. doi:10.1017/CBO9780511571350
  • –––, 2019, “Critical Thinking as Discourse”, Human Development, 62 (3): 146–164. doi:10.1159/000500171
  • Lipman, Matthew, 1987, “Critical Thinking–What Can It Be?”, Analytic Teaching , 8(1): 5–12. [ Lipman 1987 available online ]
  • –––, 2003, Thinking in Education , Cambridge: Cambridge University Press, 2nd edition.
  • Loftus, Elizabeth F., 2017, “Eavesdropping on Memory”, Annual Review of Psychology , 68: 1–18. doi:10.1146/annurev-psych-010416-044138
  • Makaiau, Amber Strong, 2021, “The Good Thinker’s Tool Kit: How to Engage Critical Thinking and Reasoning in Secondary Education”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 168–187. doi: 10.1163/9789004444591_011
  • Martin, Jane Roland, 1992, “Critical Thinking for a Humane World”, in Stephen P. Norris (ed.), The Generalizability of Critical Thinking , New York: Teachers College Press, pp. 163–180.
  • Mayhew, Katherine Camp, and Anna Camp Edwards, 1936, The Dewey School: The Laboratory School of the University of Chicago, 1896–1903 , New York: Appleton-Century. [ Mayhew & Edwards 1936 available online ]
  • McPeck, John E., 1981, Critical Thinking and Education , New York: St. Martin’s Press.
  • Moore, Brooke Noel and Richard Parker, 2020, Critical Thinking , New York: McGraw-Hill, 13th edition.
  • Nickerson, Raymond S., 1998, “Confirmation Bias: A Ubiquitous Phenomenon in Many Guises”, Review of General Psychology , 2(2): 175–220. doi:10.1037/1089-2680.2.2.175
  • Nieto, Ana Maria, and Jorge Valenzuela, 2012, “A Study of the Internal Structure of Critical Thinking Dispositions”, Inquiry: Critical Thinking across the Disciplines , 27(1): 31–38. doi:10.5840/inquiryct20122713
  • Norris, Stephen P., 1985, “Controlling for Background Beliefs When Developing Multiple-choice Critical Thinking Tests”, Educational Measurement: Issues and Practice , 7(3): 5–11. doi:10.1111/j.1745-3992.1988.tb00437.x
  • Norris, Stephen P. and Robert H. Ennis, 1989, Evaluating Critical Thinking (The Practitioners’ Guide to Teaching Thinking Series), Pacific Grove, CA: Midwest Publications.
  • Norris, Stephen P. and Ruth Elizabeth King, 1983, Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland.
  • –––, 1984, The Design of a Critical Thinking Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland. ERIC Document ED260083.
  • –––, 1985, Test on Appraising Observations: Manual , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland.
  • –––, 1990a, Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland, 2nd edition.
  • –––, 1990b, Test on Appraising Observations: Manual , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland, 2nd edition.
  • OCR [Oxford, Cambridge and RSA Examinations], 2011, AS/A Level GCE: Critical Thinking – H052, H452 , Cambridge: OCR. Past papers available at https://pastpapers.co/ocr/?dir=A-Level/Critical-Thinking-H052-H452; last accessed 2022 07 16.
  • Ontario Ministry of Education, 2013, The Ontario Curriculum Grades 9 to 12: Social Sciences and Humanities . Available at http://www.edu.gov.on.ca/eng/curriculum/secondary/ssciences9to122013.pdf ; last accessed 2022 07 16.
  • Passmore, John Arthur, 1980, The Philosophy of Teaching , London: Duckworth.
  • Paul, Richard W., 1981, “Teaching Critical Thinking in the ‘Strong’ Sense: A Focus on Self-Deception, World Views, and a Dialectical Mode of Analysis”, Informal Logic , 4(2): 2–7. [ Paul 1981 available online ]
  • –––, 1984, “Critical Thinking: Fundamental to Education for a Free Society”, Educational Leadership , 42(1): 4–14.
  • –––, 1985, “McPeck’s Mistakes”, Informal Logic , 7(1): 35–43. [ Paul 1985 available online ]
  • Paul, Richard W. and Linda Elder, 2006, The Miniature Guide to Critical Thinking: Concepts and Tools , Dillon Beach, CA: Foundation for Critical Thinking, 4th edition.
  • Payette, Patricia, and Edna Ross, 2016, “Making a Campus-Wide Commitment to Critical Thinking: Insights and Promising Practices Utilizing the Paul-Elder Approach at the University of Louisville”, Inquiry: Critical Thinking Across the Disciplines , 31(1): 98–110. doi:10.5840/inquiryct20163118
  • Possin, Kevin, 2008, “A Field Guide to Critical-Thinking Assessment”, Teaching Philosophy , 31(3): 201–228. doi:10.5840/teachphil200831324
  • –––, 2013a, “Some Problems with the Halpern Critical Thinking Assessment (HCTA) Test”, Inquiry: Critical Thinking across the Disciplines , 28(3): 4–12. doi:10.5840/inquiryct201328313
  • –––, 2013b, “A Serious Flaw in the Collegiate Learning Assessment (CLA) Test”, Informal Logic , 33(3): 390–405. [ Possin 2013b available online ]
  • –––, 2013c, “A Fatal Flaw in the Collegiate Learning Assessment Test”, Assessment Update , 25 (1): 8–12.
  • –––, 2014, “Critique of the Watson-Glaser Critical Thinking Appraisal Test: The More You Know, the Lower Your Score”, Informal Logic , 34(4): 393–416. [ Possin 2014 available online ]
  • –––, 2020, “CAT Scan: A Critical Review of the Critical-Thinking Assessment Test”, Informal Logic , 40 (3): 489–508. [Available online at https://informallogic.ca/index.php/informal_logic/article/view/6243]
  • Rawls, John, 1971, A Theory of Justice , Cambridge, MA: Harvard University Press.
  • Rear, David, 2019, “One Size Fits All? The Limitations of Standardised Assessment in Critical Thinking”, Assessment & Evaluation in Higher Education , 44(5): 664–675. doi: 10.1080/02602938.2018.1526255
  • Rousseau, Jean-Jacques, 1762, Émile , Amsterdam: Jean Néaulme.
  • Scheffler, Israel, 1960, The Language of Education , Springfield, IL: Charles C. Thomas.
  • Scriven, Michael, and Richard W. Paul, 1987, Defining Critical Thinking , Draft statement written for the National Council for Excellence in Critical Thinking Instruction. Available at http://www.criticalthinking.org/pages/defining-critical-thinking/766 ; last accessed 2022 07 16.
  • Sheffield, Clarence Burton Jr., 2018, “Promoting Critical Thinking in Higher Education: My Experiences as the Inaugural Eugene H. Fram Chair in Applied Critical Thinking at Rochester Institute of Technology”, Topoi , 37(1): 155–163. doi:10.1007/s11245-016-9392-1
  • Siegel, Harvey, 1985, “McPeck, Informal Logic and the Nature of Critical Thinking”, in David Nyberg (ed.), Philosophy of Education 1985: Proceedings of the Forty-First Annual Meeting of the Philosophy of Education Society , Normal, IL: Philosophy of Education Society, pp. 61–72.
  • –––, 1988, Educating Reason: Rationality, Critical Thinking, and Education , New York: Routledge.
  • –––, 1999, “What (Good) Are Thinking Dispositions?”, Educational Theory , 49(2): 207–221. doi:10.1111/j.1741-5446.1999.00207.x
  • Simon, Herbert A., 1956, “Rational Choice and the Structure of the Environment”, Psychological Review , 63(2): 129–138. doi: 10.1037/h0042769
  • Simpson, Elizabeth, 1966–67, “The Classification of Educational Objectives: Psychomotor Domain”, Illinois Teacher of Home Economics , 10(4): 110–144, ERIC document ED0103613. [ Simpson 1966–67 available online ]
  • Skolverket, 2018, Curriculum for the Compulsory School, Preschool Class and School-age Educare , Stockholm: Skolverket, revised 2018. Available at https://www.skolverket.se/download/18.31c292d516e7445866a218f/1576654682907/pdf3984.pdf; last accessed 2022 07 15.
  • Smith, B. Othanel, 1953, “The Improvement of Critical Thinking”, Progressive Education , 30(5): 129–134.
  • Smith, Eugene Randolph, Ralph Winfred Tyler, and the Evaluation Staff, 1942, Appraising and Recording Student Progress , Volume III of Adventure in American Education , New York and London: Harper & Brothers.
  • Splitter, Laurance J., 1987, “Educational Reform through Philosophy for Children”, Thinking: The Journal of Philosophy for Children , 7(2): 32–39. doi:10.5840/thinking1987729
  • Stanovich, Keith E., and Paula J. Stanovich, 2010, “A Framework for Critical Thinking, Rational Thinking, and Intelligence”, in David D. Preiss and Robert J. Sternberg (eds), Innovations in Educational Psychology: Perspectives on Learning, Teaching and Human Development , New York: Springer Publishing, pp. 195–237.
  • Stanovich, Keith E., Richard F. West, and Maggie E. Toplak, 2011, “Intelligence and Rationality”, in Robert J. Sternberg and Scott Barry Kaufman (eds.), Cambridge Handbook of Intelligence , Cambridge: Cambridge University Press, 3rd edition, pp. 784–826. doi:10.1017/CBO9780511977244.040
  • Tankersley, Karen, 2005, Literacy Strategies for Grades 4–12: Reinforcing the Threads of Reading , Alexandria, VA: Association for Supervision and Curriculum Development.
  • Thayer-Bacon, Barbara J., 1992, “Is Modern Critical Thinking Theory Sexist?”, Inquiry: Critical Thinking Across the Disciplines , 10(1): 3–7. doi:10.5840/inquiryctnews199210123
  • –––, 1993, “Caring and Its Relationship to Critical Thinking”, Educational Theory , 43(3): 323–340. doi:10.1111/j.1741-5446.1993.00323.x
  • –––, 1995a, “Constructive Thinking: Personal Voice”, Journal of Thought , 30(1): 55–70.
  • –––, 1995b, “Doubting and Believing: Both are Important for Critical Thinking”, Inquiry: Critical Thinking across the Disciplines , 15(2): 59–66. doi:10.5840/inquiryctnews199515226
  • –––, 2000, Transforming Critical Thinking: Thinking Constructively , New York: Teachers College Press.
  • Toulmin, Stephen Edelston, 1958, The Uses of Argument , Cambridge: Cambridge University Press.
  • Turri, John, Mark Alfano, and John Greco, 2017, “Virtue Epistemology”, in Edward N. Zalta (ed.), The Stanford Encyclopedia of Philosophy (Winter 2017 Edition). URL = < https://plato.stanford.edu/archives/win2017/entries/epistemology-virtue/ >
  • Vincent-Lancrin, Stéphan, Carlos González-Sancho, Mathias Bouckaert, Federico de Luca, Meritxell Fernández-Barrerra, Gwénaël Jacotin, Joaquin Urgel, and Quentin Vidal, 2019, Fostering Students’ Creativity and Critical Thinking: What It Means in School. Educational Research and Innovation , Paris: OECD Publishing.
  • Warren, Karen J., 1988, “Critical Thinking and Feminism”, Informal Logic , 10(1): 31–44. [ Warren 1988 available online ]
  • Watson, Goodwin, and Edward M. Glaser, 1980a, Watson-Glaser Critical Thinking Appraisal, Form A , San Antonio, TX: Psychological Corporation.
  • –––, 1980b, Watson-Glaser Critical Thinking Appraisal: Forms A and B; Manual , San Antonio, TX: Psychological Corporation.
  • –––, 1994, Watson-Glaser Critical Thinking Appraisal, Form B , San Antonio, TX: Psychological Corporation.
  • Weinstein, Mark, 1990, “Towards a Research Agenda for Informal Logic and Critical Thinking”, Informal Logic , 12(3): 121–143. [ Weinstein 1990 available online ]
  • –––, 2013, Logic, Truth and Inquiry , London: College Publications.
  • Willingham, Daniel T., 2019, “How to Teach Critical Thinking”, Education: Future Frontiers , 1: 1–17. [Available online at https://prod65.education.nsw.gov.au/content/dam/main-education/teaching-and-learning/education-for-a-changing-world/media/documents/How-to-teach-critical-thinking-Willingham.pdf.]
  • Zagzebski, Linda Trinkaus, 1996, Virtues of the Mind: An Inquiry into the Nature of Virtue and the Ethical Foundations of Knowledge , Cambridge: Cambridge University Press. doi:10.1017/CBO9781139174763

Bias on the Brain: A Yale Psychologist Examines Common ‘Thinking Problems’

The sometimes counterintuitive ways that our brains work can raise big questions. Why is it that we procrastinate, dragging our feet when we know we will regret it later? Why are misunderstandings and miscommunications so common? And why do people often turn a blind eye to evidence that contradicts their beliefs?

For Yale’s Woo-kyoung Ahn, the John Hay Whitney Professor of Psychology, a better understanding of these kinds of questions is crucial.

In her new book, “ Thinking 101: How to Reason Better to Live Better ,” Ahn explores the ins and outs of so-called “reasoning fallacies” — or, as she describes them, “thinking problems” —  and how they affect our lives, from influencing how we view time to why we stereotype and profile other people. Her work examines the inherent complexities in the science of how we think and ultimately promotes solutions to overcome these reasoning fallacies.

The book is based on lessons Ahn teaches in her undergraduate course “Thinking,” which has become one of Yale’s most popular courses.

In an interview with Yale News, Ahn discusses the nuances of these “thinking problems” and the steps we should take to reduce biases. The interview has been edited and condensed.

The book is based on your popular course, “Thinking.” What inspired you to create the class itself — and to condense a semester-long course into a book?

Woo-kyoung Ahn: I have been teaching for over 30 years and have always covered some of these materials in various courses; I was also teaching a seminar and an upper-level psych course on thinking. But in 2016, I felt that it was time to disseminate this content more broadly for those who are not majoring in psychology.

The inspiration is this: it’s now quite well known that people commit a variety of reasoning fallacies, but they have been mostly discussed in the context of behavioral economics, or, to put it differently, psychology in the economic context. For example, people talked about the “negativity bias” — the bias to overweigh negative information compared to positive [information] — in terms of loss aversion in transaction or endowment effects. But I believe these irrational behaviors influence us in other situations in our everyday life. And I wanted to teach that rationality matters not just in dealing with money [and other material things] but much more broadly.

Another inspiration is that there has not been much discussion about what to do next once we notice that we commit thinking biases. Merely recognizing them isn’t enough; one can recognize that they have insomnia, but that alone isn’t enough to cure it. So, I’ve provided as many actionable strategies as possible.

Which concepts did you choose to highlight in the book – and how did you choose to focus on them?

Ahn: In the course I cover many topics, like creativity, moral judgments, effects of language on thoughts, among many others — which I might, or might not, cover in a book at another time. But to be honest, I really have no good answer as to why these were chosen; I could have written a completely different book with the same title! One thing that I tried was not to cover too much in a single book. There are now websites with titles like “61 Cognitive Biases.” I just don’t want to overwhelm the readers. I also cover in the course more technical issues, [like] models of causal learning or mathematical proofs for irrational choices, which I don’t think were needed for a book written for the general public.

One valuable insight that became a running theme for the book is that these thinking “problems” aren’t really about what is wrong with us. I’ve mentioned this issue in my course at times, but I became more convinced about it while writing the book.

How would you define or diagnose these “thinking problems”?

Ahn: The kind of bad thinking that I care most about is unfair thinking. We can be unfair to ourselves and also to others when we are inconsistent, biased, overconfident, or underconfident.

For example, I don’t think the well-known confirmation bias is necessarily a thinking problem. 

It refers to a tendency to confirm what we hypothesize or what we believe. That is, we tend to search for evidence that supports what we believe or interpret evidence to fit with what we know. That sounds pretty bad, but it is actually a quite adaptive mechanism. For instance, you go to a Stop & Shop three miles away from your house, and you find that their apples are good. Next time when you want apples, you could try a different supermarket, but if your goal is to just get good apples, you might as well go to the same supermarket again. That is, when our goal is to survive, it’s a better bet to continue with what you know rather than to try to explore other possibilities.

The confirmation bias becomes a thinking problem when it makes us draw conclusions that are unfair to ourselves or others. For example, let’s say a CEO of a company hires only white people for their top executive positions. They all do a reasonably good job, so now the CEO believes that race matters and continues hiring only white people.  However, this CEO hasn’t even checked what would have happened if non-white people had been hired for those positions.

The confirmation bias can even hurt those who commit it. An experiment I conducted illustrates this. In the experiment, participants carried out a saliva test. Half the participants were told that the test results indicated that they have elevated genetic risks for major depressive disorder, and the other half were told that they don’t have the risks. Then, we asked all participants about the symptoms of depression they experienced in the past two weeks. These participants were randomly assigned to one of the two conditions, so there’s no reason why one group would have been more depressed for the last two weeks than the other.

But those who learned that they have genetic risks for major depression reported that they were significantly more depressed, and their average score was above the threshold for what is clinically considered mild depression.

One recurring theme of the book is how we might better communicate ideas, especially when there is evidence that using metrics — like data, statistics, et cetera — may be ineffective. How do you think public messaging campaigns can best incorporate some of these ideas and concepts?

Ahn: I think public messaging campaigns should acknowledge how we’re wired. For instance, when a charity organization asks for donations, they shouldn’t use just statistics or abstract data, but also an anecdote of a person who suffers from the issues that they are concerned about. In my recent study, we presented highly disgusting pictures of COVID-19 to the participants: pictures of COVID toes, burials of those who died of COVID, et cetera. As we know, politically conservative people were generally less willing to comply with the [Centers for Disease Control and Prevention] guidelines than politically liberal people, at least at the time of the study. However, when presented with these vivid examples of those who suffer from COVID-19, they became much more willing to comply with the guidelines.

You seem to be a big proponent of promoting greater dialogue between people as a way to combat many of these cognitive biases we have.

Ahn: Again, I tried to underscore that those who commit cognitive biases are not bad people; these errors are part of our highly adaptive cognitive mechanisms. Let’s return to confirmation bias again to illustrate this one more time. My favorite example is what happened when my son was four years old. He asked me why a yellow traffic light is called a “yellow” light. I didn’t understand the question but was patient enough to tell him that it’s called a yellow light because it’s yellow. Then he told me it’s [not yellow, but] orange. I said no way. He insisted that I just look at it, so I did. And it is orange.

I had no ulterior motives to call it a yellow light or to interpret the amber color as being yellow, but because everybody called it a yellow light, I saw it as being yellow all my life until he pointed it out. But what I committed was confirmation bias; I interpreted the color of a traffic light in light of what I already believed. We do this every moment in our lives. Thus, we shouldn’t think that those who disagree with our views are different kinds of people; they are just seeing the world from their own point of view.

Of course, there are people with ulterior motives and self-righteous people, but if we are to start a dialogue, we should first recognize [the biases] we all share.

Also, because these biases are essential components of how we survive, they are not easy to counteract. So, sometimes, I do present actionable strategies — don’t guess what others might like, ask! — but in some cases, we may just need to try to focus on solving problems at hand rather than trying to change others.


Warren Berger

A Crash Course in Critical Thinking

What you need to know—and read—about one of the essential skills needed today.

Posted April 8, 2024 | Reviewed by Michelle Quirk

  • In research for "A More Beautiful Question," I did a deep dive into the current crisis in critical thinking.
  • Many people may think of themselves as critical thinkers, but they actually are not.
  • Here is a series of questions you can ask yourself to try to ensure that you are thinking critically.

Conspiracy theories. Inability to distinguish facts from falsehoods. Widespread confusion about who and what to believe.

These are some of the hallmarks of the current crisis in critical thinking—which just might be the issue of our times. Because if people aren’t willing or able to think critically as they choose potential leaders, they’re apt to choose bad ones. And if they can’t judge whether the information they’re receiving is sound, they may follow faulty advice while ignoring recommendations that are science-based and solid (and perhaps life-saving).

Moreover, as a society, if we can’t think critically about the many serious challenges we face, it becomes more difficult to agree on what those challenges are—much less solve them.

On a personal level, critical thinking can enable you to make better everyday decisions. It can help you make sense of an increasingly complex and confusing world.

In the new expanded edition of my book A More Beautiful Question ( AMBQ ), I took a deep dive into critical thinking. Here are a few key things I learned.

First off, before you can get better at critical thinking, you should understand what it is. It’s not just about being a skeptic. When thinking critically, we are thoughtfully reasoning, evaluating, and making decisions based on evidence and logic. And—perhaps most important—while doing this, a critical thinker always strives to be open-minded and fair-minded. That’s not easy: It demands that you constantly question your assumptions and biases and that you always remain open to considering opposing views.

In today’s polarized environment, many people think of themselves as critical thinkers simply because they ask skeptical questions—often directed at, say, certain government policies or ideas espoused by those on the “other side” of the political divide. The problem is, they may not be asking these questions with an open mind or a willingness to fairly consider opposing views.

When people do this, they’re engaging in “weak-sense critical thinking”—a term popularized by the late Richard Paul, a co-founder of The Foundation for Critical Thinking. “Weak-sense critical thinking” means applying the tools and practices of critical thinking—questioning, investigating, evaluating—but with the sole purpose of confirming one’s own bias or serving an agenda.

In AMBQ , I lay out a series of questions you can ask yourself to try to ensure that you’re thinking critically. Here are some of the questions to consider:

  • Why do I believe what I believe?
  • Are my views based on evidence?
  • Have I fairly and thoughtfully considered differing viewpoints?
  • Am I truly open to changing my mind?

Of course, becoming a better critical thinker is not as simple as just asking yourself a few questions. Critical thinking is a habit of mind that must be developed and strengthened over time. In effect, you must train yourself to think in a manner that is more effortful, aware, grounded, and balanced.

For those interested in giving themselves a crash course in critical thinking—something I did myself, as I was working on my book—I thought it might be helpful to share a list of some of the books that have shaped my own thinking on this subject. As a self-interested author, I naturally would suggest that you start with the new 10th-anniversary edition of A More Beautiful Question , but beyond that, here are the top eight critical-thinking books I’d recommend.

The Demon-Haunted World: Science as a Candle in the Dark , by Carl Sagan

This book simply must top the list, because the late scientist and author Carl Sagan continues to be such a bright shining light in the critical thinking universe. Chapter 12 includes the details on Sagan’s famous “baloney detection kit,” a collection of lessons and tips on how to deal with bogus arguments and logical fallacies.

Clear Thinking: Turning Ordinary Moments Into Extraordinary Results , by Shane Parrish

The creator of the Farnam Street website and host of the “Knowledge Project” podcast explains how to contend with biases and unconscious reactions so you can make better everyday decisions. It contains insights from many of the brilliant thinkers Shane has studied.

Good Thinking: Why Flawed Logic Puts Us All at Risk and How Critical Thinking Can Save the World , by David Robert Grimes

A brilliant, comprehensive 2021 book on critical thinking that, to my mind, hasn’t received nearly enough attention. The scientist Grimes dissects bad thinking, shows why it persists, and offers the tools to defeat it.

Think Again: The Power of Knowing What You Don't Know , by Adam Grant

Intellectual humility—being willing to admit that you might be wrong—is what this book is primarily about. But Adam, the renowned Wharton psychology professor and bestselling author, takes the reader on a mind-opening journey with colorful stories and characters.

Think Like a Detective: A Kid's Guide to Critical Thinking , by David Pakman

The popular YouTuber and podcast host Pakman—normally known for talking politics—has written a terrific primer on critical thinking for children. The illustrated book presents critical thinking as a “superpower” that enables kids to unlock mysteries and dig for truth. (I also recommend Pakman’s second kids’ book called Think Like a Scientist.)

Rationality: What It Is, Why It Seems Scarce, Why It Matters , by Steven Pinker

The Harvard psychology professor Pinker tackles conspiracy theories head-on but also explores concepts involving risk/reward, probability and randomness, and correlation/causation. And if that strikes you as daunting, be assured that Pinker makes it lively and accessible.

How Minds Change: The Surprising Science of Belief, Opinion and Persuasion , by David McRaney

David is a science writer who hosts the popular podcast “You Are Not So Smart” (and his ideas are featured in A More Beautiful Question ). His well-written book looks at ways you can actually get through to people who see the world very differently than you (hint: bludgeoning them with facts definitely won’t work).

A Healthy Democracy's Best Hope: Building the Critical Thinking Habit , by M. Neil Browne and Chelsea Kulhanek

Neil Browne, author of the seminal Asking the Right Questions: A Guide to Critical Thinking, has been a pioneer in presenting critical thinking as a question-based approach to making sense of the world around us. His newest book, co-authored with Chelsea Kulhanek, breaks down critical thinking into “11 explosive questions”—including the “priors question” (which challenges us to question assumptions), the “evidence question” (focusing on how to evaluate and weigh evidence), and the “humility question” (which reminds us that a critical thinker must be humble enough to consider the possibility of being wrong).

Warren Berger is a longtime journalist and author of A More Beautiful Question .

13 Types of Common Cognitive Biases That Might Be Impairing Your Judgment

Which of these sway your thinking the most?

Kendra Cherry, MS, is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."


The biases covered below include:

  • The confirmation bias
  • The hindsight bias
  • The anchoring bias
  • The misinformation effect
  • The actor-observer bias
  • The false consensus effect
  • The halo effect
  • The self-serving bias
  • The availability heuristic
  • The optimism bias
  • Other kinds of cognitive bias

Although we like to believe that we're rational and logical, the fact is that we are continually under the influence of cognitive biases. These biases distort thinking, influence beliefs, and sway the decisions and judgments that people make each and every day.

Sometimes, cognitive biases are fairly obvious. You might even find that you recognize these tendencies in yourself or others. In other cases, these biases are so subtle that they are almost impossible to notice.

At a Glance

Attention is a limited resource. This means we can't possibly evaluate every possible detail and event when forming thoughts and opinions. Because of this, we often rely on mental shortcuts that speed up our ability to make judgments, but this can sometimes lead to bias. There are many types of biases—including the confirmation bias, the hindsight bias, and the anchoring bias, just to name a few—that can influence our beliefs and actions daily.

The following are just a few types of cognitive biases that have a powerful influence on how you think, how you feel, and how you behave.


The confirmation bias is the tendency to listen more often to information that confirms our existing beliefs. Through this bias, people tend to favor information that reinforces the things they already think or believe.

Examples include:

  • Only paying attention to information that confirms your beliefs about issues such as gun control and global warming
  • Only following people on social media who share your viewpoints
  • Choosing news sources that present stories that support your views
  • Refusing to listen to the opposing side
  • Not considering all of the facts in a logical and rational manner

There are a few reasons why this happens. One is that seeking only to confirm existing opinions conserves the mental resources we would otherwise need to spend on a decision. It also helps protect self-esteem by making people feel that their beliefs are accurate.

People on two sides of an issue can listen to the same story and walk away with different interpretations that they feel validate their existing point of view. This often indicates that the confirmation bias is working to "bias" their opinions.

The problem with this is that it can lead to poor choices, an inability to listen to opposing views, or even contribute to othering people who hold different opinions.

Things that we can do to help reduce the impact of confirmation bias include being open to hearing others' opinions, actively seeking out and researching opposing views, reading full articles (and not just headlines), questioning the source, and doing the research yourself to see whether it is reliable.

The hindsight bias is a common cognitive bias that involves the tendency to see events, even random ones, as more predictable than they are. It's also commonly referred to as the "I knew it all along" phenomenon.

Some examples of the hindsight bias include:

  • Insisting that you knew who was going to win a football game once the event is over
  • Believing that you knew all along that one political candidate was going to win an election
  • Saying that you knew you weren't going to win after losing a coin flip with a friend
  • Looking back on an exam and thinking that you knew the answers to the questions you missed
  • Believing you could have predicted which stocks would become profitable

Classic Research

In one classic psychology experiment, college students were asked to predict whether they thought then-nominee Clarence Thomas would be confirmed to the U.S. Supreme Court.

Prior to the Senate vote, 58% of the students thought Thomas would be confirmed. The students were polled again following Thomas's confirmation, and a whopping 78% of students said they had believed Thomas would be confirmed.  

The hindsight bias occurs for a combination of reasons, including our ability to "misremember" previous predictions, our tendency to view events as inevitable, and our tendency to believe we could have foreseen certain events.

The effect of this bias is that it causes us to overestimate our ability to predict events. This can sometimes lead people to take unwise risks.

The anchoring bias is the tendency to be overly influenced by the first piece of information that we hear. Some examples of how this works:

  • The first number voiced during a price negotiation typically becomes the anchoring point on which all further negotiations are based.
  • Hearing a random number can influence estimates on completely unrelated topics.
  • Doctors can become susceptible to the anchoring bias when diagnosing patients. The physician’s first impressions of the patient often create an anchoring point that can sometimes incorrectly influence all subsequent diagnostic assessments.

While the existence of the anchoring bias is well documented, its causes are still not fully understood. Some research suggests that the source of the anchor information may play a role. Other factors such as priming and mood also appear to have an influence.

Like other cognitive biases, anchoring can have an effect on the decisions you make each day. For instance, it can influence how much you are willing to pay for your home. However, it can sometimes lead to poor choices and make it more difficult for people to consider other factors that might also be important.

The misinformation effect is the tendency for memories to be heavily influenced by things that happened after the actual event itself. A person who witnesses a car accident or crime might believe that their recollection is crystal clear, but researchers have found that memory is surprisingly susceptible to even very subtle influences.

For example:

  • Research has shown that simply asking questions about an event can change someone's memories of what happened.
  • Watching television coverage may change how people remember the event.
  • Hearing other people talk about a memory from their perspective may change your memory of what transpired.

Classic Memory Research

In one classic experiment by memory expert Elizabeth Loftus , people who watched a video of a car crash were then asked one of two slightly different questions: “How fast were the cars going when they hit each other?” or “How fast were the cars going when they smashed into each other?”  

When the witnesses were then questioned a week later whether they had seen any broken glass, those who had been asked the “smashed into” version of the question were more likely to report incorrectly that they had seen broken glass.

There are a few factors that may play a role in this phenomenon. New information may get blended with older memories.   In other cases, new information may be used to fill in "gaps" in memory.

The effects of misinformation can range from the trivial to much more serious. It might cause you to misremember something you thought happened at work, or it might lead to someone incorrectly identifying the wrong suspect in a criminal case.

The actor-observer bias is the tendency to attribute our actions to external influences and other people's actions to internal ones. The way we perceive others and how we attribute their actions hinges on a variety of variables, but it can be heavily influenced by whether we are the actor or the observer in a situation.

When it comes to our own actions, we are often far too likely to attribute things to external influences. For example:

  • You might complain that you botched an important meeting because you had jet lag.
  • You might say you failed an exam because the teacher posed too many trick questions.

When it comes to explaining other people’s actions, however, we are far more likely to attribute their behaviors to internal causes. For example:

  • A colleague screwed up an important presentation because he’s lazy and incompetent (not because he also had jet lag).
  • A fellow student bombed a test because they lack diligence and intelligence (and not because they took the same test as you with all those trick questions).

While there are many factors that may play a role, perspective plays a key role. When we are the actors in a situation, we are able to observe our own thoughts and behaviors. When it comes to other people, however, we cannot see what they are thinking. This means we focus on situational forces for ourselves, but guess at the internal characteristics that cause other people's actions.

The problem with this is that it often leads to misunderstandings. Each side of a situation is essentially blaming the other side rather than thinking about all of the variables that might be playing a role.

The false consensus effect is the tendency people have to overestimate how much other people agree with their own beliefs, behaviors, attitudes, and values. For example:

  • Thinking that other people share your opinion on controversial topics
  • Overestimating the number of people who are similar to you
  • Believing that the majority of people share your preferences

Researchers believe that the false consensus effect happens for a variety of reasons. First, the people we spend the most time with, our family and friends, do often tend to share very similar opinions and beliefs. Because of this, we start to think that this way of thinking is the majority opinion even when we are with people who are not among our group of family and friends.

Another key reason this cognitive bias trips us up so easily is that believing that other people are just like us is good for our self-esteem. It allows us to feel "normal" and maintain a positive view of ourselves in relation to other people.

This can lead people not only to incorrectly think that everyone else agrees with them—it can sometimes lead them to overvalue their own opinions. It also means that we sometimes don't consider how other people might feel when making choices.

The halo effect is the tendency for an initial impression of a person to influence what we think of them overall. Also known as the "physical attractiveness stereotype" or the "what is beautiful is good" principle, the halo effect sways our judgments of others, and others' judgments of us, almost every day. For example:

  • Thinking people who are good-looking are also smarter, kinder, and funnier than less attractive people
  • Believing that products marketed by attractive people are also more valuable
  • Thinking that a political candidate who is confident must also be intelligent and competent

One factor that may influence the halo effect is our tendency to want to be correct. If our initial impression of someone was positive, we want to look for proof that our assessment was accurate. It also helps people avoid experiencing cognitive dissonance, which involves holding contradictory beliefs.

This cognitive bias can have a powerful impact in the real world. For example, job applicants perceived as attractive and likable are also more likely to be viewed as competent, smart, and qualified for the job.

The self-serving bias is the tendency for people to give themselves credit for successes but lay the blame for failures on outside causes. When you do well on a project, you probably assume that it’s because you worked hard. But when things turn out badly, you are more likely to blame it on circumstances or bad luck.

Some examples of this:

  • Attributing good grades to being smart or studying hard
  • Believing your athletic performance is due to practice and hard work
  • Thinking you got the job because of your merits

The self-serving bias can be influenced by a variety of factors. Age and sex have been shown to play a part. Older people are more likely to take credit for their successes, while men are more likely to pin their failures on outside forces.  

This bias does serve an important role in protecting self-esteem. However, it can often also lead to faulty attributions such as blaming others for our own shortcomings.

The availability heuristic is the tendency to estimate the probability of something happening based on how many examples readily come to mind. Some examples of this:

  • After seeing several news reports of car thefts in your neighborhood, you might start to believe that such crimes are more common than they are.
  • You might believe that plane crashes are more common than they really are because you can easily think of several examples.

It is essentially a mental shortcut designed to save us time when we are trying to determine risk. The problem with relying on this way of thinking is that it often leads to poor estimates and bad decisions.

Smokers who have never known someone to die of a smoking-related illness, for example, might underestimate the health risks of smoking. In contrast, if you have two sisters and five neighbors who have had breast cancer, you might believe it is even more common than statistics suggest.

The optimism bias is a tendency to overestimate the likelihood that good things will happen to us while underestimating the probability that negative events will impact our lives. Essentially, we tend to be too optimistic for our own good.

For example, we may assume that negative events such as illness or accidents simply won't happen to us.

The optimism bias has roots in the availability heuristic. Because you can probably think of examples of bad things happening to other people it seems more likely that others will be affected by negative events.

This bias can lead people to take health risks like smoking, eating poorly, or not wearing a seat belt. The bad news is that research has found that this optimism bias is incredibly difficult to reduce.

There is good news, however. This tendency toward optimism helps create a sense of anticipation for the future, giving people the hope and motivation they need to pursue their goals.

Other Kinds of Cognitive Bias

Many other cognitive biases can distort how we perceive the world. Just a partial list:

  • Status quo bias reflects a desire to keep things as they are.
  • Apophenia is the tendency to perceive patterns in random occurrences.
  • Framing is presenting a situation in a way that gives a certain impression.

Keep in Mind

The cognitive biases above are common, but this is only a sampling of the many biases that can affect your thinking. Collectively, these biases influence much of our thinking and, ultimately, our decision making.

Many of these biases are inevitable. We simply don't have the time to evaluate every thought and every decision for the presence of bias. Understanding these biases, however, helps us learn how they can lead us to poor decisions in life.

Dietrich D, Olson M. A demonstration of hindsight bias using the Thomas confirmation vote. Psychol Rep. 1993;72(2):377-378. doi:10.2466/pr0.1993.72.2.377

Lee KK. An indirect debiasing method: Priming a target attribute reduces judgmental biases in likelihood estimations. PLoS ONE. 2019;14(3):e0212609. doi:10.1371/journal.pone.0212609

Saposnik G, Redelmeier D, Ruff CC, Tobler PN. Cognitive biases associated with medical decisions: A systematic review. BMC Med Inform Decis Mak. 2016;16(1):138. doi:10.1186/s12911-016-0377-1

Furnham A, Boo HC. A literature review of anchoring bias. The Journal of Socio-Economics. 2011;40(1):35-42. doi:10.1016/j.socec.2010.10.008

Loftus EF. Leading questions and the eyewitness report. Cognitive Psychology. 1975;7(4):560-572. doi:10.1016/0010-0285(75)90023-7

Challies DM, Hunt M, Garry M, Harper DN. Whatever gave you that idea? False memories following equivalence training: A behavioral account of the misinformation effect. J Exp Anal Behav. 2011;96(3):343-362. doi:10.1901/jeab.2011.96-343

Miyamoto R, Kikuchi Y. Gender differences of brain activity in the conflicts based on implicit self-esteem. PLoS ONE. 2012;7(5):e37901. doi:10.1371/journal.pone.0037901

Weinstein ND, Klein WM. Resistance of personal risk perceptions to debiasing interventions. Health Psychol. 1995;14(2):132-140. doi:10.1037/0278-6133.14.2.132

Gratton G, Cooper P, Fabiani M, Carter CS, Karayanidis F. Dynamics of cognitive control: Theoretical bases, paradigms, and a view for the future. Psychophysiology. 2018;55(3). doi:10.1111/psyp.13016

By Kendra Cherry, MSEd. Kendra Cherry is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."


Social Sci LibreTexts

7.4: Critical Thinking


Questions to consider:

  • How can determining the situation help you think critically?
  • How do you present informed, unbiased thinking?
  • What is the difference between factual arguments and opinions?

Critical thinking has become a buzz phrase in education and corporate environments in recent years. The definitions vary slightly, but most agree that thinking critically includes some form of judgement that thinkers generate after careful analysis of the perspectives, opinions, or experimental results present for a particular problem or situation. Before you wonder if you’re even capable of critical thinking, consider that you think critically every day. When you grab an unwashed T-shirt off the top of the pile on the floor of your bedroom to wear into class but then suddenly remember that you may see the person of your dreams on that route, you may change into something a bit less disheveled. That’s thinking critically—you used data (the memory that your potential soul mate walks the same route you use on that day on campus) to change a sartorial decision (dirty shirt for clean shirt), and you will validate your thinking if and when you do have a successful encounter with said soul mate.

Likewise, when you decide to make your lunch rather than just grabbing a bag of chips, you're thinking critically. You have to plan ahead, buy the food, possibly prepare it, and arrange to carry the lunch with you, and you may have various reasons for doing that: making healthier eating choices, saving money for an upcoming trip, or wanting more quiet time to unwind instead of waiting in a crowded lunch line. You are constantly weighing options, consulting data, gathering opinions, making choices, and then evaluating those decisions, which is a general definition of critical thinking.

Consider the following situations and how each one demands your thinking attention. Which do you find most demanding of critical thinking? Why?

  • Participating in competitive athletic events
  • Watching competitive athletic events
  • Reading a novel for pleasure
  • Reading a textbook passage in science

Critical thinking forces you to determine the actual situation under question and to determine your thoughts and actions around that situation.

Determining the Problem

One component to keep in mind to guide your critical thinking is to determine the situation. What problem are you solving? When problems become complex and multifaceted, it is easy to be distracted by the simple parts that may not need as much thinking to resolve but also may not contribute as much to the ultimate problem resolution. What aspect of the situation truly needs your attention and your critical thinking?

Imagine you’re planning a fantasy vacation as a group assignment in a class you’re taking where each person is allowed only $200. The group doles out specific preliminary tasks to each member to decide where to go, what sort of trip to take, and how to keep costs low, all in the name of a fun fantasy vacation. In this scenario, whose plan demonstrates the most effective critical thinking?

  • DeRhonda creates an elaborate invitation for a dinner party she’ll coordinate at an exclusive mountain cabin.
  • Patrick researches cruises, cabin rentals, and staycation options, considering costs for various trip lengths.
  • Rodrigio puts down a deposit for a private dining room for 25 at an expensive local restaurant for a date six weeks from the end of the semester.

Write out what each person’s thinking reflects about their expectations for this trip and why their actions may or may not help the group at this stage of the planning.

Critical thinking differs according to the subject you’re thinking about, and as such it can be difficult to pin down any sort of formula to make sure you are doing a good job of thinking critically in all situations. While you may need to adapt this list of critical thinking components, you can get started if you do the following:

  • Question everything
  • Conduct legitimate research
  • Limit your assumptions
  • Recognize your own biases
  • Gather and weigh all options

Additionally, you must recognize that changes will occur and may alter your conclusions now and in the future. You may eventually have to revisit an issue you effectively resolved previously and adapt to changing conditions. Knowing when to do that is another example of critical thinking. Informed flexibility, or knowing that parts of the plan may need to change and how those changes can work into the overall goal, is also a recognized element of thinking critically.

For example, early in the 20th century, many people considered cigarette smoking a relaxing social pastime that didn’t have many negative consequences. Some people may still consider smoking a way to relax; however, years of medical research have proven with mounting evidence that smoking causes cancer and exacerbates numerous other medical conditions. Researchers asked questions about the impact of smoking on people’s overall health, conducted regulated experiments, tracked smokers’ reactions, and concluded that smoking did impact health. Over time, attitudes, evidence, and opinions change, and as a critical thinker, you must continue to research, synthesize newly discovered evidence, and adapt to that new information.


Defending against Bias

Once you have all your information gathered and you have checked your sources for currency and validity, you need to direct your attention to how you’re going to present your now well-informed analysis. Be careful on this step to recognize your own possible biases. Facts are verifiable; opinions are beliefs without supporting evidence. Stating an opinion is just that. You could say “Blue is the best color,” and that’s your opinion. If you were to conduct research and find evidence to support this claim, you could say, “Researchers at Oxford University recognize that the use of blue paint in mental hospitals reduces heart rates by 25% and contributes to fewer angry outbursts from patients.” This would be an informed analysis with credible evidence to support the claim.

Not everyone will accept your analysis, which can be frustrating. Most people resist change and have firm beliefs on both important issues and less significant preferences. With all the competing information surfacing online, on the news, and in general conversation, you can understand how confusing it can be to make any decisions. Look at all the reliable, valid sources that claim different approaches to be the best diet for healthy living: ketogenic, low-carb, vegan, vegetarian, high fat, raw foods, paleo, Mediterranean, etc. All you can do in this sort of situation is conduct your own serious research, check your sources, and write clearly and concisely to provide your analysis of the information for consideration. You cannot force others to accept your stance, but you can show your evidence in support of your thinking, being as persuasive as possible without lapsing into your own personal biases. Then the rest is up to the person reading or viewing your analysis.

Factual Arguments vs. Opinions

Thinking and constructing analyses based on your thinking will bring you in contact with a great deal of information. Some of that information will be factual, and some will not be. You need to be able to distinguish between facts and opinions so you know how to support your arguments. Begin with basic definitions:

  • Fact: a statement that is true and backed up with evidence; facts can be verified through observation or research
  • Opinion: a statement someone holds to be true without supporting evidence; opinions express beliefs, assumptions, perceptions, or judgements

Of course, the tricky part is that most people do not label statements as fact and opinion, so you need to be aware and recognize the difference as you go about honing your critical thinking skills.

You probably have heard the old saying "Everyone is entitled to their own opinions," which may be true, but conversely, not everyone is entitled to their own facts. Facts are true for everyone, not just those who want to believe in them. For example, "mice are animals" is a fact; "mice make the best pets" is an opinion.

Determine if the following statements are facts or opinions based on just the information provided here, referring to the basic definitions above. Some people consider scientific findings to be opinions even when they are convincingly backed by reputable evidence and experimentation. However, remember the definition of fact —verifiable by research or observation. Think about what other research you may have to conduct to make an informed decision.

  • Oregon is a state in the United States. (How would this be proven?)
  • Beef is made from cattle. (See current legislation concerning vegetarian “burgers.”)
  • Increased street lighting decreases criminal behavior. (What information would you need to validate this claim?)
  • In 1952, Elizabeth became Queen of England. (What documents could validate this?)
  • Oatmeal tastes plain. (What factors might play into this claim?)
  • Acne is an embarrassing skin condition. (Who might verify this claim?)
  • Kindergarten decreases student dropout rates. (Think of different interest groups that may take sides on this issue.)
  • Carbohydrates promote weight gain. (Can you determine if this is a valid statement?)
  • Cell phones cause brain tumors. (What research considers this claim?)
  • Immigration is good for the US economy. (What research would help you make an informed decision on this topic?)

Many people become very attached to their opinions, even stating them as facts despite the lack of verifiable evidence. Think about political campaigns, sporting rivalries, musical preferences, and religious or philosophical beliefs. When you are reading, writing, and thinking critically, you must be on the lookout for sophisticated opinions others may present as factual information. It is possible to question another person's opinions politely in intellectual debate, but thinking critically requires that you do conduct this questioning.

For instance, someone may say or write that a particular political party should move its offices to different cities every year—that’s an opinion regardless of whether you side with one party or the other. If, on the other hand, the same person said that one political party is headquartered in a specific city, that is a fact you can verify. You could find sources that can validate or discredit the statement. Even if the city the person lists as the party headquarters is incorrect, the statement itself is still a fact—just an erroneous one. If you use biased and opinionated information or even incorrect facts as your evidence to support your factual arguments, then you have not validated your sources or checked your facts well enough. At this point, you would need to keep researching.


Critical Thinking Models: A Comprehensive Guide for Effective Decision Making

Critical Thinking Models

Critical thinking models are valuable frameworks that help individuals develop and enhance their critical thinking skills. These models provide a structured approach to problem-solving and decision-making by encouraging the evaluation of information and arguments in a logical, systematic manner. By understanding and applying these models, one can learn to make well-reasoned judgments and decisions.


Various critical thinking models exist, each catering to different contexts and scenarios. These models offer a step-by-step method to analyze situations, scrutinize assumptions and biases, and consider alternative perspectives. Ultimately, the goal of critical thinking models is to enhance an individual’s ability to think critically, ultimately improving their reasoning and decision-making skills in both personal and professional settings.

Key Takeaways

  • Critical thinking models provide structured approaches for enhancing decision-making abilities
  • These models help individuals analyze situations, scrutinize assumptions, and consider alternative perspectives
  • The application of critical thinking models can significantly improve one’s reasoning and judgment skills.

Fundamentals of Critical Thinking


Definition and Importance

Critical thinking is the intellectual process of logically, objectively, and systematically evaluating information to form reasoned judgments, utilizing reasoning, logic, and evidence. It involves:

  • Identifying and questioning assumptions,
  • Applying consistent principles and criteria,
  • Analyzing and synthesizing information,
  • Drawing conclusions based on evidence.

The importance of critical thinking lies in its ability to help individuals make informed decisions, solve complex problems, and differentiate between true and false beliefs.

Core Cognitive Skills

Several core cognitive skills underpin critical thinking:

  • Analysis: Breaking down complex information into smaller components to identify patterns or inconsistencies.
  • Evaluation: Assessing the credibility and relevance of sources, arguments, and evidence.
  • Inference: Drawing conclusions by connecting the dots between analyzed information.
  • Synthesis: Incorporating analyzed information into a broader understanding and constructing one's argument.
  • Logic and reasoning: Applying principles of logic to determine the validity of arguments and weigh evidence.

These skills enable individuals to consistently apply intellectual standards in their thought process, which ultimately results in sound judgments and informed decisions.

Influence of Cognitive Biases

A key aspect of critical thinking is recognizing and mitigating the impact of cognitive biases on our thought processes. Cognitive biases are cognitive shortcuts or heuristics that can lead to flawed reasoning and distort our understanding of a situation. Examples of cognitive biases include confirmation bias, anchoring bias, and availability heuristic.

To counter the influence of cognitive biases, critical thinkers must be aware of their own assumptions and strive to apply consistent and objective evaluation criteria in their thinking process. The practice of actively recognizing and addressing cognitive biases promotes an unbiased and rational approach to problem-solving and decision-making.

The Critical Thinking Process


Stages of Critical Thinking

The critical thinking process starts with gathering and evaluating data. This stage involves identifying relevant information and ensuring it is credible and reliable. Next, an individual engages in analysis by examining the data closely to understand its context and interpret its meaning. This step can involve breaking down complex ideas into simpler components for better understanding.

The next stage focuses on determining the quality of the arguments, concepts, and theories present in the analyzed data. Critical thinkers question the credibility and logic behind the information while also considering their own biases and assumptions. They apply consistent standards when evaluating sources, which helps them identify any weaknesses in the arguments.

Values play a significant role in the critical thinking process. Critical thinkers assess the significance of moral, ethical, or cultural values shaping the issue, argument, or decision at hand. They determine whether these values align with the evidence and logic they have analyzed.

After thorough analysis and evaluation, critical thinkers draw conclusions based on the evidence and reasoning gathered. This step includes synthesizing the information and presenting a clear, concise argument or decision. It also involves explaining the reasoning behind the conclusion to ensure it is well-founded.

Application in Decision Making

In decision making, critical thinking is a vital skill that allows individuals to make informed choices. It enables them to:

  • Analyze options and their potential consequences
  • Evaluate the credibility of sources and the quality of information
  • Identify biases, assumptions, and values that may influence the decision
  • Construct a reasoned, well-justified conclusion

By using critical thinking in decision making, individuals can make more sound, objective choices. The process helps them avoid pitfalls like jumping to conclusions, being influenced by biases, or basing decisions on unreliable data. The result is more thoughtful, carefully considered decisions leading to higher quality outcomes.

Critical Thinking Models

Critical thinking models are frameworks that help individuals develop better problem-solving and decision-making abilities. They provide strategies for analyzing, evaluating, and synthesizing information to reach well-founded conclusions. This section will discuss four notable models: The RED Model, Bloom’s Taxonomy, Paul-Elder Model, and The Halpern Critical Thinking Assessment.

The RED Model

The RED Model stands for Recognize Assumptions, Evaluate Arguments, and Draw Conclusions. It emphasizes the importance of questioning assumptions, weighing evidence, and reaching logical conclusions.

  • Recognize Assumptions: Identify and challenge assumptions that underlie statements, beliefs, or arguments.
  • Evaluate Arguments: Assess the validity and reliability of evidence to support or refute claims.
  • Draw Conclusions: Make well-reasoned decisions based on available information and sound reasoning.

The RED Model helps individuals become more effective problem solvers and decision-makers by guiding them through the critical thinking process.

Bloom’s Taxonomy

Bloom’s Taxonomy is a hierarchical model that classifies cognitive skills into six levels of complexity. These levels are remembering, understanding, applying, analyzing, evaluating, and creating. By progressing through these levels, individuals can develop higher-order thinking skills.

  • Remembering: Recall information or facts.
  • Understanding: Comprehend the meaning of ideas, facts, or problems.
  • Applying: Use knowledge in different situations.
  • Analyzing: Break down complex topics or problems into sub-parts.
  • Evaluating: Assess the quality, relevance, or credibility of information, ideas, or solutions.
  • Creating: Combine elements to form a new whole, generate new ideas, or solve complex issues.

Paul-Elder Model

The Paul-Elder Model introduces the concept of “elements of thought,” focusing on a structured approach to critical thinking. This model promotes intellectual standards, such as clarity, accuracy, and relevance. It consists of three stages:

  • Critical Thinking: Apply intellectual standards to problem-solving and decision-making processes.
  • Elements of Thought: Consider purpose, question at issue, information, interpretation and inference, concepts, assumptions, implications, and point of view.
  • Intellectual Traits: Develop intellectual traits, such as intellectual humility, intellectual empathy, and intellectual perseverance.

This model fosters a deeper understanding and appreciation of critical thinking.

The Halpern Critical Thinking Assessment

The Halpern Critical Thinking Assessment is a standardized test developed by Diane Halpern to assess critical thinking skills. The evaluation uses a variety of tasks to measure abilities in core skill areas, such as verbal reasoning, argument analysis, and decision making. Pearson, a leading publisher of educational assessments, offers this test as a means to assess individuals' critical thinking skills.

These four critical thinking models can be used as frameworks to improve and enhance cognitive abilities. By learning and practicing these models, individuals can become better equipped to analyze complex information, evaluate options, and make well-informed decisions.

Evaluating Information and Arguments

In this section, we will discuss the importance of evaluating information and arguments in the process of critical thinking, focusing on evidence assessment, logic and fallacies, and argument analysis.

Evidence Assessment

Evaluating the relevance, accuracy, and credibility of information is a vital aspect of critical thinking. In the process of evidence assessment, a thinker should consider the following factors:

  • Source reliability: Research and understand the expertise and credibility of the source to ensure that biased or inaccurate information is not being considered.
  • Currency: Check the date of the information to make sure it is still relevant and accurate in the present context.
  • Objectivity: Analyze the information for potential bias and always cross-reference it with other credible sources.

When practicing critical thinking skills, it is essential to be aware of your own biases and make efforts to minimize their influence on your decision-making process.

Logic and Fallacies

Logic is crucial for deconstructing and analyzing complex arguments, while identifying and avoiding logical fallacies helps maintain accurate and valid conclusions. Some common fallacies to watch out for in critical thinking include:

  • Ad Hominem: Attacking the person making the argument instead of addressing the argument itself.
  • Strawman: Misrepresenting an opponent's argument to make it easier to refute.
  • False Dilemma: Presenting only two options when there may be multiple viable alternatives.
  • Appeal to Authority: Assuming a claim is true simply because an authority figure supports it.

Being aware of these fallacies enables a thinker to effectively evaluate the strength of an argument and make sound judgments accordingly.

Argument Analysis

Analyzing an argument is the process of evaluating its structure, premises, and conclusion while determining its validity and soundness. To analyze an argument, follow these steps:

  • Identify the premises and conclusion: Determine the main point being argued and the claims offered to support it.
  • Evaluate the validity: Assess whether the conclusion logically follows from the premises and if the argument's structure is sound (a minimal sketch of this check appears after the list).
  • Test the soundness: Evaluate the truth and relevance of the premises. This may require verifying the accuracy of facts and evidence, as well as assessing the reliability of sources.
  • Consider counter-arguments: Identify opposing viewpoints and counter-arguments, and evaluate their credibility to gauge the overall strength of the original argument.
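
To make the validity check concrete, here is a minimal Python sketch (my illustration, not part of the original text): it brute-forces a truth table and reports an argument as valid only if no assignment makes every premise true and the conclusion false. The example argument forms, modus ponens and affirming the consequent, are standard textbook cases.

```python
from itertools import product

# A minimal sketch of the validity check: an argument is valid when no
# assignment of truth values makes every premise true and the conclusion false.
def is_valid(premises, conclusion, variables):
    for values in product([True, False], repeat=len(variables)):
        env = dict(zip(variables, values))
        if all(p(env) for p in premises) and not conclusion(env):
            return False  # counterexample: premises true, conclusion false
    return True

# Modus ponens ("If P then Q. P. Therefore Q.") is valid.
print(is_valid(
    premises=[lambda e: (not e["P"]) or e["Q"], lambda e: e["P"]],
    conclusion=lambda e: e["Q"],
    variables=["P", "Q"],
))  # True

# Affirming the consequent ("If P then Q. Q. Therefore P.") is invalid.
print(is_valid(
    premises=[lambda e: (not e["P"]) or e["Q"], lambda e: e["Q"]],
    conclusion=lambda e: e["P"],
    variables=["P", "Q"],
))  # False
```

Real arguments in natural language rarely reduce this cleanly to propositional form, but the exhaustive search captures what "the conclusion follows from the premises" means.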

By effectively evaluating information and arguments, critical thinkers develop a solid foundation for making well-informed decisions and solving problems.

Enhancing Critical Thinking

Strategies for Improvement

To enhance critical thinking, individuals can practice different strategies, including asking thought-provoking questions, analyzing ideas and observations, and being open to different perspectives. One effective technique is the Critical Thinking Roadmap, which breaks critical thinking down into four measurable phases: execute, synthesize, recommend, and communicate. It's important to use deliberate practice in these areas to develop a strong foundation for problem-solving and decision-making. In addition, cultivating a mindset of courage, fair-mindedness, and empathy will support critical thinking development.

Critical Thinking in Education

In the field of education, critical thinking is an essential component of effective learning and pedagogy. Integrating critical thinking into the curriculum encourages student autonomy, fosters innovation, and improves student outcomes. Teachers can use various approaches to promote critical thinking, such as:

  • Employing open-ended questions to stimulate ideas
  • Incorporating group discussions or debates to facilitate communication and evaluation of viewpoints
  • Assessing and providing feedback on student work to encourage reflection and improvement
  • Utilizing real-world scenarios and case studies for practical application of concepts

Developing a Critical Thinking Mindset

To truly enhance critical thinking abilities, it's important to adopt a mindset that values integrity, autonomy, and empathy. These qualities help to create a learning environment that encourages open-mindedness, which is key to critical thinking development. To foster a critical thinking mindset:

  • Be curious: Remain open to new ideas and ask questions to gain a deeper understanding.
  • Communicate effectively: Clearly convey thoughts and actively listen to others.
  • Reflect and assess: Regularly evaluate personal beliefs and assumptions to promote growth.
  • Embrace diversity of thought: Welcome different viewpoints and ideas to foster innovation.

Incorporating these approaches can lead to a more robust critical thinking skillset, allowing individuals to better navigate and solve complex problems.

Critical Thinking in Various Contexts

The Workplace and Beyond

Critical thinking is a highly valued skill in the workplace, as it enables employees to analyze situations, make informed decisions, and solve problems effectively. It involves a careful thinking process directed towards a specific goal. Employers often seek individuals who possess strong critical thinking abilities, as they can add significant value to the organization.

In the workplace context, critical thinkers are able to recognize assumptions, evaluate arguments, and draw conclusions, following models such as the RED model. They can also adapt their thinking to suit various scenarios, allowing them to tackle complex and diverse problems.

Moreover, critical thinking transcends the workplace and applies to various aspects of life. It empowers an individual to make better decisions, analyze conflicting information, and engage in constructive debates.

Creative and Lateral Thinking

Critical thinking encompasses both creative and lateral thinking. Creative thinking involves generating novel ideas and solutions to problems, while lateral thinking entails looking at problems from different angles to find unique and innovative solutions.

Creative thinking allows thinkers to:

  • Devise new concepts and ideas
  • Challenge conventional wisdom
  • Build on existing knowledge to generate innovative solutions

Lateral thinking, on the other hand, encourages thinkers to:

  • Break free from traditional thought patterns
  • Combine seemingly unrelated ideas to create unique solutions
  • Utilize intuition and intelligence to approach problems from a different perspective

Both creative and lateral thinking are essential components of critical thinking, allowing individuals to view problems in a holistic manner and generate well-rounded solutions. These skills are highly valued by employers and can lead to significant personal and professional growth.

In conclusion, critical thinking is a multifaceted skill that comprises various thought processes, including creative and lateral thinking. By embracing these skills, individuals can excel in the workplace and in their personal lives, making better decisions and solving problems effectively.

Overcoming Challenges

Recognizing and Addressing Bias

Cognitive biases and thinking biases can significantly affect the process of critical thinking. One of the key components of overcoming these challenges is to recognize and address them. It is essential to be aware of one's own beliefs, as well as the beliefs of others, to ensure fairness and clarity throughout the decision-making process. To identify and tackle biases, one can follow these steps:

  • Be self-aware: Understand personal beliefs and biases, acknowledging that they may influence the interpretation of information.
  • Embrace diverse perspectives: Encourage open discussions and invite different viewpoints to challenge assumptions and foster cognitive diversity.
  • Reevaluate evidence: Continuously reassess the relevance and validity of the information being considered.

By adopting these practices, individuals can minimize the impact of biases and enhance the overall quality of their critical thinking skills.

Dealing with Information Overload

In today's world, information is abundant, and it can be increasingly difficult to sift through and make sense of the available data. Dealing with information overload is a crucial aspect of critical thinking. Here are some strategies to address this challenge:

  • Prioritize information: Focus on the most relevant and reliable data, filtering out unnecessary details.
  • Organize data: Use tables, charts, and lists to categorize information and identify patterns more efficiently.
  • Break down complex information: Divide complex data into smaller, manageable segments to simplify interpretation and inferences.

By implementing these techniques, individuals can effectively manage information overload, enabling them to process and analyze data more effectively, leading to better decision-making.

In conclusion, overcoming challenges such as biases and information overload is essential in the pursuit of effective critical thinking. By recognizing and addressing these obstacles, individuals can develop clarity and fairness in their thought processes, leading to well-informed decisions and improved problem-solving capabilities.

Measuring Critical Thinking

Assessment Tools and Criteria

There are several assessment tools designed to measure critical thinking, each focusing on different aspects such as quality, depth, breadth, and significance of thinking. One example of a widely used standardized test is the Watson-Glaser Critical Thinking Appraisal, which evaluates an individual's ability to interpret information, draw conclusions, and make assumptions. Another test is the Cornell Critical Thinking Tests, Level X and Level Z, which assess an individual's critical thinking skills through multiple-choice questions.

Furthermore, criteria for assessing critical thinking often include precision, relevance, and the ability to gather and analyze relevant information. Some assessors utilize the Halpern Critical Thinking Assessment, which measures the application of cognitive skills such as deduction, observation, and induction in real-world scenarios.

The Role of IQ and Tests

It’s important to note that intelligence quotient (IQ) tests and critical thinking assessments are not the same. While IQ tests aim to measure an individual’s cognitive abilities and general intelligence, critical thinking tests focus specifically on one’s ability to analyze, evaluate, and form well-founded opinions. Therefore, having a high IQ does not necessarily guarantee strong critical thinking skills, as critical thinking requires additional mental processes beyond basic logical reasoning.

To build and enhance critical thinking skills, individuals should practice and develop higher-order thinking, such as critical alertness, critical reflection, and critical analysis. Using a Critical Thinking Roadmap, such as the four-phase framework described earlier (execute, synthesize, recommend, and communicate), individuals can continuously work to improve their critical thinking abilities.

Frequently Asked Questions

What are the main steps involved in the Paul-Elder Critical Thinking Model?

The Paul-Elder Critical Thinking Model is a comprehensive framework for developing critical thinking skills. The main steps include: identifying the purpose, formulating questions, gathering information, identifying assumptions, interpreting information, and evaluating arguments. The model emphasizes clarity, accuracy, precision, relevance, depth, breadth, logic, and fairness throughout the critical thinking process. By following these steps, individuals can efficiently analyze and evaluate complex ideas and issues.

Can you list five techniques to enhance critical thinking skills?

Here are five techniques to help enhance critical thinking skills:

  • Ask open-ended questions: Encourages exploration and challenges assumptions.
  • Engage in active listening: Focus on understanding others’ viewpoints before responding.
  • Reflect on personal biases: Identify and question any preconceived notions or judgments.
  • Practice mindfulness: Develop self-awareness and stay present in the moment.
  • Collaborate with others: Exchange ideas and learn from diverse perspectives.

What is the RED Model of critical thinking and how is it applied?

The RED Model of critical thinking consists of three key components: Recognize Assumptions, Evaluate Arguments, and Draw Conclusions. To apply the RED Model, begin by recognizing and questioning underlying assumptions, being aware of personal biases and stereotypes. Next, evaluate the strengths and weaknesses of different arguments, considering evidence, logical consistency, and alternative explanations. Lastly, draw well-reasoned conclusions that are based on the analysis and evaluation of the information gathered.

How do the ‘3 C’s’ of critical thinking contribute to effective problem-solving?

The ‘3 C’s’ of critical thinking – Curiosity, Creativity, and Criticism – collectively contribute to effective problem-solving. Curiosity allows individuals to explore various perspectives and ask thought-provoking questions, while Creativity helps develop innovative solutions and unique approaches to challenges. Criticism, or the ability to evaluate and analyze ideas objectively, ensures that the problem-solving process remains grounded in logic and relevance.

What characteristics distinguish critical thinking from creative thinking?

Critical thinking and creative thinking are two complementary cognitive skills. Critical thinking primarily focuses on analyzing, evaluating, and reasoning, using objectivity and logical thinking. It involves identifying problems, assessing evidence, and drawing sound conclusions. Creative thinking, on the other hand, is characterized by the generation of new ideas, concepts, and approaches to solve problems, often involving imagination, originality, and out-of-the-box thinking.

What are some recommended books to help improve problem-solving and critical thinking skills?

There are several books that can help enhance problem-solving and critical thinking skills, including:

  • “Thinking, Fast and Slow” by Daniel Kahneman: This book explores the dual process theory of decision-making and reasoning.
  • “The 5 Elements of Effective Thinking” by Edward B. Burger and Michael Starbird: Offers practical tips and strategies for improving critical thinking skills.
  • “Critique of Pure Reason” by Immanuel Kant: A classic philosophical work that delves into the principles of reason and cognition.
  • “Mindware: Tools for Smart Thinking” by Richard E. Nisbett: Presents a range of cognitive tools to enhance critical thinking and decision-making abilities.
  • “The Art of Thinking Clearly” by Rolf Dobelli: Explores common cognitive biases and errors in judgment that can affect critical thinking.


2.2 Overcoming Cognitive Biases and Engaging in Critical Reflection

Learning Objectives

By the end of this section, you will be able to:

  • Label the conditions that make critical thinking possible.
  • Classify and describe cognitive biases.
  • Apply critical reflection strategies to resist cognitive biases.

To resist the potential pitfalls of cognitive biases, we have taken some time to recognize why we fall prey to them. Now we need to understand how to resist easy, automatic, and error-prone thinking in favor of more reflective, critical thinking.

Critical Reflection and Metacognition

To promote good critical thinking, put yourself in a frame of mind that allows critical reflection. Recall from the previous section that rational thinking requires effort and takes longer. However, it will likely result in more accurate thinking and decision-making. As a result, reflective thought can be a valuable tool in correcting cognitive biases. The critical aspect of critical reflection involves a willingness to be skeptical of your own beliefs, your gut reactions, and your intuitions. Additionally, the critical aspect engages in a more analytic approach to the problem or situation you are considering. You should assess the facts, consider the evidence, try to employ logic, and resist the quick, immediate, and likely conclusion you want to draw. By reflecting critically on your own thinking, you can become aware of the natural tendency for your mind to slide into mental shortcuts.

This process of critical reflection is often called metacognition in the literature of pedagogy and psychology. Metacognition means thinking about thinking and involves the kind of self-awareness that engages higher-order thinking skills. Cognition, or the way we typically engage with the world around us, is first-order thinking, while metacognition is higher-order thinking. From a metacognitive frame, we can critically assess our thought process, become skeptical of our gut reactions and intuitions, and reconsider our cognitive tendencies and biases.

To improve metacognition and critical reflection, we need to encourage the kind of self-aware, conscious, and effortful attention that may feel unnatural and may be tiring. Typical activities associated with metacognition include checking, planning, selecting, inferring, self-interrogating, interpreting an ongoing experience, and making judgments about what one does and does not know (Hacker, Dunlosky, and Graesser 1998). By practicing metacognitive behaviors, you are preparing yourself to engage in the kind of rational, abstract thought that will be required for philosophy.

Good study habits, including managing your workspace, giving yourself plenty of time, and working through a checklist, can promote metacognition. When you feel stressed out or pressed for time, you are more likely to make quick decisions that lead to error. Stress and lack of time also discourage critical reflection because they rob your brain of the resources necessary to engage in rational, attention-filled thought. By contrast, when you relax and give yourself time to think through problems, you will be clearer, more thoughtful, and less likely to rush to the first conclusion that leaps to mind. Similarly, background noise, distracting activity, and interruptions will prevent you from paying attention. You can use this checklist to try to encourage metacognition when you study:

  • Check your work.
  • Plan ahead.
  • Select the most useful material.
  • Infer from your past grades to focus on what you need to study.
  • Ask yourself how well you understand the concepts.
  • Check your weaknesses.
  • Assess whether you are following the arguments and claims you are working on.

Cognitive Biases

In this section, we will examine some of the most common cognitive biases so that you can be aware of traps in thought that can lead you astray. Cognitive biases are closely related to informal fallacies. Both fallacies and biases provide examples of the ways we make errors in reasoning.

Connections

See the chapter on logic and reasoning for an in-depth exploration of informal fallacies.


Confirmation Bias

One of the most common cognitive biases is confirmation bias, which is the tendency to search for, interpret, favor, and recall information that confirms or supports your prior beliefs. Like all cognitive biases, confirmation bias serves an important function. For instance, one of the most reliable forms of confirmation bias is the belief in our shared reality. Suppose it is raining. When you first hear the patter of raindrops on your roof or window, you may think it is raining. You then look for additional signs to confirm your conclusion, and when you look out the window, you see rain falling and puddles of water accumulating. Most likely, you will not be looking for irrelevant or contradictory information. You will be looking for information that confirms your belief that it is raining. Thus, you can see how confirmation bias—based on the idea that the world does not change dramatically over time—is an important tool for navigating in our environment.

Unfortunately, as with most heuristics, we tend to apply this sort of thinking inappropriately. One example that has recently received a lot of attention is the way in which confirmation bias has increased political polarization. When searching for information on the internet about an event or topic, most people look for information that confirms their prior beliefs rather than what undercuts them. The pervasive presence of social media in our lives is exacerbating the effects of confirmation bias since the computer algorithms used by social media platforms steer people toward content that reinforces their current beliefs and predispositions. These multimedia tools are especially problematic when our beliefs are incorrect (for example, they contradict scientific knowledge) or antisocial (for example, they support violent or illegal behavior). Thus, social media and the internet have created a situation in which confirmation bias can be “turbocharged” in ways that are destructive for society.

Confirmation bias is a result of the brain’s limited ability to process information. Peter Wason (1960) conducted early experiments identifying this kind of bias. He asked subjects to identify the rule that applies to a sequence of numbers—for instance, 2, 4, 8. Subjects were told to generate examples to test their hypothesis. What he found is that once a subject settled on a particular hypothesis, they were much more likely to select examples that confirmed their hypothesis rather than negated it. As a result, they were unable to identify the real rule (any ascending sequence of numbers) and failed to “falsify” their initial assumptions. Falsification is an important tool in the scientist’s toolkit when they are testing hypotheses and is an effective way to avoid confirmation bias.
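
To see why confirmation-style testing fails in this kind of task, here is a small Python sketch (an illustration using invented triples and an invented subject hypothesis, not Wason's actual materials): the hidden rule is "any ascending sequence," the simulated subject's hypothesis is "each number doubles the previous one," and only tests chosen to potentially falsify the hypothesis reveal the mismatch.

```python
# Illustrative sketch of Wason-style rule discovery (triples and hypothesis
# are invented for this example, not taken from Wason's materials).
# The hidden rule is simply "any strictly ascending sequence", but a subject
# who hypothesizes "each number doubles the previous one" and tests only
# confirming triples never finds out that the hypothesis is too narrow.

def hidden_rule(triple):
    """The experimenter's actual rule: the numbers strictly increase."""
    a, b, c = triple
    return a < b < c

def doubling_hypothesis(triple):
    """The subject's overly specific hypothesis."""
    a, b, c = triple
    return b == 2 * a and c == 2 * b

# Confirmation strategy: test only triples the hypothesis predicts are valid.
confirming_tests = [(1, 2, 4), (3, 6, 12), (5, 10, 20)]

# Falsification strategy: also test triples the hypothesis predicts are invalid.
falsifying_tests = [(1, 2, 3), (2, 4, 7), (10, 20, 21)]

def reveals_mismatch(tests):
    """Return True if any test shows the hypothesis disagreeing with the rule."""
    for t in tests:
        print(f"{t}: hypothesis says {doubling_hypothesis(t)}, rule says {hidden_rule(t)}")
    return any(doubling_hypothesis(t) != hidden_rule(t) for t in tests)

print("Confirming tests expose the error:", reveals_mismatch(confirming_tests))  # False
print("Falsifying tests expose the error:", reveals_mismatch(falsifying_tests))  # True
```

Only the second strategy turns up a triple where the hypothesis and the real rule disagree, which is the point of falsification.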

In philosophy, you will be presented with different arguments on issues, such as the nature of the mind or the best way to act in a given situation. You should take your time to reason through these issues carefully and consider alternative views. What you believe to be the case may be right, but you may also fall into the trap of confirmation bias, seeing confirming evidence as better and more convincing than evidence that calls your beliefs into question.

Anchoring Bias

Confirmation bias is closely related to another bias known as anchoring. Anchoring bias refers to our tendency to rely on initial values, prices, or quantities when estimating the actual value, price, or quantity of something. If you are presented with a quantity, even if that number is clearly arbitrary, you will have a hard time discounting it in your subsequent calculations; the initial value "anchors" subsequent estimates. For instance, Tversky and Kahneman (1974) reported an experiment in which subjects were asked to estimate the percentage of African nations in the United Nations. First, the experimenters spun a wheel of fortune in front of the subjects that produced a random number between 0 and 100. Let's say the wheel landed on 79. Subjects were asked whether the percentage of nations was higher or lower than the random number. Subjects were then asked to estimate the real percentage. Even though the initial anchoring value was random, people in the study found it difficult to deviate far from that number. For subjects receiving an initial value of 10, the median estimate was 25 percent, while for subjects receiving an initial value of 65, the median estimate was 45 percent.

In the same paper, Tversky and Kahneman described the way that anchoring bias interferes with statistical reasoning. In a number of scenarios, subjects made irrational judgments about statistics because of the way the question was phrased (i.e., they were tricked when an anchor was inserted into the question). Instead of expending the cognitive energy needed to solve the statistical problem, subjects were much more likely to “go with their gut,” or think intuitively. That type of reasoning generates anchoring bias. When you do philosophy, you will be confronted with some formal and abstract problems that will challenge you to engage in thinking that feels difficult and unnatural. Resist the urge to latch on to the first thought that jumps into your head, and try to think the problem through with all the cognitive resources at your disposal.

Availability Heuristic

The availability heuristic refers to the tendency to evaluate new information based on the most recent or most easily recalled examples. The availability heuristic occurs when people take easily remembered instances as being more representative than they objectively are (i.e., based on statistical probabilities). In very simple situations, the availability of instances is a good guide to judgments. Suppose you are wondering whether you should plan for rain. It may make sense to anticipate rain if it has been raining a lot in the last few days since weather patterns tend to linger in most climates. More generally, scenarios that are well-known to us, dramatic, recent, or easy to imagine are more available for retrieval from memory. Therefore, if we easily remember an instance or scenario, we may incorrectly think that the chances are high that the scenario will be repeated. For instance, people in the United States estimate the probability of dying by violent crime or terrorism much more highly than they ought to. In fact, these are extremely rare occurrences compared to death by heart disease, cancer, or car accidents. But stories of violent crime and terrorism are prominent in the news media and fiction. Because these vivid stories are dramatic and easily recalled, we have a skewed view of how frequently violent crime occurs.

Tribalism

Another more loosely defined category of cognitive bias is the tendency for human beings to align themselves with groups with whom they share values and practices. The tendency toward tribalism is an evolutionary advantage for social creatures like human beings. By forming groups to share knowledge and distribute work, we are much more likely to survive. Not surprisingly, human beings with pro-social behaviors persist in the population at higher rates than human beings with antisocial tendencies. Pro-social behaviors, however, go beyond wanting to communicate and align ourselves with other human beings; we also tend to see outsiders as a threat. As a result, tribalistic tendencies both reinforce allegiances among in-group members and increase animosity toward out-group members.

Tribal thinking makes it hard for us to objectively evaluate information that either aligns with or contradicts the beliefs held by our group or tribe. This effect can be demonstrated even when in-group membership is not real or is based on some superficial feature of the person—for instance, the way they look or an article of clothing they are wearing. A related bias is called the bandwagon fallacy. The bandwagon fallacy can lead you to conclude that you ought to do something or believe something because many other people do or believe the same thing. While other people can provide guidance, they are not always reliable. Furthermore, just because many people believe something doesn't make it true.

Sunk Cost Fallacy

Sunk costs refer to the time, energy, money, or other costs that have been paid in the past. These costs are "sunk" because they cannot be recovered. The sunk cost fallacy is thinking that assigns things in which you have already invested resources a greater value than they have today. Human beings have a natural tendency to hang on to whatever they invest in and are loath to give something up even after it has been proven to be a liability. For example, a person may have sunk a lot of money into a business over time, and the business may clearly be failing. Nonetheless, the businessperson will be reluctant to close shop or sell the business because of the time, money, and emotional energy they have spent on the venture. This is the behavior of "throwing good money after bad": continuing to irrationally invest in something that has lost its worth because of emotional attachment to the failed enterprise. People will engage in this kind of behavior in all kinds of situations and may continue a friendship, a job, or a marriage for the same reason—they don't want to lose their investment even when they are clearly headed for failure and ought to cut their losses.
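
The arithmetic behind "ignore sunk costs" is simple enough to show in a short sketch (the figures below are invented for illustration, not taken from the text): because the sunk amount affects every option equally, it can never change which option is better going forward.

```python
# Illustrative numbers only: a business owner has already lost money (a sunk
# cost) and must choose between keeping the business open or closing it.
sunk_cost = 50_000                 # already spent; unrecoverable either way

# Hypothetical forward-looking estimates for the next year:
keep_open_future = -10_000         # expected further loss if the business stays open
close_and_switch_future = 5_000    # expected profit from the best alternative

# A rational, forward-looking comparison ignores the sunk cost entirely.
best = "keep open" if keep_open_future > close_and_switch_future else "close"
print("Forward-looking choice:", best)  # close

# Subtracting the sunk cost from both options shifts both totals by the same
# amount, so it can never change which option comes out ahead.
assert (keep_open_future - sunk_cost > close_and_switch_future - sunk_cost) == (
    keep_open_future > close_and_switch_future
)
```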

A similar type of faulty reasoning leads to the gambler's fallacy, in which a person reasons that future chance events will be more likely if they have not happened recently. For instance, if I flip a coin many times in a row, I may get a string of heads. But even if I flip several heads in a row, that does not make it more likely I will flip tails on the next coin flip. Each coin flip is statistically independent, and there is an equal chance of turning up heads or tails. The gambler, like the reasoner from sunk costs, is tied to the past when they should be reasoning about the present and future.
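
The independence claim can be checked with a few lines of Python (a sketch using only the standard library): after any run of three heads, the next flip still lands heads about half the time.

```python
import random

# Simulate a long sequence of fair coin flips and look at the outcome
# immediately following every run of three heads.
random.seed(0)
flips = [random.choice("HT") for _ in range(1_000_000)]

after_streak = [
    flips[i + 3]
    for i in range(len(flips) - 3)
    if flips[i] == flips[i + 1] == flips[i + 2] == "H"
]

p_heads = after_streak.count("H") / len(after_streak)
print(f"P(heads | previous three flips were heads) = {p_heads:.3f}")  # about 0.500
```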

There are important social and evolutionary purposes for past-looking thinking. Sunk-cost thinking keeps parents engaged in the growth and development of their children after they are born. Sunk-cost thinking builds loyalty and affection among friends and family. More generally, a commitment to sunk costs encourages us to engage in long-term projects, and this type of thinking has the evolutionary purpose of fostering culture and community. Nevertheless, it is important to periodically reevaluate our investments in both people and things.

In recent ethical scholarship, there is some debate about how to assess the sunk costs of moral decisions. Consider the case of war. Just-war theory dictates that wars may be justified in cases where the harm imposed on the adversary is proportional to the good gained by the act of defense or deterrence. It may be that, at the start of the war, those costs seemed proportional. But after the war has dragged on for some time, it may seem that the objective cannot be obtained without a greater quantity of harm than had been initially imagined. Should the evaluation of whether a war is justified estimate the total amount of harm done or prospective harm that will be done going forward (Lazar 2018)? Such questions do not have easy answers.

Table 2.1 summarizes these common cognitive biases.

Think Like a Philosopher

As we have seen, cognitive biases are built into the way human beings process information. They are common to us all, and it takes self-awareness and effort to overcome the tendency to fall back on biases. Consider a time when you have fallen prey to one of the five cognitive biases described above. What were the circumstances? Recall your thought process. Were you aware at the time that your thinking was misguided? What were the consequences of succumbing to that cognitive bias?

Write a short paragraph describing how that cognitive bias allowed you to make a decision you now realize was irrational. Then write a second paragraph describing how, with the benefit of time and distance, you would have thought differently about the incident that triggered the bias. Use the tools of critical reflection and metacognition to improve your approach to this situation. What might have been the consequences of behaving differently? Finally, write a short conclusion describing what lesson you take from reflecting back on this experience. Does it help you understand yourself better? Will you be able to act differently in the future? What steps can you take to avoid cognitive biases in your thinking today?



Want to cite, share, or modify this book? This book uses the Creative Commons Attribution License and you must attribute OpenStax.

Access for free at https://openstax.org/books/introduction-philosophy/pages/1-introduction
  • Authors: Nathan Smith
  • Publisher/website: OpenStax
  • Book title: Introduction to Philosophy
  • Publication date: Jun 15, 2022
  • Location: Houston, Texas
  • Book URL: https://openstax.org/books/introduction-philosophy/pages/1-introduction
  • Section URL: https://openstax.org/books/introduction-philosophy/pages/2-2-overcoming-cognitive-biases-and-engaging-in-critical-reflection


Bias and Critical Thinking


Note: This entry revolves more generally around Bias in science. For more thoughts on Bias and its relation to statistics, please refer to the entry on Bias in statistics .

In short: This entry discusses why science is never objective, and what we can really know.

What is bias?

"The very concept of objective truth is fading out of the world." - George Orwell

A bias is "the action of supporting or opposing a particular person or thing in an unfair way, because of allowing personal opinions to influence your judgment" (Cambridge Dictionary). In other words, bias clouds our judgment, and often our actions, so that we act wrongly. We are all biased, because we are individuals with individual experiences, disconnected from other individuals and groups, or at least believing ourselves to be.

Recognising bias in research is highly relevant, because bias exposes the myth of scientific objectivity and enables a better recognition and reflection of our flaws and errors. One could add that understanding bias in science matters beyond the empirical, since bias can also highlight flaws in our perceptions and actions as humans. To this end, acknowledging bias means understanding the limitations of oneself. Prominent examples are gender bias and racial bias, which are often rooted in our societies and can be deeply buried in our subconscious. To be critical researchers, it is our responsibility to learn about the diverse biases we hold, yet it is beyond this text to explore the subjective human bias we need to overcome. On the ethics of bias, just this much: many would argue that overcoming our biases requires the ability to learn about and question our privileges. Within research, we need to recognise that science has been severely and continuously biased against ethnic minorities, women, and many other groups. Institutional and systemic bias are part of the current reality of the system, and we need to do our utmost to change this - there is a need for debiasing science, and our own actions. While it should not go unnoticed that institutions and systems have already changed, injustices and inequalities still exist. Most research is conducted in the global north, posing a neo-colonial problem that we are far from solving. Much of academia is still far from having a diverse understanding of people, and systemic and institutional discrimination remain part of our daily reality. We are at the beginning of a very long journey, and there is much to be done concerning bias in constructed institutions.

All this being said, let us now shift our attention to bias in empirical research. Here, we offer three different perspectives in order to enable a more reflexive understanding of bias. The first is understanding how different forms of bias relate to the design criteria of scientific methods. The second is the question of which stage in the application of methods - data gathering, data analysis, and interpretation of results - is affected by which bias, and how. The third is to look at the three principal theories of Western philosophy - reason, social contract, and utilitarianism - and to examine which of the three can be related to which bias. Many methods are influenced by bias, and recognising which bias affects which design criteria, research stage, and principal philosophical theory in the application of a method can help to make empirical research more reflexive.


Design criteria

While qualitative research is often considered prone to many biases, it is also often more reflexive in recognising its limitations. Many qualitative methods are defined by a strong subjective component - that of the researcher - and clear documentation can thus help to make an existing bias more transparent. Many quantitative approaches have a reflexive canon that focuses on specific biases relevant to a specific approach, such as sampling bias or reporting bias. These are often less considered than in qualitative methods, since quantitative methods are still - falsely - considered to be more objective. While one could argue that the goal of reproducibility may lead to a better taming of a bias, this is not necessarily so, as the replication crisis in psychology clearly shows. Both quantitative and qualitative methods are potentially strongly affected by several cognitive biases, as well as by bias in academia in general, which includes for instance funding bias or the preference for open access articles. While none of this is surprising, it is all the harder to solve.

Another general differentiation can be made between inductive and deductive approaches. Many deductive approaches are affected by bias associated with sampling, while inductive approaches are more associated with bias during interpretation. Deductive approaches often build on designed experiments, whereas the strength of inductive approaches is being less bound by methodological designs, which can also make bias more hidden and thus harder to detect. This is why qualitative approaches often place an emphasis on concise documentation.

The connection between spatial scales and bias is rather straightforward: an individual focus is related to cognitive bias, while system scales are more associated with prejudices, bias in academia, and statistical bias. The impact of temporal bias is less explored; forecast bias is a prominent example when it comes to future predictions, and another error is applying our cultural views and values to past humans, which has yet to be clearly named as a bias. What can clearly be said about both spatial and temporal scales is that we are often irrational about very distant entities - in space or time - more so than we should be. We are, for instance, inclined to reject the importance of a distant future scenario, although it may have much the same odds of becoming reality as a near-future one. For example, almost everybody would rather win the lottery tomorrow than in 20 years, largely irrespective of their chances of living to see it or of how much longer they could enjoy the prize. Humans are most peculiarly constructed beings, and we are notorious for acting irrationally. This is equally true for spatial distance. We may care irrationally more for people who are close to us than for people who are very distant, even independent of shared experience (e.g., with friends) or shared history (e.g., with family). Again, this implies a bias which we can become aware of, but which first has to be named. No doubt current social developments will further increase our capacity to recognise our biases, as all these phenomena also affect scientists.

The following table (not reproduced here) categorises the different types of bias listed in the Wikipedia entry on Bias according to two levels of the design criteria of methods.

Bias in gathering data, analysing data and interpreting data

The three steps of applying a method are clearly worth investigating, as they allow us to pinpoint the stage at which we may introduce a bias into our application of a method. Gathering data is strongly associated with cognitive bias, but also with statistical bias and partly even with some bias in academia. Bias associated with sampling can be linked to a subjective perspective as well as to systematic errors rooted in previous results. This can also affect the analysis of data, yet here one has to highlight that quantitative methods are less affected by bias during analysis than qualitative methods. This is not a normative judgement, and it can clearly be counter-measured by sound documentation of the analytical steps. We should nevertheless not forget that there are different assumptions about the steps of analysis even in such an established field as statistics, where different schools of thought constantly clash over the optimal approach, sometimes even arriving at different results. This exemplifies that methodological analysis can be quite normative, underlining the need for a critical perspective. The same holds for qualitative methods, yet there it strongly depends on the specific method, as these methods are more diverse. Concerning the interpretation of scientific results, the amount and diversity of biases is clearly the highest - in other words, the worst. While this is related to the cognitive biases we hold as individuals, it is also related to prejudices, bias in academia, and statistical bias.

Overall, we need to recognise that some methods are less associated with certain biases because they are more established concerning the norms of their application, while other methods are newer and less tested by the academic community. When it comes to bias, there is at least a weak sense in which safety - although not diversity - concerning methods comes in numbers. More and more diverse methods may offer new insights into biases, since one method may reveal a bias that another cannot; methodological plurality may thus reduce bias. For a fully established method, the understanding of its biases is often greater, because it has been applied more often. This is especially, but not always, true for the analysis step, and in part also for methodological designs concerned with sampling. Clear documentation is, however, key to making bias more visible across the three stages.
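
As a small illustration of bias entering at the data-gathering stage, the following Python sketch compares a simple random sample with a "convenience" sample drawn from a frame that systematically misses part of a made-up population; the convenience estimate stays biased no matter how carefully the later analysis is done.

```python
import random
import statistics

random.seed(1)

# Hypothetical population of 10,000 incomes (skewed, as incomes tend to be).
population = [random.lognormvariate(10, 0.6) for _ in range(10_000)]

# Unbiased design: a simple random sample.
random_sample = random.sample(population, 500)

# Biased design: a convenience frame that only reaches the upper half of earners.
cutoff = statistics.median(population)
convenience_frame = [x for x in population if x >= cutoff]
convenience_sample = random.sample(convenience_frame, 500)

print(f"population mean         ~ {statistics.mean(population):,.0f}")
print(f"random-sample mean      ~ {statistics.mean(random_sample):,.0f}")       # close to the population mean
print(f"convenience-sample mean ~ {statistics.mean(convenience_sample):,.0f}")  # systematically too high
```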

Bias and philosophy

The last and by far most complex point is the root theories associated with bias. Reason, social contract, and utilitarianism are the three key theories of Western philosophy relevant to empiricism, and all biases can be associated with at least one of these three foundational theories. Many cognitive biases are linked to reason, or to unreasonable behaviour. Much of the bias relating to prejudices and society can be linked to the wide field of the social contract. Lastly, some bias is clearly associated with utilitarianism. Surprisingly, utilitarianism is associated with a comparatively small number of biases, yet it should be noted that the problem of causality within economic analysis is still up for debate. Much of economic management is rooted in correlative understandings, which are often mistaken for clear-cut causal relations. Psychology also clearly illustrates that investigating a bias is different from unconsciously introducing a bias into your research. Consciousness of bias is the basis for its recognition: if you are not aware of a bias, you cannot take it into account in your knowledge production. While it may thus not seem directly helpful to associate empirical research and its biases with the three general foundational theories of philosophy - reason, social contract, and utilitarianism - we should still take this into account, not least because it leads us to one of the most important developments of the 20th century: Critical Theory.

Critical Theory and Bias

Out of the growing empiricism of the Enlightenment grew a concern which we came to call Critical Theory. At the heart of Critical Theory is the focus on critiquing and changing society as a whole, in contrast to merely observing or explaining it. Originating in Marx, Critical Theory clearly distances itself from previous theories in philosophy - or those associated with the social - that try only to understand or explain. By embedding society in its historical context (Horkheimer) and by focussing on a continuous and reciprocal critique (Benjamin), Critical Theory is a first and bold step towards a more holistic perspective in science. Remembering the Greeks, and also some Eastern thinkers, one could say it is the first step back towards holistic thinking. From a methodological perspective, Critical Theory is radical because it seeks to distinguish itself not only from previously existing philosophy but, more importantly, from the widely dominant empiricism and its societal as well as scientific consequences. A Critical Theory should thus be explanatory, practical, and normative, and, what makes it more challenging, it needs to be all three of these things at once (Horkheimer). Through Habermas, Critical Theory gained an embedding in democracy, yet with a critical view of what we could understand as globalisation and its complex realities. The reflexive empowerment of the individual is as clear a goal as one would expect, not least because of the normative link to the political.

Critical Theory is thus a vital step towards a wider integration of diverse philosophies, but it is also essential from a methodological standpoint, since it allowed for the emergence of a true and holistic critique of everything empirical. While this may be viewed as an attack, it can also be interpreted as a necessary step, since the arrogance and the claim to truth in empiricism can be interpreted as a deep danger, and not only to methods. Popper does not offer a true solution to positivism, and indeed he was very much hyped by many. His thought that the holy grail of knowledge can ultimately never be truly reached also generates certain problems. He can still be admired because he called for scientists to be radical, while acknowledging that most scientists are not. In addition, from a post-modernist perspective we could see this as a necessary step to prevent an influence of empiricism that might pose a threat to, and by, humankind itself, be it through nuclear destruction, the unachievable and feeble goal of a growth economy (my wording), the naive and technocratic hoax of the eco-modernists (also my wording), or any other paradigm that is short-sighted or naive. In other words, we arrive at the postmodern.

Critical Theory is now developing connections to other facets of the discourse, and some may argue that its focus on the social sciences can be seen as critical in itself, or at least as a normative choice that is clearly anthropocentric, has a problematic relationship with the empirical, and has mixed relations with its diverse offspring, which includes gender research, critiques of globalisation, and many other normative domains that are increasingly explored today. Building on Popper's three worlds (the physical world, the mental world, and human knowledge), we should note another possibility: Critical Realism. Roy Bhaskar proposed three ontological domains (strata of knowledge): the real (everything there is), the actual (everything we can grasp), and the empirical (everything we can observe). During the last decades, humankind has unlocked ever more strata of knowledge, so that much of the actual has become empirical to us. We have to acknowledge that some strata of knowledge are hard to relate, or may even be unrelatable, which has consequences for our methodological understanding of the world. Some methods may unlock certain strata of knowledge but not others. Some may be specific, some vague. And some may only unlock new strata through novel combinations. What is most relevant here, however, is that we may look for causal links but must remain critical, since new strata of knowledge may render them obsolete. Consequently, there are no universal laws that we can strive for, but instead endless strata to explore.

Coming back to bias, Critical Theory appears as an antidote to bias, and some may argue Critical Realism even more so, as it combines criticality with a certain humility that is necessary when exploring the empirical and the causal. The explanatory character allowed by Critical Realism might be good enough for the pragmatist, the practical may speak to the modern engagement of science with and for society, and the normative is aware of - well - all things normative, including the critical. Hence a door was opened to a new mode of science, focussing on the situatedness and locatedness of research within the world. This surely had a head start with Kant, who opened the globe to the world of methods. There is, however, a critical link in Habermas, who highlighted the duality of the rational individual on a small scale and the role of global societies as part of the economy (Habermas 1987). This underlines a crucial link to the original three foundational theories in philosophy, albeit in a dramatic and focused interpretation of modernity. Habermas himself was well aware of the tensions between these two approaches - the critical and the empirical - yet we owe it to Critical Theory and its continuations that a practical and reflexive knowledge production can be conducted within deeply normative systems such as modern democracies.

Linking this to the historical development of methods, we can clearly claim that Critical Theory (and Critical Realism) opened a new domain or mode of thinking, and its impact can be felt far beyond the social sciences and philosophy that it affected directly. However, coming back to bias, we will not follow the path of an almost universal rejection of empiricism here. Instead, we need to return to the three foundational theories of philosophy and acknowledge that reason, social contract, and utilitarianism are the foundation of the first empirical disciplines that are at their core normative (e.g., psychology, the social and political sciences, and economics). Since bias can be partly related to these three theories, and consequently to specific empirical disciplines, we need to recognise that there is an overarching methodological bias. This methodological bias has a signature rooted in specific design criteria, which are in turn related to specific disciplines. Consequently, this methodological bias is a disciplinary bias - even more so since methods may be shared among scientific disciplines, but most disciplines claim either priority or superiority when it comes to the ownership of a method.

The disciplinary bias of modern science thus creates a deeply normative methodological bias, which some disciplines may try to take into account while others clearly do not. In other words, the dogmatic selection of methods within disciplines has the potential to create deep flaws in empirical research, and we need to be aware of and reflexive about this. The largest bias concerning methods is the choice of methods per se. A critical perspective is thus relevant not only from the standpoint of societal responsibility but equally from a view of the empirical. Clear documentation and reproducibility of research are important but limited stepping stones in a critique of the methodological; they cannot replace a critical perspective, only amend it. Empirical knowledge will only ever look at parts - or strata, according to Roy Bhaskar - of reality, yet philosophy can offer a generalisable perspective or theory, and Critical Theory, Critical Realism, and other current developments in philosophy can be seen as a striving towards an integrated and holistic philosophy of science, which may ultimately link to an overarching theory of ethics (Parfit). If the empirical and the critical inform us, then a philosophy of science and ethics together may tell us how we should act based on our perceptions of reality.

Further Information

  • Some words on Critical Theory
  • A short entry on critical realism

The author of this entry is Henrik von Wehrden.

  • Normativity of Methods

Critical Thinking: A Model of Intelligence for Solving Real-World Problems

Diane F. Halpern

1 Department of Psychology, Claremont McKenna College, Emerita, Altadena, CA 91001, USA

Dana S. Dunn

2 Department of Psychology, Moravian College, Bethlehem, PA 18018, USA; dunn@moravian.edu

Most theories of intelligence do not directly address the question of whether people with high intelligence can successfully solve real world problems. A high IQ is correlated with many important outcomes (e.g., academic prominence, reduced crime), but it does not protect against cognitive biases, partisan thinking, reactance, or confirmation bias, among others. There are several newer theories that directly address the question about solving real-world problems. Prominent among them is Sternberg’s adaptive intelligence with “adaptation to the environment” as the central premise, a construct that does not exist on standardized IQ tests. Similarly, some scholars argue that standardized tests of intelligence are not measures of rational thought—the sort of skill/ability that would be needed to address complex real-world problems. Other investigators advocate for critical thinking as a model of intelligence specifically designed for addressing real-world problems. Yes, intelligence (i.e., critical thinking) can be enhanced and used for solving a real-world problem such as COVID-19, which we use as an example of contemporary problems that need a new approach.

1. Introduction

The editors of this Special Issue asked authors to respond to a deceptively simple statement: “How Intelligence Can Be a Solution to Consequential World Problems.” This statement holds many complexities, including how intelligence is defined and which theories are designed to address real-world problems.

2. The Problem with Using Standardized IQ Measures for Real-World Problems

For the most part, we identify high intelligence as having a high score on a standardized test of intelligence. Like any test score, IQ can only reflect what is on the given test. Most contemporary standardized measures of intelligence include vocabulary, working memory, spatial skills, analogies, processing speed, and puzzle-like elements (e.g., Wechsler Adult Intelligence Scale Fourth Edition; see Drozdick et al. 2012). Measures of IQ correlate with many important outcomes, including academic performance (Kretzschmar et al. 2016), job-related skills (Hunter and Schmidt 1996), reduced likelihood of criminal behavior (Burhan et al. 2014), and, for those with exceptionally high IQs, obtaining a doctorate and publishing scholarly articles (McCabe et al. 2020). Gottfredson (1997, p. 81) summarized these effects when she said the "predictive validity of g is ubiquitous." More recent research using longitudinal data found that general mental abilities and specific abilities are good predictors of several work variables, including job prestige and income (Lang and Kell 2020). Although assessments of IQ are useful in many contexts, having a high IQ does not protect against falling for common cognitive fallacies (e.g., blind spot bias, reactance, anecdotal reasoning), relying on biased and blatantly one-sided information sources, failing to consider information that does not conform to one's preferred view of reality (confirmation bias), or succumbing to pressure to think and act in a certain way, among others. This point was clearly articulated by Stanovich (2009, p. 3) when he stated that "IQ tests measure only a small set of the thinking abilities that people need."

3. Which Theories of Intelligence Are Relevant to the Question?

Most theories of intelligence do not directly address the question of whether people with high intelligence can successfully solve real world problems. For example, Grossmann et al. (2013) cite many studies in which IQ scores have not predicted well-being, including life satisfaction and longevity. Using a stratified random sample of Americans, these investigators found that wise reasoning is associated with life satisfaction, and that "there was no association between intelligence and well-being" (p. 944). (Critical thinking [CT] is often referred to as "wise reasoning" or "rational thinking.") Similar results were reported by Wirthwein and Rost (2011), who compared life satisfaction in several domains for gifted adults and adults of average intelligence. There were no differences in any of the measures of subjective well-being, except for leisure, which was significantly lower for the gifted adults. Additional research in a series of experiments by Stanovich and West (2008) found that participants with high cognitive ability were as likely as others to endorse positions that are consistent with their biases, and they were equally likely to prefer one-sided arguments over those that provided a balanced argument. There are several newer theories that directly address the question about solving real-world problems. Prominent among them is Sternberg's adaptive intelligence with "adaptation to the environment" as the central premise, a construct that does not exist on standardized IQ tests (e.g., Sternberg 2019). Similarly, Stanovich and West (2014) argue that standardized tests of intelligence are not measures of rational thought—the sort of skill/ability that would be needed to address complex real-world problems. Halpern and Butler (2020) advocate for CT as a useful model of intelligence for addressing real-world problems because it was designed for this purpose. Although there is much overlap among these more recent theories, often using different terms for similar concepts, we use Halpern and Butler's conceptualization to make our point: Yes, intelligence (i.e., CT) can be enhanced and used for solving a real-world problem like COVID-19.

4. Critical Thinking as an Applied Model for Intelligence

One definition of intelligence that directly addresses the question about intelligence and real-world problem solving comes from Nickerson ( 2020, p. 205 ): “the ability to learn, to reason well, to solve novel problems, and to deal effectively with novel problems—often unpredictable—that confront one in daily life.” Using this definition, the question of whether intelligent thinking can solve a world problem like the novel coronavirus is a resounding “yes” because solutions to real-world novel problems are part of his definition. This is a popular idea in the general public. For example, over 1000 business managers and hiring executives said that they want employees who can think critically based on the belief that CT skills will help them solve work-related problems ( Hart Research Associates 2018 ).

We define CT as the use of those cognitive skills or strategies that increase the probability of a desirable outcome. It describes thinking that is purposeful, reasoned, and goal-directed: the kind of thinking involved in solving problems, formulating inferences, calculating likelihoods, and making decisions, when the thinker is using skills that are thoughtful and effective for the particular context and type of thinking task. International surveys conducted by the OECD (2019, p. 16) established "key information-processing competencies" that are "highly transferable, in that they are relevant to many social contexts and work situations; and 'learnable' and therefore subject to the influence of policy." One of these skills is problem solving, which is a subset of CT skills.

The CT model of intelligence is comprised of two components: (1) understanding information at a deep, meaningful level and (2) appropriate use of CT skills. The underlying idea is that CT skills can be identified, taught, and learned, and when they are recognized and applied in novel settings, the individual is demonstrating intelligent thought. CT skills include judging the credibility of an information source, making cost–benefit calculations, recognizing regression to the mean, understanding the limits of extrapolation, muting reactance responses, using analogical reasoning, rating the strength of reasons that support and fail to support a conclusion, and recognizing hindsight bias or confirmation bias, among others. Critical thinkers use these skills appropriately, without prompting, and usually with conscious intent in a variety of settings.

One of the key concepts in this model is that CT skills transfer in appropriate situations. Thus, assessments using situational judgments are needed to assess whether particular skills have transferred to a novel situation where it is appropriate. In an assessment created by the first author ( Halpern 2018 ), short paragraphs provide information about 20 different everyday scenarios (e.g., A speaker at the meeting of your local school board reported that when drug use rises, grades decline; so schools need to enforce a “war on drugs” to improve student grades); participants provide two response formats for every scenario: (a) constructed responses where they respond with short written responses, followed by (b) forced choice responses (e.g., multiple choice, rating or ranking of alternatives) for the same situations.

There is a large and growing empirical literature to support the assertion that CT skills can be learned and will transfer (when taught for transfer). See for example, Holmes et al. (2015), who wrote in the prestigious Proceedings of the National Academy of Sciences that there was "significant and sustained improvement in students' critical thinking behavior" (p. 11,199) for students who received CT instruction. Abrami et al. (2015, para. 1) concluded from a meta-analysis that "there are effective strategies for teaching CT skills, both generic and content specific, and CT dispositions, at all educational levels and across all disciplinary areas." Abrami et al. (2008, para. 1) included 341 effect sizes in a meta-analysis. They wrote: "findings make it clear that improvement in students' CT skills and dispositions cannot be a matter of implicit expectation." A strong test of whether CT skills can be used for real-world problems comes from research by Butler et al. (2017). Community adults and college students (N = 244) completed several scales including an assessment of CT, an intelligence test, and an inventory of real-life events. Both CT scores and intelligence scores predicted individual outcomes on the inventory of real-life events, but CT was a stronger predictor.

Heijltjes et al. ( 2015, p. 487 ) randomly assigned participants to either a CT instruction group or one of six other control conditions. They found that “only participants assigned to CT instruction improved their reasoning skills.” Similarly, when Halpern et al. ( 2012 ) used random assignment of participants to either a learning group where they were taught scientific reasoning skills using a game format or a control condition (which also used computerized learning and was similar in length), participants in the scientific skills learning group showed higher proportional learning gains than students who did not play the game. As the body of additional supportive research is too large to report here, interested readers can find additional lists of CT skills and support for the assertion that these skills can be learned and will transfer in Halpern and Dunn ( Forthcoming ). There is a clear need for more high-quality research on the application and transfer of CT and its relationship to IQ.

5. Pandemics: COVID-19 as a Consequential Real-World Problem

A pandemic occurs when a disease runs rampant over an entire country or even the world. Pandemics have occurred throughout history: At the time of writing this article, COVID-19 is a world-wide pandemic whose actual death rate is unknown but estimated with projections of several million over the course of 2021 and beyond ( Mega 2020 ). Although vaccines are available, it will take some time to inoculate most or much of the world’s population. Since March 2020, national and international health agencies have created a list of actions that can slow and hopefully stop the spread of COVID (e.g., wearing face masks, practicing social distancing, avoiding group gatherings), yet many people in the United States and other countries have resisted their advice.

Could instruction in CT encourage more people to accept and comply with simple life-saving measures? There are many possible reasons to believe that by increasing citizens' CT abilities, this problematic trend can be reversed for, at least, some unknown percentage of the population. We recognize the long history of social and cognitive research showing that changing attitudes and behaviors is difficult, and it would be unrealistic to expect that individuals with extreme beliefs supported by their social group and consistent with their political ideologies are likely to change. For example, an Iranian cleric and an orthodox rabbi both claimed (separately) that the COVID-19 vaccine can make people gay (Marr 2021). These unfounded opinions are based on deeply held prejudicial beliefs that we expect to be resistant to CT. We are targeting those individuals whose beliefs are less extreme and may be based on reasonable reservations, such as concern about the hasty development of the vaccine and the lack of long-term data on its effects. There should be some unknown proportion of individuals who can change their COVID-19-related beliefs and actions with appropriate instruction in CT. CT can be a (partial) antidote for the chaos of the modern world with armies of bots creating content on social media, political and other forces deliberately attempting to confuse issues, and almost all media labeled "fake news" by social influencers (i.e., people with followers that sometimes run to millions on various social media). Here are some CT skills that could be helpful in getting more people to think more critically about pandemic-related issues.

Reasoning by Analogy and Judging the Credibility of the Source of Information

Early communications from national health agencies about the ability of masks to prevent the spread of COVID were not consistent. In many regions of the world, the benefits of wearing masks incited prolonged and acrimonious debates (Tang 2020). However, after the initial confusion, virtually all of the global and national health organizations (e.g., WHO, the National Health Service in the U.K., the U.S. Centers for Disease Control and Prevention) endorse masks as a way to slow the spread of COVID (Cheng et al. 2020; Chu et al. 2020). Yet, as we know, some people do not trust governmental agencies and often cite the conflicting information that was originally given as a reason for not wearing a mask. There are varied reasons for refusing to wear a mask, but the one most often cited is that it infringes on civil liberties (Smith 2020). Reasoning by analogy is an appropriate CT skill for evaluating this belief (and a key skill in legal thinking). It might be useful to cite some of the many laws that already regulate our behavior, such as requiring health inspections for restaurants, setting speed limits, mandating seat belts when riding in a car, and establishing the age at which someone can consume alcohol. Individuals would be asked to consider how the mandate to wear a mask compares to these and other regulatory laws.

Another reason why some people resist the measures suggested by virtually every health agency concerns questions about whom to believe. Could training in CT change the beliefs and actions of even a small percentage of those opposed to wearing masks? Such training would include considering the following questions with practice across a wide domain of knowledge: (a) Does the source have sufficient expertise? (b) Is the expertise recent and relevant? (c) Is there a potential for gain by the information source, such as financial gain? (d) What would the ideal information source be and how close is the current source to the ideal? (e) Does the information source offer evidence that what they are recommending is likely to be correct? (f) Have you traced URLs to determine if the information in front of you really came from the alleged source?, etc. Of course, not everyone will respond in the same way to each question, so there is little likelihood that we would all think alike, but these questions provide a framework for evaluating credibility. Donovan et al. ( 2015 ) were successful using a similar approach to improve dynamic decision-making by asking participants to reflect on questions that relate to the decision. Imagine the effect of rigorous large-scale education in CT from elementary through secondary schools, as well as at the university-level. As stated above, empirical evidence has shown that people can become better thinkers with appropriate instruction in CT. With training, could we encourage some portion of the population to become more astute at judging the credibility of a source of information? It is an experiment worth trying.

6. Making Cost–Benefit Assessments for Actions That Would Slow the Spread of COVID-19

Historical records show that refusal to wear a mask during a pandemic is not a new reaction. The epidemic of 1918 also included mandates to wear masks, which drew public backlash. Then, as now, many people refused, even when they were told that it was a symbol of “wartime patriotism” because the 1918 pandemic occurred during World War I ( Lovelace 2020 ). CT instruction would include instruction in why and how to compute cost–benefit analyses. Estimates of “lives saved” by wearing a mask can be made meaningful with graphical displays that allow more people to understand large numbers. Gigerenzer ( 2020 ) found that people can understand risk ratios in medicine when the numbers are presented as frequencies instead of probabilities. If this information were used when presenting the likelihood of illness and death from COVID-19, could we increase the numbers of people who understand the severity of this disease? Small scale studies by Gigerenzer have shown that it is possible.
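
As a rough sketch of the kind of frequency reframing Gigerenzer describes, the same risk can be stated either as a probability or as a natural frequency; the rate below is an illustrative placeholder, not a real COVID-19 statistic.

```python
def as_natural_frequency(probability: float, reference_group: int = 10_000) -> str:
    """Restate a probability as 'about X in N people'."""
    affected = round(probability * reference_group)
    return f"about {affected} in {reference_group:,} people"

illustrative_risk = 0.005  # a hypothetical 0.5% risk, chosen only for illustration

print(f"Stated as a probability: {illustrative_risk:.1%}")
print(f"Stated as a frequency:   {as_natural_frequency(illustrative_risk)}")
# -> Stated as a probability: 0.5%
# -> Stated as a frequency:   about 50 in 10,000 people
```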

Analyzing Arguments to Determine Degree of Support for a Conclusion

The process of analyzing arguments requires that individuals rate the strength of support for and against a conclusion. By engaging in this practice, they must consider evidence and reasoning that may run counter to a preferred outcome. Kozyreva et al. ( 2020 ) call the deliberate failure to consider both supporting and conflicting data “deliberate ignorance”—avoiding or failing to consider information that could be useful in decision-making because it may collide with an existing belief. When applied to COVID-19, people would have to decide if the evidence for and against wearing a face mask is a reasonable way to stop the spread of this disease, and if they conclude that it is not, what are the costs and benefits of not wearing masks at a time when governmental health organizations are making them mandatory in public spaces? Again, we wonder if rigorous and systematic instruction in argument analysis would result in more positive attitudes and behaviors that relate to wearing a mask or other real-world problems. We believe that it is an experiment worth doing.

7. Conclusions

We believe that teaching CT is a worthwhile approach for educating the general public in order to improve reasoning and motivate actions to address, avert, or ameliorate real-world problems like the COVID-19 pandemic. Evidence suggests that CT can guide intelligent responses to societal and global problems. We are NOT claiming that CT skills will be a universal solution for the many real-world problems that we confront in contemporary society, or that everyone will substitute CT for other decision-making practices, but we do believe that systematic education in CT can help many people become better thinkers, and we believe that this is an important step toward creating a society that values and practices routine CT. The challenges are great, but the tools to tackle them are available, if we are willing to use them.

Author Contributions

Conceptualization, D.F.H. and D.S.D.; resources, D.F.H.; data curation, writing—original draft preparation, D.F.H.; writing—review and editing, D.F.H. and D.S.D. All authors have read and agreed to the published version of the manuscript.

This research received no external funding.

Institutional Review Board Statement

No IRB Review.

Informed Consent Statement

No Informed Consent.

Conflicts of Interest

The authors declare no conflict of interest.


  • Abrami Philip C., Bernard Robert M., Borokhovski Evgueni, Wade C. Anne, Surkes Michael A., Tamim Rana, Zhang Dai. Instructional interventions affecting critical thinking skills and dispositions: A Stage 1 meta-analysis. Review of Educational Research. 2008; 78:1102–34. doi: 10.3102/0034654308326084.
  • Abrami Philip C., Bernard Robert M., Borokhovski Evgueni, Waddington David I., Wade C. Anne. Strategies for teaching students to think critically: A meta-analysis. Review of Educational Research. 2015; 85:275–341. doi: 10.3102/0034654314551063.
  • Burhan Nik Ahmad Sufian, Kurniawan Yohan, Sidek Abdul Halim, Mohamad Mohd Rosli. Crimes and the Bell curve: The role of people with high, average, and low intelligence. Intelligence. 2014; 47:12–22. doi: 10.1016/j.intell.2014.08.005.
  • Butler Heather A., Pentoney Christopher, Bong Maebelle P. Predicting real-world outcomes: Critical thinking ability is a better predictor of life decisions than intelligence. Thinking Skills and Creativity. 2017; 25:38–46. doi: 10.1016/j.tsc.2017.06.005.
  • Cheng Vincent Chi-Chung, Wong Shuk-Ching, Chuang Vivien Wai-Man, So Simon Yung-Chun, Chen Jonathan Hon-Kwan, Sridhar Sidharth, To Kelvin Kai-Wang, Chan Jasper Fuk-Wu, Hung Ivan Fan-Ngai, Ho Pak-Leung, et al. The role of community-wide wearing of face mask for control of coronavirus disease 2019 (COVID-19) epidemic due to SARS-CoV-2. Journal of Infection. 2020; 81:107–14. doi: 10.1016/j.jinf.2020.04.024.
  • Chu Derek K., Akl Elie A., Duda Stephanie, Solo Karla, Yaacoub Sally, Schunemann Holger J. Physical distancing, face masks, and eye protection to prevent person-to-person transmission of SARS-CoV-2 and COVID-19: A systematic review and meta-analysis. Lancet. 2020; 395:1973–87. doi: 10.1016/S0140-6736(20)31142-9.
  • Donovan Sarah J., Guss C. Dominick, Naslund Dag. Improving dynamic decision-making through training and self-reflection. Judgment and Decision Making. 2015; 10:284–95.
  • Drozdick Lisa Whipple, Wahlstrom Dustin, Zhu Jianjun, Weiss Lawrence G. The Wechsler Adult Intelligence Scale—Fourth Edition and the Wechsler Memory Scale—Fourth Edition. In: Flanagan Dawn P., Harrison Patti L., editors. Contemporary Intellectual Assessment: Theories, Tests, and Issues. The Guilford Press; New York: 2012. pp. 197–223.
  • Gigerenzer Gerd. When all is just a click away: Is critical thinking obsolete in the digital age? In: Sternberg Robert J., Halpern Diane F., editors. Critical Thinking in Psychology. 2nd ed. Cambridge University Press; Cambridge: 2020. pp. 197–223.
  • Gottfredson Linda S. Why g matters: The complexity of everyday life. Intelligence. 1997; 24:79–132. doi: 10.1016/S0160-2896(97)90014-3.
  • Grossmann Igor, Varnum Michael E. W., Na Jinkyung, Kitayama Shinobu, Nisbett Richard E. A route to well-being: Intelligence versus wise reasoning. Journal of Experimental Psychology: General. 2013; 142:944–53. doi: 10.1037/a0029560.
  • Halpern Diane F. Halpern Critical Thinking Assessment. Schuhfried Test Publishers; Mödling: 2018. Available online: www.schuhfried.com (accessed on 30 March 2021).
  • Halpern Diane F., Butler Heather A. Is critical thinking a better model of intelligence? In: Sternberg Robert J., editor. The Nature of Intelligence. 2nd ed. Cambridge University Press; Cambridge: 2020. pp. 183–96.
  • Halpern Diane F., Dunn Dana S. Thought and Knowledge: An Introduction to Critical Thinking. 6th ed. Taylor & Francis; New York: Forthcoming, in press.
  • Halpern Diane F., Millis Keith, Graesser Arthur, Butler Heather, Forsyth Carol, Cai Zhiqiang. Operation ARA: A computerized learning game that teaches critical thinking and scientific reasoning. Thinking Skills and Creativity. 2012; 7:93–100. doi: 10.1016/j.tsc.2012.03.006.
  • Hart Research Associates. Employers Express Confidence in Colleges and Universities: See College as Worth the Investment, New Research Finds. 2018 Aug 29. Available online: https://hartresearch.com/employers-express-confidence-in-colleges-and-universities-see-college-as-worth-the-investment-new-research-finds/ (accessed on 30 March 2021).
  • Heijltjes Anita, van Gog Tamara, Leppink Jimmie, Paas Fred. Unraveling the effects of critical thinking instructions, practice, and self-explanation on students' reasoning performance. Instructional Science. 2015; 43:487–506. doi: 10.1007/s11251-015-9347-8.
  • Holmes Natasha G., Wieman Carl E., Bonn Doug A. Teaching critical thinking. Proceedings of the National Academy of Sciences. 2015; 112:11199–204. doi: 10.1073/pnas.1505329112.
  • Hunter John E., Schmidt Frank L. Intelligence and job performance: Economic and social implications. Psychology, Public Policy, and Law. 1996; 2:447–72. doi: 10.1037/1076-8971.2.3-4.447.
  • Kozyreva Anastasia, Lewandowsky Stephan, Hertwig Ralph. Citizens versus the internet: Confronting digital challenges with cognitive tools. Psychological Science in the Public Interest. 2020; 21. doi: 10.1177/1529100620946707. Available online: https://www.psychologicalscience.org/publications/confronting-digital-challenges-with-cognitive-tools.html (accessed on 30 March 2021).
  • Kretzschmar Andre, Neubert Jonas C., Wusternberg Sascha, Greiff Samuel. Construct validity of complex problem solving: A comprehensive view on different facets of intelligence and school grades. Intelligence. 2016; 54:55–69. doi: 10.1016/j.intell.2015.11.004.
  • Lang Jonas W. B., Kell Harrison J. General mental ability and specific abilities: Their relative importance for extrinsic career success. Journal of Applied Psychology. 2020; 105:1047–61. doi: 10.1037/apl0000472.
  • Lovelace Berkeley, Jr. Medical Historians Compare the Coronavirus to the 1918 Flu Pandemic: Both Were Highly Political. CNBC. 2020. Available online: https://www.cnbc.com/2020/09/28/comparing-1918-flu-vs-coronavirus.html (accessed on 30 March 2021).
  • Marr Rhuaridh. Iranian Cleric Claims COVID-19 Vaccine Can Make People Gay. Metro Weekly. 2021. Available online: https://www.metroweekly.com/2021/02/iranian-cleric-claims-covid-19-vaccine-can-make-people-gay/ (accessed on 30 March 2021).
  • McCabe Kira O., Lubinski David, Benbow Camilla P. Who shines most among the brightest?: A 25-year longitudinal study of elite STEM graduate students. Journal of Personality and Social Psychology. 2020; 119:390–416. doi: 10.1037/pspp0000239.
  • Mega Emiliano R. COVID Has Killed More than One Million People. How Many More Will Die? Nature. 2020. Available online: https://www.nature.com/articles/d41586-020-02762-y (accessed on 30 March 2021).
  • Nickerson Raymond S. Developing intelligence through instruction. In: Sternberg Robert J., editor. The Cambridge Handbook of Intelligence. 2nd ed. Cambridge University Press; Cambridge: 2020. pp. 205–37.
  • OECD. The Survey of Adult Skills: Reader's Companion. 3rd ed. OECD Skills Studies. OECD Publishing; Paris: 2019.
  • Smith Matthew. Why Won't Britons Wear Face Masks? YouGov. 2020. Available online: https://yougov.co.uk/topics/health/articles-reports/2020/07/15/why-wont-britons-wear-face-masks (accessed on 30 March 2021).
  • Stanovich Keith E. What Intelligence Tests Miss: The Psychology of Rational Thought. Yale University Press; New Haven: 2009.
  • Stanovich Keith E., West Richard F. On the failure of cognitive ability to predict my-side bias and one-sided thinking biases. Thinking & Reasoning. 2008; 14:129–67. doi: 10.1080/13546780701679764.
  • Stanovich Keith E., West Richard F. What intelligence tests miss. The Psychologist. 2014; 27:80–83. doi: 10.5840/inquiryctnews201126216.
  • Sternberg Robert J. A theory of adaptive intelligence and its relation to general intelligence. Journal of Intelligence. 2019; 7:23. doi: 10.3390/jintelligence7040023.
  • Tang Julian W. COVID-19: Interpreting scientific evidence—Uncertainty, confusion, and delays. BMC Infectious Diseases. 2020; 20:653. doi: 10.1186/s12879-020-05387-8.
  • Wirthwein Linda, Rost Detlef H. Giftedness and subjective well-being: A study with adults. Learning and Individual Differences. 2011; 21:182–86. doi: 10.1016/j.lindif.2011.01.001.

Critical Thinking and Decision-Making: What is Critical Thinking?

Lesson 1: What is Critical Thinking?

Critical thinking is a term that gets thrown around a lot. You've probably heard it used often throughout the years whether it was in school, at work, or in everyday conversation. But when you stop to think about it, what exactly is critical thinking and how do you do it ?

Watch the video below to learn more about critical thinking.

Simply put, critical thinking is the act of deliberately analyzing information so that you can make better judgements and decisions . It involves using things like logic, reasoning, and creativity, to draw conclusions and generally understand things better.

This may sound like a pretty broad definition, and that's because critical thinking is a broad skill that can be applied to so many different situations. You can use it to prepare for a job interview, manage your time better, make decisions about purchasing things, and so much more.

The process

As humans, we are constantly thinking . It's something we can't turn off. But not all of it is critical thinking. No one thinks critically 100% of the time... that would be pretty exhausting! Instead, it's an intentional process , something that we consciously use when we're presented with difficult problems or important decisions.

Improving your critical thinking

In order to become a better critical thinker, it's important to ask questions when you're presented with a problem or decision, before jumping to any conclusions. You can start with simple ones like What do I currently know? and How do I know this? These can help to give you a better idea of what you're working with and, in some cases, simplify more complex issues.  

Real-world applications

[Image: a smartphone showing an article with the headline "Study: Cats are better than dogs."]

Let's take a look at how we can use critical thinking to evaluate online information . Say a friend of yours posts a news article on social media and you're drawn to its headline. If you were to use your everyday automatic thinking, you might accept it as fact and move on. But if you were thinking critically, you would first analyze the available information and ask some questions :

  • What's the source of this article?
  • Is the headline potentially misleading?
  • What are my friend's general beliefs?
  • Do their beliefs inform why they might have shared this?

[Image: the article's source, "Super Cat Blog," and the phrase "According to a survey of cat owners" highlighted on the smartphone.]

After analyzing all of this information, you can draw a conclusion about whether or not you think the article is trustworthy.

Critical thinking has a wide range of real-world applications . It can help you to make better decisions, become more hireable, and generally better understand the world around you.

[Illustration: a lightbulb, a briefcase, and the world]


Cultivating Human Resiliency

7 Ways to Improve Critical Thinking and Challenge Brain Bias

In a world where we are bombarded with quick bites of information of questionable validity, the need for critical thinking is more important than ever. However, our brain is designed to make quick decisions with a limited amount of information through the processes of assimilation and generalization. This enables us to learn and adapt more quickly, but it can also result in brain bias. Cultivating resiliency toward this bias takes practice. Here are seven ways to improve critical thinking and challenge brain bias:

Look for alternative premises: Our brain is wired to interpret information through existing contexts, causing us to overly focus on information that supports our current beliefs. Counteracting this requires a deliberate shift in focus. If you believe something to be so, look for the evidence that it is not so. Actively seek information that disputes your conclusion.

Challenge cause and effect assumptions: We tend to assume that there is a causal relationship if one event precedes another. This can be an error in thinking. To challenge it, make it a habit to ask yourself, "What else might be the reason for this to happen?"

Step into someone else's shoes: When you find yourself judging another's behavior, try to imagine what it would be like to be in their place. This requires more than just thinking about how you would respond to the situation. It requires considering the thoughts and feelings involved, given that person's past experiences as well as the current circumstances. This not only increases your empathy; it also challenges your brain to broaden its perspective.

Expose yourself to opposing views: Give yourself a chance to interact and dialogue with people who hold opinions different from your own. Try to keep an open mind. Don't just focus on convincing them of what you believe. Listen closely to their rationales for what they believe. Healthy debate is a good way to grow and learn.

Consider the source: People tend to give undue weight to anything published or broadcast. We can also be easily swayed by numbers and statistics. With the internet and social media, it is easy to mistake questionable sources for legitimate ones. Even legitimate news agencies or research can and often do contain bias. Understand where your information comes from and ask yourself how the source's bias is influencing its argument. Better yet, get in the habit of checking information against more than one source, and don't limit yourself to sources that confirm your own bias.

Expand your cultural experiences: Make it a habit to read books by authors with different cultural backgrounds. See movies depicting characters from other cultures. Travel to different regions or countries and experience how others live. This opens your mind to ways of thinking and perceiving beyond your own limited experience.

Think outside the box: Don't get trapped into thinking there are only one or two ways to do things when there might be a wide array of choices. Learn how to creatively brainstorm solutions to problems.

Good critical thinking skills keep the brain resilient.   What other ways have you found to challenge brain bias?
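
For readers who want a concrete reminder, the following is a minimal, hypothetical Python sketch that bundles several of the prompts above (alternative premises, cause and effect, perspective taking, opposing views, and source checking) into a reusable pre-decision checklist. The wording and structure are assumptions made for illustration, not part of the original article or a validated debiasing protocol.

```python
# A minimal, hypothetical sketch: several of the prompts above bundled into a
# reusable pre-decision checklist. The wording and structure are illustrative
# assumptions, not a validated debiasing protocol.

BIAS_CHECK_PROMPTS = {
    "alternative premises": "What evidence would show that my current belief is wrong?",
    "cause and effect": "What else might be the reason this happened?",
    "someone else's shoes": "How might this look given the other person's experiences?",
    "opposing views": "Have I listened closely to someone who disagrees, and why do they?",
    "consider the source": "Who produced this information, what is their bias, and have I "
                           "checked at least one independent second source?",
}


def bias_check(decision: str) -> None:
    """Print each prompt so it can be worked through before committing to the decision."""
    print(f"Before deciding to {decision}:")
    for name, prompt in BIAS_CHECK_PROMPTS.items():
        print(f"  [{name}] {prompt}")


if __name__ == "__main__":
    bias_check("share an article that confirms what I already believe")
```

Working through the prompts takes only a minute or two, which is roughly the deliberate pause the article is arguing for.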



NPR defends its journalism after senior editor says it has lost the public's trust

By David Folkenflik

NPR is defending its journalism and integrity after a senior editor wrote an essay accusing it of losing the public's trust. (Saul Loeb/AFP via Getty Images)

NPR's top news executive defended its journalism and its commitment to reflecting a diverse array of views on Tuesday after a senior NPR editor wrote a broad critique of how the network has covered some of the most important stories of the age.

"An open-minded spirit no longer exists within NPR, and now, predictably, we don't have an audience that reflects America," writes Uri Berliner.

A strategic emphasis on diversity and inclusion on the basis of race, ethnicity and sexual orientation, promoted by NPR's former CEO, John Lansing, has fed "the absence of viewpoint diversity," Berliner writes.

NPR's chief news executive, Edith Chapin, wrote in a memo to staff Tuesday afternoon that she and the news leadership team strongly reject Berliner's assessment.

"We're proud to stand behind the exceptional work that our desks and shows do to cover a wide range of challenging stories," she wrote. "We believe that inclusion — among our staff, with our sourcing, and in our overall coverage — is critical to telling the nuanced stories of this country and our world."


She added, "None of our work is above scrutiny or critique. We must have vigorous discussions in the newsroom about how we serve the public as a whole."

A spokesperson for NPR said Chapin, who also serves as the network's chief content officer, would have no further comment.

Praised by NPR's critics

Berliner is a senior editor on NPR's Business Desk. (Disclosure: I, too, am part of the Business Desk, and Berliner has edited many of my past stories. He did not see any version of this article or participate in its preparation before it was posted publicly.)

Berliner's essay, titled "I've Been at NPR for 25 years. Here's How We Lost America's Trust," was published by The Free Press, a website that has welcomed journalists who have concluded that mainstream news outlets have become reflexively liberal.

Berliner writes that as a Subaru-driving, Sarah Lawrence College graduate who "was raised by a lesbian peace activist mother," he fits the mold of a loyal NPR fan.

Yet Berliner says NPR's news coverage has fallen short on some of the most controversial stories of recent years, from the question of whether former President Donald Trump colluded with Russia in the 2016 election, to the origins of the virus that causes COVID-19, to the significance and provenance of emails leaked from a laptop owned by Hunter Biden weeks before the 2020 election. In addition, he blasted NPR's coverage of the Israel-Hamas conflict.

On each of these stories, Berliner asserts, NPR has suffered from groupthink due to too little diversity of viewpoints in the newsroom.

The essay ricocheted Tuesday around conservative media, with some labeling Berliner a whistleblower. Others picked it up on social media, including Elon Musk, who has lambasted NPR for leaving his social media site, X. (Musk emailed another NPR reporter a link to Berliner's article with a gibe that the reporter was a "quisling" — a World War II reference to someone who collaborates with the enemy.)

When asked for further comment late Tuesday, Berliner declined, saying the essay spoke for itself.

The arguments he raises — and counters — have percolated across U.S. newsrooms in recent years. The #MeToo sexual harassment scandals of 2016 and 2017 forced newsrooms to listen to and heed more junior colleagues. The social justice movement prompted by the killing of George Floyd in 2020 inspired a reckoning in many places. Newsroom leaders often appeared to stand on shaky ground.

Leaders at many newsrooms, including top editors at The New York Times and the Los Angeles Times , lost their jobs. Legendary Washington Post Executive Editor Martin Baron wrote in his memoir that he feared his bonds with the staff were "frayed beyond repair," especially over the degree of self-expression his journalists expected to exert on social media, before he decided to step down in early 2021.

Since then, Baron and others — including leaders of some of these newsrooms — have suggested that the pendulum has swung too far.


New York Times publisher A.G. Sulzberger warned last year against journalists embracing a stance of what he calls "one-side-ism": "where journalists are demonstrating that they're on the side of the righteous."

"I really think that that can create blind spots and echo chambers," he said.

Internal arguments at The Times over the strength of its reporting on accusations that Hamas engaged in sexual assaults as part of a strategy for its Oct. 7 attack on Israel erupted publicly. The paper conducted an investigation to determine the source of a leak over a planned episode of the paper's podcast The Daily on the subject, which months later has not been released. The newsroom guild accused the paper of "targeted interrogation" of journalists of Middle Eastern descent.

Heated pushback in NPR's newsroom

Given Berliner's account of private conversations, several NPR journalists question whether they can now trust him with unguarded assessments about stories in real time. Others express frustration that he had not sought out comment in advance of publication. Berliner acknowledged to me that for this story, he did not seek NPR's approval to publish the piece, nor did he give the network advance notice.

Some of Berliner's NPR colleagues are responding heatedly. Fernando Alfonso, a senior supervising editor for digital news, wrote that he wholeheartedly rejected Berliner's critique of the coverage of the Israel-Hamas conflict, for which NPR's journalists, like their peers, periodically put themselves at risk.

Alfonso also took issue with Berliner's concern over the focus on diversity at NPR.

"As a person of color who has often worked in newsrooms with little to no people who look like me, the efforts NPR has made to diversify its workforce and its sources are unique and appropriate given the news industry's long-standing lack of diversity," Alfonso says. "These efforts should be celebrated and not denigrated as Uri has done."

After this story was first published, Berliner contested Alfonso's characterization, saying his criticism of NPR is about the lack of diversity of viewpoints, not its diversity itself.

"I never criticized NPR's priority of achieving a more diverse workforce in terms of race, ethnicity and sexual orientation. I have not 'denigrated' NPR's newsroom diversity goals," Berliner said. "That's wrong."

Questions of diversity

Under former CEO John Lansing, NPR made increasing diversity, both of its staff and its audience, its "North Star" mission. Berliner says in the essay that NPR failed to consider broader diversity of viewpoint, noting, "In D.C., where NPR is headquartered and many of us live, I found 87 registered Democrats working in editorial positions and zero Republicans."

Berliner cited audience estimates that suggested a concurrent falloff in listening by Republicans. (The number of people listening to NPR broadcasts and terrestrial radio broadly has declined since the start of the pandemic.)

Former NPR vice president for news and ombudsman Jeffrey Dvorkin tweeted, "I know Uri. He's not wrong."

Others questioned Berliner's logic. "This probably gets causality somewhat backward," tweeted Semafor Washington editor Jordan Weissmann. "I'd guess that a lot of NPR listeners who voted for [Mitt] Romney have changed how they identify politically."

Similarly, Nieman Lab founder Joshua Benton suggested the rise of Trump alienated many NPR-appreciating Republicans from the GOP.

In recent years, NPR has greatly enhanced the percentage of people of color in its workforce and its executive ranks. Four out of 10 staffers are people of color; nearly half of NPR's leadership team identifies as Black, Asian or Latino.

"The philosophy is: Do you want to serve all of America and make sure it sounds like all of America, or not?" Lansing, who stepped down last month, says in response to Berliner's piece. "I'd welcome the argument against that."

"On radio, we were really lagging in our representation of an audience that makes us look like what America looks like today," Lansing says. The U.S. looks and sounds a lot different than it did in 1971, when NPR's first show was broadcast, Lansing says.

A network spokesperson says new NPR CEO Katherine Maher supports Chapin and her response to Berliner's critique.

The spokesperson says that Maher "believes that it's a healthy thing for a public service newsroom to engage in rigorous consideration of the needs of our audiences, including where we serve our mission well and where we can serve it better."

Disclosure: This story was reported and written by NPR Media Correspondent David Folkenflik and edited by Deputy Business Editor Emily Kopp and Managing Editor Gerry Holmes. Under NPR's protocol for reporting on itself, no NPR corporate official or news executive reviewed this story before it was posted publicly.

Orlando Sentinel

Ask Amy: Is my girlfriend's lack of critical thinking a problem?

Plus: I can't believe what's expected of wedding guests these days.

By Amy Dickinson

Dear Amy: I need a gut check.

I’ve been with my girlfriend, “Stella,” for three years. We are in our late 20s.

Stella is great. She is gorgeous and loving and very nice. Everyone loves her. I do, too.

The problem I’m having is that she is extremely gullible. She believes whatever conspiracy nonsense has most recently floated through her social media feed. Most of this misinformation has to do with health-related issues, and because she follows and comments on it, she is fed more of it.

Her latest bit is that she believes that cellphones cause brain cancer.

She can believe whatever she wants, but now this is starting to interfere with my own life because she is trying to influence me.

I’m tired of this and thinking of breaking up with her, but this seems like a trivial reason to break up with someone who is so great in every other respect.

Can you weigh in?

Dear Bored: What a person thinks – and how a person thinks – is not a trivial matter. According to you, your girlfriend is also trying to control you.

Do you want to go through life having to defend your own rational choices?

Do you want to possibly have a family with someone whose views about health and wellness are so radically different from your own?

I sincerely doubt it.

Dear Amy: How should I react to some of the baffling requests for gifts and money when invited to wedding showers, weddings and baby showers?

I just received an invitation for my niece’s baby shower (her mom is my sister).

Request No. 1 was for a book instead of a card. OK, fine, but she is asking people to give this along with a gift.

She then said guests would be entered in a raffle if they would bring a package of diapers. This is in addition to the gift and the book.

She then said not to wrap the gift, and to have the gift sent directly to their home, so she could visit with her guests instead of opening these gifts in front of them (not, of course, because opening gifts and acknowledging the people giving them is tedious or schlepping the gifts home is annoying).

At a baby shower for a friend’s daughter, I felt I’d broken the rules when I gave a gift that was not on the registry. This was in addition to giving a wedding shower gift and a wedding gift to someone I barely know.

Am I just overly sensitive because I got married at the courthouse and don’t have kids?

Can I decline some of these events and send a not-so-extravagant gift?

Do I have to suck it up, even though I think this trend continues to bring out money grubbing expectations that have very little to do with connecting with others?

Dear Petty?: Remember this: Anyone can ask for anything. It’s a free country!

But receiving a request does not obligate you to do anything about it, except to politely RSVP to an invitation.

Back in the Stone Age, when I was an expectant mother, baby showers were held in someone's living room; gifts were opened in front of the guests, and a parade of tiny onesies was held up for everyone to appreciate. The guests were thanked and acknowledged at the time, and – if the expectant mother was savvy and polite – a note would be sent to each of the guests afterward.

My insight into modern baby showers comes from a few I've attended more recently, which were held in banquet halls and attended by dozens of women. Unwrapped gifts are placed on a table, and guests pick up their pre-printed "thank you" note on the way out of the venue.

(I do like the trend toward not wrapping gifts at these huge events, due to the waste.)

Registries can be extremely helpful (they tell you what the recipient wants or needs), but you are not obligated to buy a gift off of a registry.

Dear Amy: “Tricked in Illinois” believed her mother was manipulating her by asking for a ride to church.

My late mother wanted a ride to church on Christmas Eve, and I was too busy and selfish to notice.

He understood! I will always regret that day.

Dear Wanda : “Tricked in Illinois” described a history of manipulation, but I hope she will make her choice based on your perspective.

You can email Amy Dickinson at [email protected] or send a letter to Ask Amy, P.O. Box 194, Freeville, NY 13068. You can also follow her on Twitter @askingamy or Facebook.


