
Chapter 2. Research Design

Getting started.

When I teach undergraduates qualitative research methods, the final product of the course is a “research proposal” that enlists everything they have learned about qualitative research methods in an original design addressing a particular research question. I highly recommend you think about designing your own research study as you progress through this textbook. Even if you don’t have a study in mind yet, it can be a helpful exercise. But how to start? How can one design a research study before even knowing what research looks like? This chapter will serve as a brief overview of the research design process to orient you to what will be coming in later chapters. Think of it as a “skeleton” of what you will read in more detail in later chapters. Ideally, you will read this chapter both now (in sequence) and again during your reading of the remainder of the text. Do not worry if you have questions the first time you read it. Many things will become clearer as the text advances and as you gain a deeper understanding of all the components of good qualitative research. This is just a preliminary map to get you on the right road.


Research Design Steps

Before you even get started, you will need to have a broad topic of interest in mind. [1] In my experience, students can confuse this broad topic with the actual research question, so it is important to clearly distinguish the two. And the place to start is the broad topic. It might be, as was the case with me, working-class college students. But what about working-class college students? What’s it like to be one? Why are there so few compared to others? How do colleges assist (or fail to assist) them? What interested me was something I could barely articulate at first and went something like this: “Why was it so difficult and lonely to be me?” And by extension, “Did others share this experience?”

Once you have a general topic, reflect on why this is important to you. Sometimes we connect with a topic and we don’t really know why. Even if you are not willing to share the real underlying reason you are interested in a topic, it is important that you know the deeper reasons that motivate you. Otherwise, it is quite possible that at some point during the research, you will find yourself turned around facing the wrong direction. I have seen it happen many times. The reason is that the research question is not the same thing as the general topic of interest, and if you don’t know the reasons for your interest, you are likely to design a study answering a research question that is beside the point—to you, at least. And this means you will be much less motivated to carry your research to completion.

Researcher Note

Why do you employ qualitative research methods in your area of study? What are the advantages of qualitative research methods for studying mentorship?

Qualitative research methods are a huge opportunity to increase access, equity, inclusion, and social justice. Qualitative research allows us to engage and examine the uniquenesses/nuances within minoritized and dominant identities and our experiences with these identities. Qualitative research allows us to explore a specific topic, and through that exploration, we can link history to experiences and look for patterns or offer up a unique phenomenon. There’s such beauty in being able to tell a particular story, and qualitative research is a great mode for that! For our work, we examined the relationships we typically use the term mentorship for but didn’t feel that was quite the right word. Qualitative research allowed us to pick apart what we did and how we engaged in our relationships, which then allowed us to more accurately describe what was unique about our mentorship relationships, which we ultimately named liberationships (McAloney and Long 2021). Qualitative research gave us the means to explore, process, and name our experiences; what a powerful tool!

How do you come up with ideas for what to study (and how to study it)? Where did you get the idea for studying mentorship?

Coming up with ideas for research, for me, is kind of like Googling a question I have, not finding enough information, and then deciding to dig a little deeper to get the answer. The idea to study mentorship actually came up in conversation with my mentorship triad. We were talking in one of our meetings about our relationship—kind of meta, huh? We discussed how we felt that mentorship was not quite the right term for the relationships we had built. One of us asked what was different about our relationships and mentorship. This all happened when I was taking an ethnography course. During the next session of class, we were discussing auto- and duoethnography, and it hit me—let’s explore our version of mentorship, which we later went on to name liberationships (McAloney and Long 2021). The idea and questions came out of being curious and wanting to find an answer. As I continue to research, I see opportunities in questions I have about my work or during conversations that, in our search for answers, end up exposing gaps in the literature. If I can’t find the answer already out there, I can study it.

—Kim McAloney, PhD, College Student Services Administration Ecampus coordinator and instructor

When you have a better idea of why you are interested in what it is that interests you, you may be surprised to learn that the obvious approaches to the topic are not the only ones. For example, let’s say you think you are interested in preserving coastal wildlife. And as a social scientist, you are interested in policies and practices that affect the long-term viability of coastal wildlife, especially around fishing communities. It would be natural then to consider designing a research study around fishing communities and how they manage their ecosystems. But when you really think about it, you realize that what interests you the most is how people whose livelihoods depend on a particular resource act in ways that deplete that resource. Or, even deeper, you contemplate the puzzle, “How do people justify actions that damage their surroundings?” Now, there are many ways to design a study that gets at that broader question, and not all of them are about fishing communities, although that is certainly one way to go. Maybe you could design an interview-based study that includes and compares loggers, fishers, and desert golfers (those who golf in arid lands that require a great deal of wasteful irrigation). Or design a case study around one particular example where resources were completely used up by a community. Without knowing what it is you are really interested in, what motivates your interest in a surface phenomenon, you are unlikely to come up with the appropriate research design.

These first stages of research design are often the most difficult, but have patience. Taking the time to consider why you are going to go through a lot of trouble to get answers will prevent a lot of wasted energy in the future.

There are distinct reasons for pursuing particular research questions, and it is helpful to distinguish between them.  First, you may be personally motivated.  This is probably the most important and the most often overlooked.   What is it about the social world that sparks your curiosity? What bothers you? What answers do you need in order to keep living? For me, I knew I needed to get a handle on what higher education was for before I kept going at it. I needed to understand why I felt so different from my peers and whether this whole “higher education” thing was “for the likes of me” before I could complete my degree. That is the personal motivation question. Your personal motivation might also be political in nature, in that you want to change the world in a particular way. It’s all right to acknowledge this. In fact, it is better to acknowledge it than to hide it.

There are also academic and professional motivations for a particular study.  If you are an absolute beginner, these may be difficult to find. We’ll talk more about this when we discuss reviewing the literature. Simply put, you are probably not the only person in the world to have thought about this question or issue and those related to it. So how does your interest area fit into what others have studied? Perhaps there is a good study out there of fishing communities, but no one has quite asked the “justification” question. You are motivated to address this to “fill the gap” in our collective knowledge. And maybe you are really not at all sure of what interests you, but you do know that [insert your topic] interests a lot of people, so you would like to work in this area too. You want to be involved in the academic conversation. That is a professional motivation and a very important one to articulate.

Practical and strategic motivations are a third kind. Perhaps you want to encourage people to take better care of the natural resources around them. If this is also part of your motivation, you will want to design your research project in a way that might have an impact on how people behave in the future. There are many ways to do this, one of which is using qualitative research methods rather than quantitative research methods, as the findings of qualitative research are often easier to communicate to a broader audience than the results of quantitative research. You might even be able to engage the community you are studying in the collecting and analyzing of data, something taboo in quantitative research but actively embraced and encouraged by qualitative researchers. But there are other practical reasons, such as getting “done” with your research in a certain amount of time or having access (or no access) to certain information. There is nothing wrong with considering constraints and opportunities when designing your study. Or maybe one of the practical or strategic goals is about learning competence in this area so that you can demonstrate the ability to conduct interviews and focus groups with future employers. Keeping that in mind will help shape your study and prevent you from getting sidetracked using a technique that you are less invested in learning about.

STOP HERE for a moment

I recommend you write a paragraph (at least) explaining your aims and goals. Include a sentence about each of the following: personal/political goals, professional/academic goals, and practical/strategic goals. Think through how all of the goals are related and can be achieved by this particular research study. If they can’t, have a rethink. Perhaps this is not the best way to go about it.

You will also want to be clear about the purpose of your study. “Wait, didn’t we just do this?” you might ask. No! Your goals are not the same as the purpose of the study, although they are related. You can think about purpose lying on a continuum from “theory” to “action” (figure 2.1). Sometimes you are doing research to discover new knowledge about the world, while other times you are doing a study because you want to measure an impact or make a difference in the world.

Figure 2.1. Purpose types: Basic Research, Applied Research, Summative Evaluation, Formative Evaluation, Action Research

Basic research involves research that is done for the sake of “pure” knowledge—that is, knowledge that, at least at this moment in time, may not have any apparent use or application. Often, and this is very important, knowledge of this kind is later found to be extremely helpful in solving problems. So one way of thinking about basic research is that it is knowledge for which no use is yet known but will probably one day prove to be extremely useful. If you are doing basic research, you do not need to argue its usefulness, as the whole point is that we just don’t know yet what this might be.

Researchers engaged in basic research want to understand how the world operates. They are interested in investigating a phenomenon to get at the nature of reality with regard to that phenomenon. The basic researcher’s purpose is to understand and explain (Patton 2002:215).

Basic research is interested in generating and testing hypotheses about how the world works. Grounded Theory is one approach to qualitative research methods that exemplifies basic research (see chapter 4). Most academic journal articles publish basic research findings. If you are working in academia (e.g., writing your dissertation), the default expectation is that you are conducting basic research.

Applied research in the social sciences is research that addresses human and social problems. Unlike basic research, the researcher has expectations that the research will help contribute to resolving a problem, if only by identifying its contours, history, or context. From my experience, most students have this as their baseline assumption about research. Why do a study if not to make things better? But this is a common mistake. Students and their committee members are often working with default assumptions here—the former thinking about applied research as their purpose, the latter thinking about basic research: “The purpose of applied research is to contribute knowledge that will help people to understand the nature of a problem in order to intervene, thereby allowing human beings to more effectively control their environment. While in basic research the source of questions is the tradition within a scholarly discipline, in applied research the source of questions is in the problems and concerns experienced by people and by policymakers” (Patton 2002:217).

Applied research is less geared toward theory in two ways. First, its questions do not derive from previous literature. For this reason, applied research studies have much more limited literature reviews than those found in basic research (although they make up for this by having much more “background” about the problem). Second, it does not generate theory in the same way as basic research does. The findings of an applied research project may not be generalizable beyond the boundaries of this particular problem or context. The findings are more limited. They are useful now but may be less useful later. This is why basic research remains the default “gold standard” of academic research.

Evaluation research is research that is designed to evaluate or test the effectiveness of specific solutions and programs addressing specific social problems. We already know the problems, and someone has already come up with solutions. There might be a program, say, for first-generation college students on your campus. Does this program work? Are first-generation students who participate in the program more likely to graduate than those who do not? These are the types of questions addressed by evaluation research. There are two types of research within this broader frame, one more action-oriented than the other. In summative evaluation, an overall judgment about the effectiveness of a program or policy is made. Should we continue our first-gen program? Is it a good model for other campuses? Because the purpose of such summative evaluation is to measure success and to determine whether this success is scalable (capable of being generalized beyond the specific case), quantitative data is more often used than qualitative data. In our example, we might have “outcomes” data for thousands of students, and we might run various tests to determine if the better outcomes of those in the program are statistically significant so that we can generalize the findings and recommend similar programs elsewhere. Qualitative data in the form of focus groups or interviews can then be used for illustrative purposes, providing more depth to the quantitative analyses. In contrast, formative evaluation attempts to improve a program or policy (to help “form” or shape its effectiveness). Formative evaluations rely more heavily on qualitative data—case studies, interviews, focus groups. The findings are meant not to generalize beyond the particular but to improve this program. If you are a student seeking to improve your qualitative research skills and you do not care about generating basic research, formative evaluation studies might be an attractive option for you to pursue, as there are always local programs that need evaluation and suggestions for improvement. Again, be very clear about your purpose when talking through your research proposal with your committee.
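To make the quantitative side of a summative evaluation concrete, here is a minimal sketch in Python of the kind of significance test an evaluator might run on graduation outcomes. The counts, labels, and threshold below are invented for illustration; a real evaluation would use actual outcomes data and a test suited to its design.

    # Hypothetical summative-evaluation significance test on graduation outcomes.
    # All counts are invented for illustration.
    from scipy.stats import chi2_contingency

    # Rows: program participants vs. non-participants
    # Columns: graduated vs. did not graduate
    outcomes = [
        [412, 88],   # first-gen students who participated in the program
        [517, 183],  # first-gen students who did not participate
    ]

    chi2, p_value, dof, expected = chi2_contingency(outcomes)
    print(f"chi-square = {chi2:.2f}, p = {p_value:.4f}")
    if p_value < 0.05:
        print("The difference in graduation rates is statistically significant.")
    else:
        print("No statistically significant difference detected.")

Even if such a test came out significant, the evaluator would still lean on qualitative data (focus groups, interviews) to illustrate and add depth to the finding, as described above.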

Action research takes a further step beyond evaluation, even formative evaluation, to being part of the solution itself. This is about as far from basic research as one could get and definitely falls beyond the scope of “science,” as conventionally defined. The distinction between action and research is blurry, the research methods are often in constant flux, and the only “findings” are specific to the problem or case at hand and often are findings about the process of intervention itself. Rather than evaluate a program as a whole, action research often seeks to change and improve some particular aspect that may not be working—maybe there is not enough diversity in an organization or maybe women’s voices are muted during meetings and the organization wonders why and would like to change this. In a further step, participatory action research, those women would become part of the research team, attempting to amplify their voices in the organization through participation in the action research. As action research employs methods that involve people in the process, focus groups are quite common.

If you are working on a thesis or dissertation, chances are your committee will expect you to be contributing to fundamental knowledge and theory (basic research). If your interests lie more toward the action end of the continuum, however, it is helpful to talk to your committee about this before you get started. Knowing your purpose in advance will help avoid misunderstandings during the later stages of the research process!

The Research Question

Once you have written your paragraph and clarified your purpose and truly know that this study is the best study for you to be doing right now, you are ready to write and refine your actual research question. Know that research questions are often moving targets in qualitative research, that they can be refined up to the very end of data collection and analysis. But you do have to have a working research question at all stages. This is your “anchor” when you get lost in the data. What are you addressing? What are you looking at and why? Your research question guides you through the thicket. It is common to have a whole host of questions about a phenomenon or case, both at the outset and throughout the study, but you should be able to pare them down to no more than two or three sentences when asked. These sentences should both clarify the intent of the research and explain why this is an important question to answer. More on refining your research question can be found in chapter 4.

Chances are, you will have already done some prior reading before coming up with your interest and your questions, but you may not have conducted a systematic literature review. This is the next crucial stage to be completed before venturing further. You don’t want to start collecting data and then realize that someone has already beaten you to the punch. A review of the literature that is already out there will let you know (1) if others have already done the study you are envisioning; (2) if others have done similar studies, which can help you out; and (3) what ideas or concepts are out there that can help you frame your study and make sense of your findings. More on literature reviews can be found in chapter 9.

In addition to reviewing the literature for similar studies to what you are proposing, it can be extremely helpful to find a study that inspires you. This may have absolutely nothing to do with the topic you are interested in but is written so beautifully or organized so interestingly or otherwise speaks to you in such a way that you want to post it somewhere to remind you of what you want to be doing. You might not understand this in the early stages—why would you find a study that has nothing to do with the one you are doing helpful? But trust me, when you are deep into analysis and writing, having an inspirational model in view can help you push through. If you are motivated to do something that might change the world, you probably have read something somewhere that inspired you. Go back to that original inspiration and read it carefully and see how they managed to convey the passion that you so appreciate.

At this stage, you are still just getting started. There are a lot of things to do before setting forth to collect data! You’ll want to consider and choose a research tradition and a set of data-collection techniques that both help you answer your research question and match all your aims and goals. For example, if you really want to help migrant workers speak for themselves, you might draw on feminist theory and participatory action research models. Chapters 3 and 4 will provide you with more information on epistemologies and approaches.

Next, you have to clarify your “units of analysis.” What is the level at which you are focusing your study? Often, the unit in qualitative research methods is individual people, or “human subjects.” But your units of analysis could just as well be organizations (colleges, hospitals) or programs or even whole nations. Think about what it is you want to be saying at the end of your study—are the insights you are hoping to make about people or about organizations or about something else entirely? A unit of analysis can even be a historical period! Every unit of analysis will call for a different kind of data collection and analysis and will produce different kinds of “findings” at the conclusion of your study. [2]

Regardless of what unit of analysis you select, you will probably have to consider the “human subjects” involved in your research. [3] Who are they? What interactions will you have with them—that is, what kind of data will you be collecting? Before answering these questions, define your population of interest and your research setting. Use your research question to help guide you.

Let’s use an example from a real study. In Geographies of Campus Inequality, Benson and Lee (2020) list three related research questions: “(1) What are the different ways that first-generation students organize their social, extracurricular, and academic activities at selective and highly selective colleges? (2) how do first-generation students sort themselves and get sorted into these different types of campus lives; and (3) how do these different patterns of campus engagement prepare first-generation students for their post-college lives?” (3).

Note that we are jumping into this a bit late, after Benson and Lee have described previous studies (the literature review) and what is known about first-generation college students and what is not known. They want to know about differences within this group, and they are interested in those attending certain kinds of colleges because those colleges will be sites where academic and extracurricular pressures compete. That is the context for their three related research questions. What is the population of interest here? First-generation college students. What is the research setting? Selective and highly selective colleges. But a host of questions remain. Which students in the real world, and which colleges? What about gender, race, and other identity markers? Will the students be asked questions? Are the students still in college, or will they be asked about what college was like for them? Will they be observed? Will they be shadowed? Will they be surveyed? Will they be asked to keep diaries of their time in college? How many students? How many colleges? For how long will they be observed?

Recommendation

Take a moment and write down suggestions for Benson and Lee before continuing on to what they actually did.

Have you written down your own suggestions? Good. Now let’s compare those with what they actually did. Benson and Lee drew on two sources of data: in-depth interviews with sixty-four first-generation students and survey data from a preexisting national survey of students at twenty-eight selective colleges. Let’s ignore the survey for our purposes here and focus on those interviews. The interviews were conducted between 2014 and 2016 at a single selective college, “Hilltop” (a pseudonym). They employed a “purposive” sampling strategy to ensure an equal number of male-identifying and female-identifying students as well as equal numbers of White, Black, and Latinx students. Each student was interviewed once. Hilltop is a selective liberal arts college in the northeast that enrolls about three thousand students.

How did your suggestions match up to those actually used by the researchers in this study? Is it possible your suggestions were too ambitious? Beginning qualitative researchers often make that mistake. You want a research design that is both effective (it matches your question and goals) and doable. You will never be able to collect data from your entire population of interest (unless your research question is so narrow as to be relevant to very few people!), so you will need to come up with a good sample. Define the criteria for this sample, as Benson and Lee did when deciding to interview an equal number of students by gender and race categories. Define the criteria for your sample setting too. Hilltop is typical for selective colleges. That was a research choice made by Benson and Lee. For more on sampling and sampling choices, see chapter 5.
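To see what sampling criteria can look like when operationalized, below is a small sketch in Python of how a researcher might track recruitment against a purposive quota along the lines of the one Benson and Lee used. The category labels and the target of roughly ten interviews per cell are assumptions made here for illustration, not details reported in their study.

    # Hypothetical sketch: tracking a purposive sampling quota during recruitment.
    # Categories and targets are invented for illustration.
    from collections import Counter

    GENDERS = {"male-identifying", "female-identifying"}
    RACES = {"White", "Black", "Latinx"}
    TARGET_PER_CELL = 10  # roughly 60 interviews across the six gender-by-race cells

    recruited = Counter()  # completed interviews per (gender, race) cell

    def can_enroll(gender, race):
        """True if this volunteer fits the criteria and fills a cell still under quota."""
        if gender not in GENDERS or race not in RACES:
            return False  # outside the defined sampling criteria
        return recruited[(gender, race)] < TARGET_PER_CELL

    # Screening a volunteer during recruitment:
    if can_enroll("female-identifying", "Black"):
        recruited[("female-identifying", "Black")] += 1

    print(recruited)

The point of the sketch is simply that sampling criteria are decided in advance and enforced during recruitment, rather than left to whoever happens to volunteer.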

Benson and Lee chose to employ interviews. If you also would like to include interviews, you have to think about what will be asked in them. Most interview-based research involves an interview guide, a set of questions or question areas that will be asked of each participant. The research question helps you create a relevant interview guide. You want to ask questions whose answers will provide insight into your research question. Again, your research question is the anchor you will continually come back to as you plan for and conduct your study. It may be that once you begin interviewing, you find that people are telling you something totally unexpected, and this makes you rethink your research question. That is fine. Then you have a new anchor. But you always have an anchor. More on interviewing can be found in chapter 11.

Let’s imagine Benson and Lee also observed college students as they went about doing the things college students do, both in the classroom and in the clubs and social activities in which they participate. They would have needed a plan for this. Would they sit in on classes? Which ones and how many? Would they attend club meetings and sports events? Which ones and how many? Would they participate themselves? How would they record their observations? More on observation techniques can be found in both chapters 13 and 14.

At this point, the design is almost complete. You know why you are doing this study, you have a clear research question to guide you, you have identified your population of interest and research setting, and you have a reasonable sample of each. You also have put together a plan for data collection, which might include drafting an interview guide or making plans for observations. And so you know exactly what you will be doing for the next several months (or years!). To put the project into action, there are a few more things necessary before actually going into the field.

First, you will need to make sure you have any necessary supplies, including recording technology. These days, many researchers use their phones to record interviews. Second, you will need to draft a few documents for your participants. These include informed consent forms and recruiting materials, such as posters or email texts, that explain what this study is in clear language. Third, you will draft a research protocol to submit to your institutional review board (IRB); this research protocol will include the interview guide (if you are using one), the consent form template, and all examples of recruiting material. Depending on your institution and the details of your study design, it may take weeks or even, in some unfortunate cases, months before you secure IRB approval. Make sure you plan on this time in your project timeline. While you wait, you can continue to review the literature and possibly begin drafting a section on the literature review for your eventual presentation/publication. More on IRB procedures can be found in chapter 8 and more general ethical considerations in chapter 7.

Once you have approval, you can begin!

Research Design Checklist

Before data collection begins, do the following:

  • Write a paragraph explaining your aims and goals (personal/political, practical/strategic, professional/academic).
  • Define your research question; write two to three sentences that clarify the intent of the research and why this is an important question to answer.
  • Review the literature for similar studies that address your research question or similar research questions; think laterally about some literature that might be helpful or illuminating but is not exactly about the same topic.
  • Find a written study that inspires you—it may or may not be on the research question you have chosen.
  • Consider and choose a research tradition and set of data-collection techniques that (1) help answer your research question and (2) match your aims and goals.
  • Define your population of interest and your research setting.
  • Define the criteria for your sample (How many? Why these? How will you find them, gain access, and acquire consent?).
  • If you are conducting interviews, draft an interview guide.
  •  If you are making observations, create a plan for observations (sites, times, recording, access).
  • Acquire any necessary technology (recording devices/software).
  • Draft consent forms that clearly identify the research focus and selection process.
  • Create recruiting materials (posters, email, texts).
  • Apply for IRB approval (proposal plus consent form plus recruiting materials).
  • Block out time for collecting data.
Notes

1. At the end of the chapter, you will find a “Research Design Checklist” that summarizes the main recommendations made here.
2. For example, if your focus is society and culture, you might collect data through observation or a case study. If your focus is individual lived experience, you are probably going to be interviewing some people. And if your focus is language and communication, you will probably be analyzing text (written or visual) (Marshall and Rossman 2016:16).
3. You may not have any “live” human subjects. There are qualitative research methods that do not require interactions with live human beings (see chapter 16, “Archival and Historical Sources”). But for the most part, you are probably reading this textbook because you are interested in doing research with people. The rest of the chapter will assume this is the case.

Glossary

Ethnography: One of the primary methodological traditions of inquiry in qualitative research, ethnography is the study of a group or group culture, largely through observational fieldwork supplemented by interviews. It is a form of fieldwork that may include participant-observation data collection. See chapter 14 for a discussion of deep ethnography.

Case study: A methodological tradition of inquiry and research design that focuses on an individual case (e.g., setting, institution, or sometimes an individual) in order to explore its complexity, history, and interactive parts. As an approach, it is particularly useful for obtaining a deep appreciation of an issue, event, or phenomenon of interest in its particular context.

Purpose: The controlling force in research; can be understood as lying on a continuum from basic research (knowledge production) to action research (effecting change).

Theory: In its most basic sense, a theory is a story we tell about how the world works that can be tested with empirical evidence. In qualitative research, we use the term in a variety of ways, many of which are different from how it is used by quantitative researchers. Although some qualitative research can be described as “testing theory,” it is more common to “build theory” from the data using inductive reasoning, as done in Grounded Theory. There are so-called “grand theories” that seek to integrate a whole series of findings and stories into an overarching paradigm about how the world works, and much smaller theories or concepts about particular processes and relationships. Theory can even be used to explain particular methodological perspectives or approaches, as in Institutional Ethnography, which is both a way of doing research and a theory about how the world works.

Basic research: Research that is interested in generating and testing hypotheses about how the world works.

Grounded Theory: A methodological tradition of inquiry and approach to analyzing qualitative data in which theories emerge from a rigorous and systematic process of induction. This approach was pioneered by the sociologists Glaser and Strauss (1967). The elements of theory generated from comparative analysis of data are, first, conceptual categories and their properties and, second, hypotheses or generalized relations among the categories and their properties: “The constant comparing of many groups draws the [researcher’s] attention to their many similarities and differences. Considering these leads [the researcher] to generate abstract categories and their properties, which, since they emerge from the data, will clearly be important to a theory explaining the kind of behavior under observation” (36).

Qualitative research: An approach to research that is “multimethod in focus, involving an interpretative, naturalistic approach to its subject matter. This means that qualitative researchers study things in their natural settings, attempting to make sense of, or interpret, phenomena in terms of the meanings people bring to them. Qualitative research involves the studied use and collection of a variety of empirical materials – case study, personal experience, introspective, life story, interview, observational, historical, interactional, and visual texts – that describe routine and problematic moments and meanings in individuals’ lives” (Denzin and Lincoln 2005:2). Contrast with quantitative research.

Applied research: Research that contributes knowledge that will help people to understand the nature of a problem in order to intervene, thereby allowing human beings to more effectively control their environment.

Evaluation research: Research that is designed to evaluate or test the effectiveness of specific solutions and programs addressing specific social problems. There are two kinds: summative and formative.

Summative evaluation research: Research in which an overall judgment about the effectiveness of a program or policy is made, often for the purpose of generalizing to other cases or programs. Generally uses qualitative research as a supplement to primary quantitative data analyses. Contrast formative evaluation research.

Formative evaluation research: Research designed to improve a program or policy (to help “form” or shape its effectiveness); relies heavily on qualitative research methods. Contrast summative evaluation research.

Action research: Research carried out at a particular organizational or community site with the intention of effecting change; often involves research subjects as participants of the study. See also participatory action research.

Participatory action research: Research in which both researchers and participants work together to understand a problematic situation and change it for the better.

Unit of analysis: The level of the focus of analysis (e.g., individual people, organizations, programs, neighborhoods).

Population: The large group of interest to the researcher. Although it will likely be impossible to design a study that incorporates or reaches all members of the population of interest, this should be clearly defined at the outset of a study so that a reasonable sample of the population can be taken. For example, if one is studying working-class college students, the sample may include twenty such students attending a particular college, while the population is “working-class college students.” In quantitative research, clearly defining the general population of interest is a necessary step in generalizing results from a sample. In qualitative research, defining the population is conceptually important for clarity.

Pseudonym: A fictional name assigned to give anonymity to a person, group, or place. Pseudonyms are important ways of protecting the identity of research participants while still providing a “human element” in the presentation of qualitative data. There are ethical considerations to be made in selecting pseudonyms; some researchers allow research participants to choose their own.

Informed consent: A requirement for research involving human participants; the documentation of informed consent. In some cases, oral consent or assent may be sufficient, but the default standard is a single-page, easy-to-understand form that both the researcher and the participant sign and date. Under federal guidelines, all researchers “shall seek such consent only under circumstances that provide the prospective subject or the representative sufficient opportunity to consider whether or not to participate and that minimize the possibility of coercion or undue influence. The information that is given to the subject or the representative shall be in language understandable to the subject or the representative. No informed consent, whether oral or written, may include any exculpatory language through which the subject or the representative is made to waive or appear to waive any of the subject’s rights or releases or appears to release the investigator, the sponsor, the institution, or its agents from liability for negligence” (21 CFR 50.20). Your IRB office will be able to provide a template for use in your study.

Institutional review board (IRB): An administrative body established to protect the rights and welfare of human research subjects recruited to participate in research activities conducted under the auspices of the institution with which it is affiliated. The IRB is charged with the responsibility of reviewing all research involving human participants. The IRB is concerned with protecting the welfare, rights, and privacy of human subjects. The IRB has the authority to approve, disapprove, monitor, and require modifications in all research activities that fall within its jurisdiction as specified by both the federal regulations and institutional policy.

Introduction to Qualitative Research Methods Copyright © 2023 by Allison Hurst is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License, except where otherwise noted.



Qualitative Research Design


What is qualitative research design?

Qualitative research is a type of research that explores and provides deeper insights into real-world problems. Instead of collecting numerical data points or introducing interventions or treatments, as in quantitative research, qualitative research helps generate hypotheses and further investigate and understand quantitative data. Qualitative research gathers participants’ experiences, perceptions, and behavior. It answers the hows and whys instead of how many or how much. It could be structured as a stand-alone study, purely relying on qualitative data, or it could be part of mixed-methods research that combines qualitative and quantitative data.

Qualitative research involves collecting and analyzing non-numerical data (e.g., text, video, or audio) to understand concepts, opinions, or experiences. It can be used to gather in-depth insights into a problem or generate new ideas for research. In this respect it contrasts with quantitative research, which involves collecting and analyzing numerical data for statistical analysis. Qualitative research is commonly used in the humanities and social sciences, in subjects such as anthropology, sociology, education, health sciences, and history.

While qualitative and quantitative approaches are different, they are not necessarily opposites, and they are certainly not mutually exclusive. For instance, qualitative research can help expand and deepen understanding of data or results obtained from quantitative analysis. Say a quantitative analysis has determined that there is a correlation between length of stay and level of patient satisfaction; qualitative research could then explore why this correlation exists. This dual-focus scenario shows one way in which qualitative and quantitative research can be integrated.

Research Paradigms 

  • Positivist versus Post-Positivist
  • Social Constructivist (the paradigm from which most qualitative studies emerge)


Qualitative Research Methods

What are qualitative research methods?

Qualitative research adopts numerous methods or techniques, including interviews, focus groups, and observation. Interviews may be unstructured, with open-ended questions on a topic to which the interviewer adapts in response, or structured, with a predetermined set of questions that every participant is asked. Interviewing is usually one-on-one and is appropriate for sensitive topics or topics needing in-depth exploration. Focus groups are often held with 8-12 target participants and are used when group dynamics and collective views on a topic are desired. Researchers can be participant observers, sharing the experiences of the subjects, or non-participant, detached observers.

What constitutes a good research question? Does the question drive research design choices?

According to Doody and Bailey (2014):

  • A good research question can only be developed by consulting relevant literature, colleagues, and supervisors experienced in the area of research (inductive interactions).
  • It helps to have a directed research aim and objective.
  • Researchers should not chase “research trends” without sufficient evidence; this is why research objectives are important, and why time and resources should be taken into consideration.

Research questions can be developed from theoretical knowledge, previous research or experience, or a practical need at work (Parahoo 2014). They have numerous roles, such as identifying the importance of the research and providing clarity of purpose for the research, in terms of what the research intends to achieve in the end.

Qualitative Research Questions

What constitutes a good qualitative research question?

A good qualitative research question answers the hows and whys instead of how many or how much, and it invites participants’ experiences, perceptions, and behavior. It may anchor a stand-alone study relying purely on qualitative data, or it may form part of mixed-methods research that combines qualitative and quantitative data.

Examples of good Qualitative Research Questions:

What are people's thoughts on the new library? 

How does it feel to be a first-generation student attending college?

An example of the difference between quantitative and qualitative research questions:

How many college students signed up for the new semester? (Quan) 

How do college students feel about the new semester? What are their experiences so far? (Qual)





Qualitative Design Research Methods

Michael Domínguez, San Diego State University
https://doi.org/10.1093/acrefore/9780190264093.013.170
Published online: 19 December 2017

Emerging in the learning sciences field in the early 1990s, qualitative design-based research (DBR) is a relatively new methodological approach to social science and education research. As its name implies, DBR is focused on the design of educational innovations, and the testing of these innovations in the complex and interconnected venue of naturalistic settings. As such, DBR is an explicitly interventionist approach to conducting research, situating the researcher as a part of the complex ecology in which learning and educational innovation takes place.

With this in mind, DBR is distinct from more traditional methodologies, including laboratory experiments, ethnographic research, and large-scale implementation. Rather, the goal of DBR is not to prove the merits of any particular intervention, or to reflect passively on a context in which learning occurs, but to examine the practical application of theories of learning themselves in specific, situated contexts. By designing purposeful, naturalistic, and sustainable educational ecologies, researchers can test, extend, or modify their theories and innovations based on their pragmatic viability. This process offers the prospect of generating theory-developing, contextualized knowledge claims that can complement the claims produced by other forms of research.

Because of this interventionist, naturalistic stance, DBR has also been the subject of ongoing debate concerning the rigor of its methodology. In many ways, these debates obscure the varied ways DBR has been practiced, the varied types of questions being asked, and the theoretical breadth of researchers who practice DBR. With this in mind, DBR research may involve a diverse range of methods as researchers from a variety of intellectual traditions within the learning sciences and education research design pragmatic innovations based on their theories of learning, and document these complex ecologies using the methodologies and tools most applicable to their questions, focuses, and academic communities.

DBR has gained increasing interest in recent years. While it remains a popular methodology for developmental and cognitive learning scientists seeking to explore theory in naturalistic settings, it has also grown in importance to cultural psychology and cultural studies researchers as a methodological approach that aligns in important ways with the participatory commitments of liberatory research. As such, internal tension within the DBR field has also emerged. Yet, though approaches vary, and have distinct genealogies and commitments, DBR might be seen as the broad methodological genre in which Change Laboratory, design-based implementation research (DBIR), social design-based experiments (SDBE), participatory design research (PDR), and research-practice partnerships might be categorized. These critically oriented iterations of DBR have important implications for educational research and educational innovation in historically marginalized settings and the Global South.

  • design-based research
  • learning sciences
  • social-design experiment
  • qualitative research
  • research methods

Educational research, perhaps more than many other disciplines, is a situated field of study. Learning happens around us every day, at all times, in both formal and informal settings. Our worlds are replete with complex, dynamic, diverse communities, contexts, and institutions, many of which are actively seeking guidance and support in the endless quest for educational innovation. Educational researchers—as a source of potential expertise—are necessarily implicated in this complexity, linked to the communities and institutions through their very presence in spaces of learning, poised to contribute with possible solutions, yet often positioned as separate from the activities they observe, creating dilemmas of responsibility and engagement.

So what are educational scholars and researchers to do? These tensions invite a unique methodological challenge for the contextually invested researcher, begging them to not just produce knowledge about learning, but to participate in the ecology, collaborating on innovations in the complex contexts in which learning is taking place. In short, for many educational researchers, our backgrounds as educators, our connections to community partners, and our sociopolitical commitments to the process of educational innovation push us to ensure that our work is generative, and that our theories and ideas—our expertise—about learning and education are made pragmatic, actionable, and sustainable. We want to test what we know outside of laboratories, designing, supporting, and guiding educational innovation to see if our theories of learning are accurate, and useful to the challenges faced in schools and communities where learning is messy, collaborative, and contested. Through such a process, we learn, and can modify our theories to better serve the real needs of communities. It is from this impulse that qualitative design-based research (DBR) emerged as a new methodological paradigm for education research.

Qualitative design-based research will be examined, documenting its origins, the major tenets of the genre, implementation considerations, and methodological issues, as well as variance within the paradigm. As a relatively new methodology, much tension remains in what constitutes DBR, and what design should mean, and for whom. These tensions and questions, as well as broad perspectives and emergent iterations of the methodology, will be discussed, and considerations for researchers looking toward the future of this paradigm will be considered.

The Origins of Design-Based Research

Qualitative design-based research (DBR) first emerged in the learning sciences field among a group of scholars in the early 1990s, with the first articulation of DBR as a distinct methodological construct appearing in the work of Ann Brown (1992) and Allan Collins (1992). For learning scientists in the 1970s and 1980s, the traditional methodologies of laboratory experiments, ethnographies, and large-scale educational interventions were the only methods available. During these decades, a growing community of learning science and educational researchers (e.g., Bereiter & Scardamalia, 1989; Brown, Campione, Webber, & McGilley, 1992; Cobb & Steffe, 1983; Cole, 1995; Scardamalia & Bereiter, 1991; Schoenfeld, 1982, 1985; Scribner & Cole, 1978) interested in educational innovation and classroom interventions in situated contexts began to find the prevailing methodologies insufficient for the types of learning they wished to document, the roles they wished to play in research, and the kinds of knowledge claims they wished to explore. The laboratory, or laboratory-like settings, where research on learning was at the time happening, was divorced from the complexity of real life, and necessarily limiting. Alternatively, most ethnographic research, while more attuned to capturing these complexities and dynamics, regularly assumed a passive stance and avoided interceding in the learning process, or allowing researchers to see what possibility for innovation existed from enacting nascent learning theories. Finally, large-scale interventions could test innovations in practice but lost sight of the nuance of development and implementation in local contexts (Brown, 1992; Collins, Joseph, & Bielaczyc, 2004).

Dissatisfied with these options, and recognizing that new methods were required to study and understand learning in the messiness of socially, culturally, and historically situated settings, Brown (1992) proposed an alternative: Why not involve ourselves in the messiness of the process, taking an active, grounded role in disseminating our theories and expertise by becoming designers and implementers of educational innovations? Rather than observing from afar, DBR researchers could trace their own iterative processes of design, implementation, tinkering, redesign, and evaluation as they unfolded in shared work with teachers, students, learners, and other partners in lived contexts. This premise, initially articulated as “design experiments” (Brown, 1992), would be variously discussed over the next decade as “design research” (Edelson, 2002), “developmental research” (Gravemeijer, 1994), and “design-based research” (Design-Based Research Collective, 2003), all of which reflect the original, interventionist, design-oriented concept. The latter term, “design-based research” (DBR), is used here, recognizing this as the prevailing terminology used to refer to this research approach at present. 2

Regardless of the evolving moniker, the prospects of such a methodology were extremely attractive to researchers. Learning scientists acutely aware of various aspects of situated context, and interested in studying the applied outcomes of learning theories—a task of inquiry into situated learning for which canonical methods were rather insufficient—found DBR a welcome development (Bell, 2004 ). As Barab and Squire ( 2004 ) explain: “learning scientists . . . found that they must develop technological tools, curriculum, and especially theories that help them systematically understand and predict how learning occurs” (p. 2), and DBR methodologies allowed them to do this in proactive, hands-on ways. Thus, rather than emerging as a strict alternative to more traditional methodologies, DBR was proposed to fill a niche that other methodologies were ill-equipped to cover.

Effectively, while its development is indeed linked to an inherent critique of previous research paradigms, neither Brown nor Collins saw DBR in opposition to other forms of research. Rather, by providing a bridge from the laboratory to the real world, where learning theories and proposed innovations could interact and be implemented in the complexity of lived socio-ecological contexts (Hoadley, 2004 ), new possibilities emerged. Learning researchers might “trace the evolution of learning in complex, messy classrooms and schools, test and build theories of teaching and learning, and produce instructional tools that survive the challenges of everyday practice” (Shavelson, Phillips, Towne, & Feuer, 2003 , p. 25). Thus, DBR could complement the findings of laboratory, ethnographic, and large-scale studies, answering important questions about the implementation, sustainability, limitations, and usefulness of theories, interventions, and learning when introduced as innovative designs into situated contexts of learning. Moreover, while studies involving these traditional methodologies often concluded by pointing toward implications—insights subsequent studies would need to take up—DBR allowed researchers to address implications iteratively and directly. No subsequent research was necessary, as emerging implications could be reflexively explored in the context of the initial design, offering considerable insight into how research is translated into theory and practice.

Since its emergence in 1992, DBR as a methodological approach to educational and learning research has quickly grown and evolved, used by researchers from a variety of intellectual traditions in the learning sciences, including developmental and cognitive psychology (e.g., Brown & Campione, 1996, 1998; diSessa & Minstrell, 1998), cultural psychology (e.g., Cole, 1996, 2007; Newman, Griffin, & Cole, 1989; Gutiérrez, Bien, Selland, & Pierce, 2011), cultural anthropology (e.g., Barab, MaKinster, Moore, Cunningham, & the ILF Design Team, 2001; Polman, 2000; Stevens, 2000; Suchman, 1995), and cultural-historical activity theory (e.g., Engeström, 2011; Espinoza, 2009; Espinoza & Vossoughi, 2014; Gutiérrez, 2008; Sannino, 2011). Given this plurality of epistemological and theoretical fields that employ DBR, it might best be understood as a broad methodology of educational research, realized in many different, contested, heterogeneous, and distinct iterations, and engaging a variety of qualitative tools and methods (Bell, 2004). Despite tensions among these iterations, and substantial and important variances in the ways they employ design-as-research in community settings, there are several common methodological threads that unite the broad array of research that might be classified as DBR under a shared, though pluralistic, paradigmatic umbrella.

The Tenets of Design-Based Research

Why Design-Based Research?

As we turn to the core tenets of the design-based research (DBR) paradigm, it is worth considering an obvious question: Why use DBR as a methodology for educational research? To answer this, it is helpful to reflect on the original intentions for DBR, particularly that it is not simply the study of a particular, isolated intervention. Rather, DBR methodologies were conceived of as the complete, iterative process of designing, modifying, and assessing the impact of an educational innovation in a contextual, situated learning environment (Barab & Kirshner, 2001; Brown, 1992; Cole & Engeström, 2007). The design process itself—inclusive of the theory of learning employed, the relationships among participants, contextual factors and constraints, the pedagogical approach, any particular intervention, and any changes made to various aspects of this broad design as it proceeds—is what is under study.

Considering this, DBR offers a compelling framework for the researcher interested in having an active and collaborative hand in designing for educational innovation, and in creating knowledge about how particular theories of learning, pedagogical or learning practices, or social arrangements function in a context of learning. It is a methodology that can put the researcher in the position of engineer, actively experimenting with aspects of learning and sociopolitical ecologies to arrive at new knowledge and productive outcomes, as Cobb, Confrey, diSessa, Lehrer, and Schauble (2003) explain:

Prototypically, design experiments entail both “engineering” particular forms of learning and systematically studying those forms of learning within the context defined by the means of supporting them. This designed context is subject to test and revision, and the successive iterations that result play a role similar to that of systematic variation in experiment. (p. 9)

This being said, how directive an engineering role the researcher takes on varies considerably among iterations of DBR. Indeed, recent approaches have argued strongly for researchers to take on more egalitarian positionalities with respect to the community partners with whom they work (e.g., Zavala, 2016), acting as collaborative designers rather than authoritative engineers.

Method and Methodology in Design-Based Research

Now, having established why we might use DBR, we can turn to a recurring question that has faced the DBR paradigm: whether DBR is a methodology at all. Given the variety of intellectual and ontological traditions that employ it, and thus the pluralism of methods used in DBR to enact the “engineering” role (whatever shape that may take) that the researcher assumes, it has been argued that DBR is not, in actuality, a methodology at all (Kelly, 2004). The proliferation and diversity of approaches, methods, and types of analysis purporting to be DBR have been described as a lack of coherence, evidence that there is no “argumentative grammar” or methodology present in DBR (Kelly, 2004).

Now, the conclusions one will eventually draw in this debate will depend on one's orientations and commitments, but it is useful to note that these demands for “coherence” emerge from previous paradigms, in which methodology was largely marked by a shared, coherent toolkit for data collection and data analysis. These previous paradigmatic rules make for an odd fit when applied to DBR. Yet if we proceed—within the qualitative tradition from which DBR emerges—by defining methodology as an approach to research that is shaped by the ontological and epistemological commitments of the particular researcher, and methods as the tools for research, data collection, and analysis that the researcher chooses with respect to those commitments (Gutiérrez, Engeström, & Sannino, 2016), then a compelling case for DBR as a methodology can be made (Bell, 2004).

Effectively, despite the considerable variation in how DBR has been and is employed, and tensions within the DBR field, we might point to considerable shared epistemic common ground among DBR researchers, all of whom are invested in an approach to research that involves engaging actively and iteratively in the design and exploration of learning theory in situated, natural contexts. This common epistemic ground, even in the face of pluralistic ideologies and choices of methods, invites a new type of methodological coherence, marked by “intersubjectivity without agreement” (Matusov, 1996), that links models of DBR from traditional developmental and cognitive psychology (e.g., Brown, 1992; Brown & Campione, 1998; Collins, 1992) to more recent critical and sociocultural manifestations (e.g., Bang & Vossoughi, 2016; Engeström, 2011; Gutiérrez, 2016), and everything in between.

Put in other terms, even as DBR researchers may choose heterogeneous methods for data collection, data analysis, and reporting results, complementary to the ideological and sociopolitical commitments of the particular researcher and the types of research questions under examination (Bell, 2004), a shared epistemic commitment gives the methodology shape. Indeed, the common commitment toward design innovation emerges clearly across examples of DBR methodological studies ranging in method from ethnographic analyses (Salvador, Bell, & Anderson, 1999), to studies of critical discourse within a design (Kärkkäinen, 1999), to focused examinations of the metacognition of individual learners (White & Frederiksen, 1998), and beyond. Rather than indicating a lack of methodology, or methodological weakness, the use of varying qualitative methods for framing data collection and retrospective analyses within DBR, and the tensions within the epistemic common ground itself, simply reflect the scope of the methodology's utility. Learning in context is complex, contested, and messy, and the plurality of methods present across DBR allows researchers to dynamically respond to context as needed, employing the tools that fit best to consider the questions that are present, or may arise.

All this being the case, it is useful to look toward the coherent elements—the “argumentative grammar” of DBR, if you will—that can be identified across the varied iterations of DBR. Understanding these shared features, in the context and terms of the methodology itself, helps us appreciate what is involved in developing robust and thorough DBR research, and how DBR seeks to make strong, meaningful claims around the types of research questions it takes up.

Coherent Features of Design-Based Research

Several scholars have provided comprehensive overviews and listings of what they see as the cross-cutting features of DBR, both in the context of more traditional models of DBR (e.g., Cobb et al., 2003; Design-Based Research Collective, 2003) and in regard to newer iterations (e.g., Gutiérrez & Jurow, 2016; Bang & Vossoughi, 2016). Rather than try to offer an overview of each of these increasingly pluralistic classifications, the intent here is to attend to three broad elements that are shared across articulations of DBR and that reflect the essential elements constituting the methodological approach DBR offers to educational researchers.

Design research is concerned with the development, testing, and evolution of learning theory in situated contexts

This first element is perhaps most central to what DBR of all types is, anchored in what Brown (1992) was initially most interested in: testing the pragmatic validity of theories of learning by designing interventions that engaged with, or proposed, entire, naturalistic ecologies of learning. Put another way, while DBR studies may have various units of analysis, focuses, and variables, and may organize learning in many different ways, it is the theoretically informed design for educational innovation that is most centrally under evaluation. DBR actively and centrally exists as a paradigm that is engaged in the development of theory, not just the evaluation of aspects of its usage (Bell, 2004; Design-Based Research Collective, 2003; Lesh & Kelly, 2000; van den Akker, 1999).

Effectively, where DBR is taking place, theory as a lived possibility is under examination. Specifically, in most DBR, this means a focus on “intermediate-level” theories of learning, rather than “grand” ones. In essence, DBR does not contend directly with “grand” learning theories (such as developmental or sociocultural theory writ large) (diSessa, 1991 ). Rather, DBR seeks to offer constructive insights by directly engaging with particular learning processes that flow from these theories on a “grounded,” “intermediate” level. This is not, however, to say DBR is limited in what knowledge it can produce; rather, tinkering in this “intermediate” realm can produce knowledge that informs the “grand” theory (Gravemeijer, 1994 ). For example, while cognitive and motivational psychology provide “grand” theoretical frames, interest-driven learning (IDL) is an “intermediate” theory that flows from these and can be explored in DBR to both inform the development of IDL designs in practice and inform cognitive and motivational psychology more broadly (Joseph, 2004 ).

Crucially, however, DBR entails putting the theory in question under intense scrutiny, or, “into harm's way” (Cobb et al., 2003). This is a core element of DBR, and one that distinguishes it from the proliferation of educational-reform and educational-entrepreneurship efforts that similarly take up the discourse of “design” and “innovation.” Not only is the reflexive, often participatory element of DBR absent from such efforts—that is, questioning and modifying the design to suit the learning needs of the context and partners—but the theory driving these efforts is never in question, and in many cases may be actively obscured. Indeed, it is more common to see educational-entrepreneur design innovations seek to modify a context—as when charter schools engage in selective pupil recruitment and intensive disciplinary practices (e.g., Carnoy et al., 2005; Ravitch, 2010; Saltman, 2007)—rather than modify the design itself, which would allow for humility in the theory. Such “innovations” and “design” efforts are distinct from DBR, which must, in the spirit of scientific inquiry, be willing to see the learning theory flail and struggle, be modified, and evolve.

This growth and evolution of theory and knowledge is of course central to DBR as a rigorous research paradigm, moving it beyond simply the design of local educational programs, interventions, or innovations. As Barab and Squire (2004) explain:

Design-based research requires more than simply showing a particular design works but demands that the researcher (move beyond a particular design exemplar to) generate evidence-based claims about learning that address contemporary theoretical issues and further the theoretical knowledge of the field. (pp. 5–6)

DBR as a research paradigm offers a design process through which theories of learning can be tested and modified; when they are allowed to operate with humility in situated conditions, new insights and knowledge, even new theories, may emerge that can inform the field, as well as the efforts and directions of other types of research inquiry. These productive, theory-developing outcomes, or “ontological innovations” (diSessa & Cobb, 2004), represent the culmination of an effective program of DBR—the production of new ways to understand, conceptualize, and enact learning as a lived, contextual process.

Design research works to understand learning processes, and the design that supports them in situated contexts

As a research methodology that operates by tinkering with “grounded” learning theories, DBR is itself grounded, and seeks to develop its knowledge claims and designs in naturalistic, situated contexts (Brown, 1992 ). This is, again, a distinguishing element of DBR—setting it apart from laboratory research efforts involving design and interventions in closed, controlled environments. Rather than attempting to focus on singular variables, and isolate these from others, DBR is concerned with the multitude of variables that naturally occur across entire learning ecologies, and present themselves in distinct ways across multiple planes of possible examination (Rogoff, 1995 ; Collins, Joseph, & Bielaczyc, 2004 ). Certainly, specific variables may be identified as dependent, focal units of analysis, but identifying (while not controlling for) the variables beyond these, and analyzing their impact on the design and learning outcomes, is an equally important process in DBR (Collins et al., 2004 ; Barab & Kirshner, 2001 ). In practice, this of course varies across iterations in its depth and breadth. Traditional models of developmental or cognitive DBR may look to account for the complexity and nuance of a setting’s social, developmental, institutional, and intellectual characteristics (e.g., Brown, 1992 ; Cobb et al., 2003 ), while more recent, critical iterations will give increased attention to how historicity, power, intersubjectivity, and culture, among other things, influence and shape a setting, and the learning that occurs within it (e.g., Gutiérrez, 2016 ; Vakil, de Royston, Nasir, & Kirshner, 2016 ).

Beyond these variations, what counts as “design” in DBR varies widely, and so too does what counts as a naturalistic setting. It has been well documented that learning occurs all the time, every day, and in every space imaginable, both formal and informal (Leander, Phillips, & Taylor, 2010), and in ways that span strictly defined setting boundaries (Engeström, Engeström, & Kärkkäinen, 1995). DBR may take place in any number of contexts, based on the types of questions asked, and the learning theories and processes that a researcher may be interested in exploring. DBR may involve one-to-one tutoring and learning settings, single classrooms, community spaces, entire institutions, or even holistically designed ecologies (Design-Based Research Collective, 2003; Engeström, 2008; Virkkunen & Newnham, 2013). In all these cases, even the most completely designed experimental ecology, the setting remains naturalistic and situated, because DBR actively embraces the uncontrollable variables that participants bring with them to the learning process for and from their situated worlds, lives, and experiences—no effort is made to control for these complicated influences of life; the aim is simply to understand how they operate in a given ecology as innovation is attempted. Thus, the extent of the design reflects a broader range of qualitative and theoretical study, rather than an attempt to control or isolate some particular learning process from outside influence.

While there is much variety in what design may entail, where DBR takes place, what types of learning ecologies are under examination, and what methods are used, situated ecologies are always the setting of this work. In this way, conscious of naturalistic variables, and the influences that culture, historicity, participation, and context have on learning, researchers can use DBR to build on prior research, and extend knowledge around the learning that occurs in the complexity of situated contexts and lived practices (Collins et al., 2004 ).

Design-based research is iterative; it changes, grows, and evolves to meet the needs and emergent questions of the context, and this tinkering process is part of the research

The final shared element undergirding models of DBR is that it is an iterative, active, and interventionist process, interested in and focused on producing educational innovation by actually and actively putting design innovations into practice (Brown, 1992; Collins, 1992; Gutiérrez, 2008). Given this interventionist, active stance, tinkering with the design, and with the theory of learning informing the design, is as much a part of the research process as the outcome of the intervention or innovation itself—we learn what impacts learning as much as, if not more than, what was learned. In this sense, DBR involves a focus on analyzing the theory-driven design itself, and its implementation, as an object of study (Edelson, 2002; Penuel, Fishman, Cheng, & Sabelli, 2011), and is ultimately interested in the improvement of the design—in how it unfolds, how it shifts, how it is modified, and how it is made to function productively for participants in their contexts and given their needs (Kirshner & Polman, 2013).

While DBR is iterative and contextual as a foundational methodological principle, what this means varies across conceptions of DBR. For instance, in more traditional models, Brown and Campione (1996) pointed out the dangers of “lethal mutation,” in which a design, introduced into a context, may become so warped by the influence, pressures, incomplete implementation, or misunderstanding of participants in the local context that it no longer reflects or tests the theory under study. In short, a theory-driven intervention may be put in place, and then subsumed to such a degree by participants, based on their understanding and needs, that it remains the original innovative design in name alone. The assertion here is that in these cases the research ceases to be DBR, in the sense that the design is no longer central, actively shaping learning. We cannot, they argue, analyze a design—and the theory it was meant to reflect—as an object of study when it has been “mutated” and is merely a banner under which participants are enacting their idiosyncratic, pragmatic needs.

While the ways in which settings and individuals might disrupt designs intended to produce robust learning are certainly a tension to be cautious of in DBR, it is also worth noting that in many critical approaches to DBR, such mutations—whether “lethal” to the original design or not—are seen as compelling and important moments. Here, where collaboration and community input are more central to the design process, iterative is understood differently. Thus, a “mutation” becomes a point where reflexivity, tension, and contradiction might open the door for change, for new designs, for reconsiderations of researcher and collaborative-partner positionalities, or for ethnographic exploration into how a context takes up, shapes, and ultimately engages innovations in a particular sociocultural setting. In short, accounting for and documenting changes in design is a vital part of the DBR process, allowing researchers to respond to context in a variety of ways, always striving for their theories and designs to act with humility, and in the interest of usefulness .

With this in mind, the iterative nature of DBR means that the relationships researchers have with other design partners (educators and learners) in the ecology are incredibly important, and vital to consider (Bang et al., 2016 ; Engeström, 2007 ; Engeström, Sannino, & Virkkunen, 2014 ). Different iterations of DBR might occur in ways in which the researcher is more or less intimately involved in the design and implementation process, both in terms of actual presence and intellectual ownership of the design. Regarding the former, in some cases, a researcher may hand a design off to others to implement, periodically studying and modifying it, while in other contexts or designs, the researcher may be actively involved, tinkering in every detail of the implementation and enactment of the design. With regard to the latter, DBR might similarly range from a somewhat prescribed model, in which the researcher is responsible for the original design, and any modifications that may occur based on their analyses, without significant input from participants (e.g., Collins et al., 2004 ), to incredibly participatory models, in which all parties (researchers, educators, learners) are part of each step of the design-creation, modification, and research process (e.g., Bang, Faber, Gurneau, Marin, & Soto, 2016 ; Kirshner, 2015 ).

Considering the wide range of ideological approaches and models for DBR, we might acknowledge that DBR can be gainfully conducted through many iterations of “openness” to the design process. However, the strength of the research—focused on analyzing the design itself as a unit of study reflective of learning theory—will be bolstered by thoughtfully accounting for how involved the researcher will be, and how open to participation the modification process is. These answers should match the types of questions, and conceptual or ideological framing, with which researchers approach DBR, allowing them to tinker with the process of learning as they build on prior research to extend knowledge and test theory (Barab & Kirshner, 2001 ), while thoughtfully documenting these changes in the design as they go.

Implementation and Research Design

As with the overarching principles of design-based research (DBR), even amid the pluralism of conceptual frameworks of DBR researchers, it is possible, and useful, to trace the shared contours in how DBR research design is implemented. Though texts provide particular road maps for undertaking various iterations of DBR consistent with the specific goals, types of questions, and ideological orientations of these scholarly communities (e.g., Cole & Engeström, 2007 ; Collins, Joseph, & Bielaczyc, 2004 ; Fishman, Penuel, Allen, Cheng, & Sabelli, 2013 ; Gutiérrez & Jurow, 2016 ; Virkkunen & Newnham, 2013 ), certain elements, realized differently, can be found across all of these models, and may be encapsulated in five broad methodological phases.

Considering the Design Focus

DBR begins by considering what the focus of the design, the situated context, and the units of analysis for research will be. Prospective DBR researchers will need to consider broader research related to the “grand” theory of learning with which they work, determine what theoretical questions they have or identify “intermediate” aspects of the theory that might be studied and strengthened by a design process in situated contexts, and decide what planes of analysis (Rogoff, 1995) will be most suitable for examination. This process allows for the identification of the critical theoretical elements of a design, and the articulation of initial research questions.

Given the conceptual framework, theoretical and research questions, and sociopolitical interests at play, researchers may undertake this, and subsequent steps in the process, on their own, or in close collaboration with the communities and individuals in the situated contexts in which the design will unfold. As such, across iterations of DBR, and with respect to the ways DBR researchers choose to engage with communities, the origin of the design will vary, and might begin in some cases with theoretical questions, or arise in others as a problem of practice (Coburn & Penuel, 2016 ), though as has been noted, in either case, theory and practice are necessarily linked in the research.

Creating and Implementing a Designed Innovation

From the consideration and identification of the critical elements, planned units of analysis, and research questions that will drive a design, researchers can then actively create (either on their own or in conjunction with potential design partners) a designed intervention reflecting these critical elements, and the overarching theory.

Here, the DBR researcher should consider what partners exist in the process and what ownership exists around these partnerships, determine exactly what the pragmatic features of the intervention/design will be and who will be responsible for them, and consider when checkpoints for modification and evaluation will be undertaken, and by whom. Additionally, researchers should at this stage consider questions of timeline and of recruiting participants, as well as what research materials will be needed to adequately document the design, its implementation, and its outcomes, and how and where collected data will be stored.

Once a design (the planned, theory-informed innovative intervention) has been produced, the DBR researcher and partners can begin the implementation process, putting the design into place and beginning data collection and documentation.

Assessing the Impact of the Design on the Learning Ecology

Chronologically, the next two methodological steps happen recursively in the iterative process of DBR. The researcher must assess the impact of the design, and then, make modifications as necessary, before continuing to assess the impact of these modifications. In short, these next two steps are a cycle that continues across the life and length of the research design.

Once a design has been created and implemented, the researcher begins to observe and document the learning, the ecology, and the design itself. Guided by and in conversation with the theory and critical elements, the researcher should periodically engage in ongoing data analysis, assessing the success of the design, and of learning, paying equal attention to the design itself, and how its implementation is working in the situated ecology.

Within the realm of qualitative research, measuring or assessing variables of learning and assessing the design may look vastly different from researcher to researcher, requiring different data-collection and data-analysis tools, and involving different research methods.

Modifying the Design

Modification, based on ongoing assessment of the design, is what makes DBR iterative, helping the researcher extend the field’s knowledge about the theory, design, learning, and the context under examination.

Modification of the design can take many forms, from complete changes in approach or curriculum, to introducing an additional tool or mediating artifact into a learning ecology. Moreover, how modification unfolds involves careful reflection from the researcher and any co-designing participants, deciding whether modification will be an ongoing, reflexive, tinkering process, or if it will occur only at predefined checkpoints, after formal evaluation and assessment. Questions of ownership, issues of resource availability, technical support, feasibility, and communication are all central to the work of design modification, and answers will vary given the research questions, design parameters, and researchers’ epistemic commitments.

Each moment of modification indicates a new phase in a DBR project, and a new round of assessing—through data analysis—the impact of the design on the learning ecology, either to guide continued or further modification, report the results of the design, or in some cases, both.

Reporting the Results of the Design

The final step in DBR methodology is to report the results of the designed intervention: how it contributed to understandings of theory, and how it impacted the local learning ecology or context. The format, genre, and final data-analysis methods used in reporting data and research results will vary across iterations of DBR. However, it is largely understood that, to avoid methodological confusion, DBR researchers should clearly situate themselves in the DBR paradigm by describing and detailing the design itself; articulating the theory, central elements, and units of analysis under scrutiny, what modifications occurred and what precipitated these changes, and what local effects were observed; and exploring any potential contributions to learning theory, while accounting for the context and their interventionist role and positionality in the design. As such, careful documentation of pragmatic and design decisions, for retrospective data analysis as well as research findings, should occur at each stage of the implementation process.

Methodological Issues in the Design-Based Research Paradigm

Because of its pluralistic nature, its interventionist, nontraditional stance, and the fact that it remains in its conceptual infancy, design-based research (DBR) is replete with ongoing methodological questions and challenges, both from external and internal sources. While many more may exist, addressed here are several of the most pressing issues a prospective DBR researcher may encounter, or may want to consider in understanding the paradigm and beginning a research design.

Challenges to Rigor and Validity

Perhaps the place to begin this reflection on tensions in the DBR paradigm is the recurrent and ongoing challenge to the rigor and validity of DBR, which has asked: Is DBR research at all? Given the interventionist and activist way in which DBR invites the researcher to participate, and the shift in orientation from long-accepted research paradigms, such critiques are hardly surprising, and fall in line with broader challenges to the rigor and objectivity of qualitative social science research in general. Historically, such complaints about DBR are linked to decades of critique of any research that does not adhere to the post-positivist approach set out as the U.S. Department of Education began to prioritize laboratory and large-scale randomized controlled-trial experimentation as the “gold standard” of research design (e.g., Mosteller & Boruch, 2002).

From the outset, DBR, as an interventionist, local, situated, non-laboratory methodology, was bound to run afoul of such conservative trends. While some researchers involved in (particularly traditional developmental and cognitive) DBR have found broader acceptance within these constraints, the rigor of DBR remains contested. It has been suggested that DBR is under-theorized and over-methodologized: a haphazard way for researchers to do activist work without engaging in the development of robust knowledge claims about learning (Dede, 2004), and an approach lacking in coherence that has sheltered interventionist projects of little impact on developing learning theory and allowed researchers to make subjective, pet claims through selective analysis of large bodies of collected data (Kelly, 2003, 2004).

These critiques, however, impose an external set of criteria on DBR, desiring it to fit into the molds of rigor and coherence as defined by canonical methodologies. Bell ( 2004 ) and Bang and Vossoughi ( 2016 ) have made compelling cases for the wide variety of methods and approaches present in DBR not as a fracturing, but as a generative proliferation of different iterations that can offer powerful insights around the different types of questions that exist about learning in the infinitely diverse settings in which it occurs. Essentially, researchers have argued that within the DBR paradigm, and indeed within educational research more generally, the practical impact of research on learning, context, and practices should be a necessary component of rigor (Gutiérrez & Penuel, 2014 ), and the pluralism of methods and approaches available in DBR ensures that the practical impacts and needs of the varied contexts in which the research takes place will always drive the design and research tools.

These moves are emblematic of the way in which DBR is innovating and pushing on paradigms of rigor in educational research altogether, reflecting how DBR fills a complementary niche with respect to other methodologies and attends to elements and challenges of learning in lived, real environments that other types of research have consistently and historically missed. Beyond this, Brown (1992) was conscious of the concerns around data collection, validity, rigor, and objectivity from the outset, identifying as the Bartlett Effect the dilemma of having an incredible amount of data collected in a design, only a small fraction of which can be reported and shared, potentially leading to selective data analysis and use (Brown, 1992). Since that time, DBR researchers have been aware of this challenge, actively seeking ways to mitigate this threat to validity by making data sets broadly available, documenting their design, tinkering, and modification processes, clearly situating and describing disconfirming evidence and their own position in the research, and otherwise presenting the broad scope of human and learning activity that occurs within designs in large learning ecologies as comprehensively as possible.

Ultimately, however, these responses are likely to always be insufficient as evidence of rigor to some, for the root dilemma is around what “counts” as education science. While researchers interested and engaged in DBR ought rightly to continue to push themselves to ensure the methodological rigor of their work and chosen methods, it is also worth noting that DBR should seek to hold itself to its own criteria of assessment. This reflects broader trends in qualitative educational research that push back on narrow constructions of what “counts” as science, recognizing the ways in which new methodologies and approaches to research can help us examine aspects of learning, culture, and equity that have continued to be blind spots for traditional education research; invite new voices and perspectives into the process of achieving rigor and validity (Erickson & Gutiérrez, 2002); bolster objectivity by bringing it into conversation with the positionality of the researcher (Harding, 1993); and perhaps most important, engage in axiological innovation (Bang, Faber, Gurneau, Marin, & Soto, 2016), or the exploration of and design for what is “good, right, true, and beautiful . . . in cultural ecologies” (p. 2).

Questions of Generalizability and Usefulness

The generalizability of research results in DBR has been an ongoing and contentious issue in the development of the paradigm. Indeed, by the standards of canonical methods (e.g., laboratory experimentation, ethnography), these local, situated interventions should lack generalizability. While there is reason to discuss and question the merit of generalizability as a goal of qualitative research at all, researchers in the DBR paradigm have long been conscious of this issue. Understanding the question of generalizability around DBR, and how the paradigm has responded to it, can be done in two ways.

First, by distinguishing questions specific to a particular design from questions about the generalizability of the theory. Cole's 5th Dimension work (Cole & Underwood, 2013), and the nationwide network of linked, theoretically similar sites operating with vastly different designs, is a powerful example of this approach to generalizability. Rather than focus on a single, unitary, potentially generalizable design, the project is more interested in the variability and sustainability of designs across local contexts (e.g., Cole, 1995; Gutiérrez, Bien, Selland, & Pierce, 2011; Jurow, Tracy, Hotchkiss, & Kirshner, 2012). Through attention to sustainable, locally effective innovations, conscious of the wide variation in culture and context that accompanies any and all learning processes, 5th Dimension sites each derive their idiosyncratic structures from sociocultural theory, sharing some elements but varying others, while seeking their own “ontological innovations” based on the affordances of their contexts. This pattern reflects a key element of much of the DBR paradigm: questions of generalizability in DBR may be about the generalizability of the theory of learning, and the variability of learning and design in distinct contexts, rather than about the particular design itself.

A second means of addressing generalizability in DBR has been to embrace the pragmatic impacts of designing innovations. This response stems from Messick's (1992) and Schoenfeld's (1992) arguments, early in the development of DBR, that the consequentialness and validity of DBR efforts as potentially generalizable research depend on the usefulness of the theories and designs that emerge. Effectively, because DBR is the examination of situated theory, a design must be able to show pragmatic impact—it must succeed at showing the theory to be useful . If there is evidence of usefulness both to the context in which it takes place and to the field of educational research more broadly, then the DBR researcher can stake some broader knowledge claims that might be generalizable. As a result, the DBR paradigm tends to “treat changes in [local] contexts as necessary evidence for the viability of a theory” (Barab & Squire, 2004, p. 6). This of course does not mean that DBR is only interested in successful efforts; a design that fails or struggles can provide important information and knowledge to the field. Ultimately, though, DBR tends to privilege work that proves the usefulness of designs, whose pragmatic or theoretical findings can then be generalized within the learning science and education research fields.

With this said, the question of usefulness is not always straightforward, and is hardly unitary. While many DBR efforts—particularly those situated in developmental and cognitive learning science traditions—are interested in the generalizability of their useful educational designs (Barab & Squire, 2004; Cobb, Confrey, diSessa, Lehrer, & Schauble, 2003; Joseph, 2004; Steffe & Thompson, 2000), not all are. Critical DBR researchers have noted that if usefulness remains situated in extant sociopolitical and sociocultural power structures—dominant conceptual and popular definitions of what useful educational outcomes are—the result will be a bar for research merit that inexorably bends toward the positivist spectrum (Booker & Goldman, 2016; Dominguez, 2015; Zavala, 2016). This could, and likely would, result in excluding the non-normative interventions and innovations that are vital for historically marginalized communities, but that might have vastly different-looking outcomes that are nonetheless useful in the sociopolitical contexts in which they occur. Alternative framings push on and extend this idea of usefulness, seeking to involve the perspectives and agency of situated community partners and their practices in what “counts” as generative and rigorous research outcomes (Gutiérrez & Penuel, 2014). An example in this regard is the idea of consequential knowledge (Hall & Jurow, 2015; Jurow & Shea, 2015), which suggests that outcomes that are consequential will be taken up by participants in and across their networks, and over time—thus a goal of consequential knowledge certainly meets the standard of being useful , but it also implicates the needs and agency of communities in determining the success and merit of a design or research endeavor in important ways that strict usefulness may miss.

Thus, the bar of usefulness that characterizes the DBR paradigm should not be approached without critical reflection. Certainly, designs that accomplish little for local contexts should be subject to intense questioning and critique, but the sociopolitical and systemic factors that might influence what “counts” as useful, in local contexts and in education science more generally, should be kept firmly in mind when designing, choosing methods, and evaluating impacts (Zavala, 2016). Researchers should think deeply about their goals, whether they are reaching for generalizability at all, and the ways in which they are constructing contextual definitions of success, and they should be clear about these ideologically influenced answers in their work, such that the generalizability and usefulness of designs can be adjudicated in conversation with the intentions and conceptual framework of the research and researcher.

Ethical Concerns of Sustainability, Participation, and Telos

While there are many external challenges to the rigor and validity of DBR, another set of tensions comes from within the DBR paradigm itself. These internal critiques are less concerns about rigor or validity than questions of research ethics; not unrelated to the earlier contested definition of usefulness , they grow from ideological concerns with how an intentional, interventionist stance is taken up in research as it interacts with situated communities.

Given that the nature of DBR is to design and implement some form of educational innovation, the DBR researcher will in some way be engaging with an individual or community, becoming part of a situated learning ecology, complete with a sociopolitical and cultural history. As with any research that involves providing an intervention or support, the question of what happens when the research ends is as much an ethical question as a methodological one. Concerns arise given how traditional models of DBR seem intensely focused on creating and implementing a “complete” cycle of design, while giving little attention to what happens to the community and context afterward (Engeström, 2011). In contrast to this privileging of “completeness,” sociocultural and critical approaches to DBR have suggested that if research is actually happening in naturalistic, situated contexts that authentically recognize and allow social and cultural dimensions to function (i.e., avoid laboratory-type controls to mitigate independent variables), there can never be such a thing as “complete,” for the design will, and should, live on as part of the ecology of the space (Cole, 2007; Engeström, 2000). Essentially, these internal critiques push DBR to consider sustainability, and sustainable scale, as concerns equally important as the completeness of an innovation. Not only are ethical questions involved, but accounting for the unbounded and ongoing nature of learning as a social and cultural activity can help strengthen the viability of knowledge claims made, and clarify what degree of generalizability is reasonably justified.

Related to this question of sustainability are internal concerns regarding the nature and ethics of participation in DBR—whether partners in a design are being adequately invited to engage in the design and modification processes that will unfold in their situated contexts and lived communities (Bang et al., 2016; Engeström, 2011). DBR has actively sought to examine the multiple planes of analysis of learning that might be occurring in a learning ecology, but it has rarely attended to the subject-subject dynamics (Bang et al., 2016), or “relational equity” (DiGiacomo & Gutiérrez, 2015), that exist between researchers and participants as a point of focus. Participatory design research (PDR) models (Bang & Vossoughi, 2016) have recently emerged as a way to better attend to these important dimensions of collective participation (Engeström, 2007), power (Vakil et al., 2016), positionality (Kirshner, 2015), and relational agency (Edwards, 2007, 2009; Sannino & Engeström, 2016) as they unfold in DBR.

Both of these ethical questions—around sustainability and participation—reflect challenges to what we might call the telos , or direction, that DBR takes toward innovation and research. These are questions related to whose voices are privileged, in what ways, for what purposes, and toward what ends. While DBR, like many other forms of educational research, has involved work with historically marginalized communities, it has, like many other forms of educational research, not always done so in humanizing ways. Put another way, there are ethical and political questions surrounding whether the designs, goals, and standards of usefulness we apply to DBR efforts should be purposefully activist, and have explicitly liberatory ends. To this point, critical and decolonial perspectives have pushed on the DBR paradigm, suggesting that DBR should situate itself as a space of liberatory innovation and potential, in which communities and participants can become designers and innovators of their own futures (Gutiérrez, 2005). This perspective is reflected in the social design experiment (SDE) approach to DBR (Gutiérrez, 2005, 2008, 2016; Gutiérrez & Vossoughi, 2010; Gutiérrez & Jurow, 2016), which begins in participatory fashion, engaging a community in identifying its own challenges and desires, and reflecting on the historicity of learning practices, before proleptic design efforts are undertaken that ensure that research is done with , not on , communities of color (Arzubiaga, Artiles, King, & Harris-Murri, 2008), and that it is intentionally focused on liberatory goals.

Global Perspectives and Unique Iterations

While design-based research (DBR) has been a methodology principally associated with educational research in the United States, its development is hardly limited to the U.S. context. Rather, while DBR emerged in U.S. settings, similar methods of situated, interventionist research focused on design and innovation were emerging in parallel in European contexts (e.g., Gravemeijer, 1994 ), most significantly in the work of Vygotskian scholars both in Europe and the United States (Cole, 1995 ; Cole & Engeström, 1993 , 2007 ; Engeström, 1987 ).

Particularly, where DBR began in the epistemic and ontological terrain of developmental and cognitive psychology, this vein of design-based research work was from the start deeply grounded in cultural-historical activity theory (CHAT). This ontological and epistemic grounding meant that the approach to design taken was more intensively conscious of context, historicity, hybridity, and relational factors, and framed around understanding learning as a complex, collective activity system that, through design, could be modified and transformed (Cole & Engeström, 2007). The models of DBR that emerged in this context abroad were the formative intervention (Engeström, 2011; Engeström, Sannino, & Virkkunen, 2014), which relies heavily on Vygotskian double stimulation to approach learning in nonlinear, unbounded ways, accounting for the roles of learner, educator, and researcher in a collective process, shifting, evolving, and tinkering with the design as the context needs and demands; and the Change Laboratory (Engeström, 2008; Virkkunen & Newnham, 2013), which similarly relies on the principle of double stimulation, while presenting a holistic way to approach transforming—or changing—entire learning activity systems in fundamental ways, through designs that encourage collective “expansive learning” (Engeström, 2001), by which participants can produce wholly new activity systems as the object of learning itself.

Elsewhere in the United States, still parallel to the developmental- and cognitive-oriented DBR work that was occurring, American researchers employing CHAT began to leverage the tools and aims of expansive learning in conversation with the tensions and complexity of the U.S. context (Cole, 1995; Gutiérrez, 2005; Gutiérrez & Rogoff, 2003). Like the CHAT design research of the European context, this work focused on activity systems, historicity, nonlinear and unbounded learning, and collective learning processes and outcomes. Rather than simply replicating, however, these researchers paid further attention to questions of equity, diversity, and justice in this work, as Gutiérrez, Engeström, and Sannino (2016) note:

The American contribution to a cultural historical activity theoretic perspective has been its attention to diversity, including how we theorize, examine, and represent individuals and their communities. (p. 276)

Effectively, CHAT scholars in parts of the United States brought critical and decolonial perspectives to bear on their design-focused research, focusing explicitly on the complex cultural, racial, and ethnic terrain in which they worked, and ensuring that diversity, equity, justice, and non-dominant perspectives would become central principles to the types of design research conducted. The result was the emergence of the aforementioned social design experiments (e.g., Gutiérrez, 2005 , 2016 ), and participatory design research (Bang & Vossoughi, 2016 ) models, which attend intentionally to historicity and relational equity, tailor their methods to the liberation of historically marginalized communities, aim intentionally for liberatory outcomes as key elements of their design processes, and seek to produce outcomes in which communities of learners become designers of new community futures (Gutiérrez, 2016 ). While these approaches emerged in the United States, their origins reflect ontological and ideological perspectives quite distinct from more traditional learning science models of DBR, and dominant U.S. ontologies in general. Indeed, these iterations of DBR are linked genealogically to the ontologies, ideologies, and concerns of peoples in the Global South, offering some promise for the method in those regions, though DBR has yet to broadly take hold among researchers beyond the United States and Europe.

There is, of course, much more nuance to each of these models (formative interventions, Change Laboratories, social design experiments, and participatory design research), and each might merit independent exploration and review well beyond the scope here. Indeed, there is some question as to whether all adherents of these CHAT design-based methodologies, with their unique genealogies and histories, would even consider themselves under the umbrella of DBR. Yet, despite significant ontological divergences, these iterations share many of the same foundational tenets as the traditional models (though realized differently), and it is reasonable to argue that they do indeed share the same broad methodological paradigm (DBR), or at the very least are so intimately related that any discussion of DBR, particularly one with a global view, should consider the contributions CHAT iterations have made to the DBR methodology in the course of their somewhat distinct, but parallel, development.

Possibilities and Potentials for Design-Based Research

Since its emergence in 1992 , the DBR methodology for educational research has continued to grow in popularity, ubiquity, and significance. Its use has begun to expand beyond the confines of the learning sciences, taken up by researchers in a variety of disciplines, and across a breadth of theoretical and intellectual traditions. While still not as widely recognized as more traditional and well-established research methodologies, DBR as a methodology for rigorous research is unquestionably here to stay.

With this in mind, the field ought to still be cautious of the ways in which the discourse of design is used. Not all design is DBR, and preserving the integrity, rigor, and research ethics of the paradigm (on its own terms) will continue to require thoughtful reflection as its pluralistic parameters come into clearer focus. Yet the proliferation of methods in the DBR paradigm should be seen as a positive. There are far too many theories of learning and ideological perspectives that have meaningful contributions to make to our knowledge of the world, communities, and learning to limit ourselves to a unitary approach to DBR, or set of methods. The paradigm has shown itself to have some core methodological principles, but there is no reason not to expect these to grow, expand, and evolve over time.

In an increasingly globalized, culturally diverse, and dynamic world, there is tremendous potential for innovation couched in this proliferation of DBR. Particularly in historically marginalized communities and across the Global South, we will need to know how learning theories can be lived out in productive ways in communities that have been understudied, and under-engaged. The DBR paradigm generally, and critical and CHAT iterations particularly, can fill an important need for participatory, theory-developing research in these contexts that simultaneously creates lived impacts. Participatory design research (PDR), social design experiments (SDE), and Change Laboratory models of DBR should be of particular interest and attention moving forward, as current trends toward culturally sustaining pedagogies and learning will need to be explored in depth and in close collaboration with communities, as participatory design partners, in the press toward liberatory educational innovations.

Bibliography

The following special issues and volumes are recommended starting points for engaging more deeply with current and past trends in design-based research.

  • Bang, M. , & Vossoughi, S. (Eds.). (2016). Participatory design research and educational justice: Studying learning and relations within social change making [Special issue]. Cognition and Instruction , 34 (3).
  • Barab, S. (Ed.). (2004). Design-based research [Special issue]. Journal of the Learning Sciences , 13 (1).
  • Cole, M. , & The Distributed Literacy Consortium. (2006). The Fifth Dimension: An after-school program built on diversity . New York, NY: Russell Sage Foundation.
  • Kelly, A. E. (Ed.). (2003). Special issue on the role of design in educational research [Special issue]. Educational Researcher , 32 (1).
References

  • Arzubiaga, A. , Artiles, A. , King, K. , & Harris-Murri, N. (2008). Beyond research on cultural minorities: Challenges and implications of research as situated cultural practice. Exceptional Children , 74 (3), 309–327.
  • Bang, M. , Faber, L. , Gurneau, J. , Marin, A. , & Soto, C. (2016). Community-based design research: Learning across generations and strategic transformations of institutional relations toward axiological innovations. Mind, Culture, and Activity , 23 (1), 28–41.
  • Bang, M. , & Vossoughi, S. (2016). Participatory design research and educational justice: Studying learning and relations within social change making. Cognition and Instruction , 34 (3), 173–193.
  • Barab, S. , Kinster, J. G. , Moore, J. , Cunningham, D. , & The ILF Design Team. (2001). Designing and building an online community: The struggle to support sociability in the Inquiry Learning Forum. Educational Technology Research and Development , 49 (4), 71–96.
  • Barab, S. , & Squire, K. (2004). Design-based research: Putting a stake in the ground. Journal of the Learning Sciences , 13 (1), 1–14.
  • Barab, S. A. , & Kirshner, D. (2001). Methodologies for capturing learner practices occurring as part of dynamic learning environments. Journal of the Learning Sciences , 10 (1–2), 5–15.
  • Bell, P. (2004). On the theoretical breadth of design-based research in education. Educational Psychologist , 39 (4), 243–253.
  • Bereiter, C. , & Scardamalia, M. (1989). Intentional learning as a goal of instruction. In L. B. Resnick (Ed.), Knowing, learning, and instruction: Essays in honor of Robert Glaser (pp. 361–392). Hillsdale, NJ: Lawrence Erlbaum.
  • Booker, A. , & Goldman, S. (2016). Participatory design research as a practice for systemic repair: Doing hand-in-hand math research with families. Cognition and Instruction , 34 (3), 222–235.
  • Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. Journal of the Learning Sciences , 2 (2), 141–178.
  • Brown, A. , & Campione, J. C. (1996). Psychological theory and the design of innovative learning environments: On procedures, principles, and systems. In L. Schauble & R. Glaser (Eds.), Innovations in learning: New environments for education (pp. 289–325). Mahwah, NJ: Lawrence Erlbaum.
  • Brown, A. L. , & Campione, J. C. (1998). Designing a community of young learners: Theoretical and practical lessons. In N. M. Lambert & B. L. McCombs (Eds.), How students learn: Reforming schools through learner-centered education (pp. 153–186). Washington, DC: American Psychological Association.
  • Brown, A. , Campione, J. , Webber, L. , & McGilley, K. (1992). Interactive learning environments—A new look at learning and assessment. In B. R. Gifford & M. C. O’Connor (Eds.), Future assessment: Changing views of aptitude, achievement, and instruction (pp. 121–211). Boston, MA: Academic Press.
  • Carnoy, M. , Jacobsen, R. , Mishel, L. , & Rothstein, R. (2005). The charter school dust-up: Examining the evidence on enrollment and achievement . Washington, DC: Economic Policy Institute.
  • Carspecken, P. (1996). Critical ethnography in educational research . New York, NY: Routledge.
  • Cobb, P. , Confrey, J. , diSessa, A. , Lehrer, R. , & Schauble, L. (2003). Design experiments in educational research. Educational Researcher , 32 (1), 9–13.
  • Cobb, P. , & Steffe, L. P. (1983). The constructivist researcher as teacher and model builder. Journal for Research in Mathematics Education , 14 , 83–94.
  • Coburn, C. , & Penuel, W. (2016). Research-practice partnerships in education: Outcomes, dynamics, and open questions. Educational Researcher , 45 (1), 48–54.
  • Cole, M. (1995). From Moscow to the Fifth Dimension: An exploration in romantic science. In M. Cole & J. Wertsch (Eds.), Contemporary implications of Vygotsky and Luria (pp. 1–38). Worcester, MA: Clark University Press.
  • Cole, M. (1996). Cultural psychology: A once and future discipline . Cambridge, MA: Harvard University Press.
  • Cole, M. (2007). Sustaining model systems of educational activity: Designing for the long haul. In J. Campione , K. Metz , & A. S. Palinscar (Eds.), Children’s learning in and out of school: Essays in honor of Ann Brown (pp. 71–89). New York, NY: Routledge.
  • Cole, M. , & Engeström, Y. (1993). A cultural historical approach to distributed cognition. In G. Saloman (Ed.), Distributed cognitions: Psychological and educational considerations (pp. 1–46). Cambridge, U.K.: Cambridge University Press.
  • Cole, M. , & Engeström, Y. (2007). Cultural-historical approaches to designing for development. In J. Valsiner & A. Rosa (Eds.), The Cambridge handbook of sociocultural psychology , Cambridge, U.K.: Cambridge University Press.
  • Cole, M. , & Underwood, C. (2013). The evolution of the 5th Dimension. In The Story of the Laboratory of Comparative Human Cognition: A polyphonic autobiography . https://lchcautobio.ucsd.edu/polyphonic-autobiography/section-5/chapter-12-the-later-life-of-the-5th-dimension-and-its-direct-progeny/ .
  • Collins, A. (1992). Toward a design science of education. In E. Scanlon & T. O’Shea (Eds.), New directions in educational technology (pp. 15–22). New York, NY: Springer-Verlag.
  • Collins, A. , Joseph, D. , & Bielaczyc, K. (2004). Design research: Theoretical and methodological issues. Journal of the Learning Sciences , 13 (1), 15–42.
  • Dede, C. (2004). If design-based research is the answer, what is the question? A commentary on Collins, Joseph, and Bielaczyc; DiSessa and Cobb; and Fishman, Marx, Blumenthal, Krajcik, and Soloway in the JLS special issue on design-based research. Journal of the Learning Sciences , 13 (1), 105–114.
  • Design-Based Research Collective . (2003). Design-based research: An emerging paradigm for educational inquiry. Educational Researcher , 32 (1), 5–8.
  • DiGiacomo, D. , & Gutiérrez, K. D. (2015). Relational equity as a design tool within making and tinkering activities. Mind, Culture, and Activity , 22 (3), 1–15.
  • diSessa, A. A. (1991). Local sciences: Viewing the design of human-computer systems as cognitive science. In J. M. Carroll (Ed.), Designing interaction: Psychology at the human-computer interface (pp. 162–202). Cambridge, U.K.: Cambridge University Press.
  • diSessa, A. A. , & Cobb, P. (2004). Ontological innovation and the role of theory in design experiments. Journal of the Learning Sciences , 13 (1), 77–103.
  • diSessa, A. A. , & Minstrell, J. (1998). Cultivating conceptual change with benchmark lessons. In J. G. Greeno & S. Goldman (Eds.), Thinking practices (pp. 155–187). Mahwah, NJ: Lawrence Erlbaum.
  • Dominguez, M. (2015). Decolonizing teacher education: Explorations of expansive learning and culturally sustaining pedagogy in a social design experiment (Doctoral dissertation). University of Colorado, Boulder.
  • Edelson, D. (2002). Design research: What we learn when we engage in design. Journal of the Learning Sciences , 11 (1), 105–121.
  • Edwards, A. (2007). Relational agency in professional practice: A CHAT analysis. Actio: An International Journal of Human Activity Theory , 1 , 1–17.
  • Edwards, A. (2009). Agency and activity theory: From the systemic to the relational. In A. Sannino , H. Daniels , & K. Gutiérrez (Eds.), Learning and expanding with activity theory (pp. 197–211). Cambridge, U.K.: Cambridge University Press.
  • Engeström, Y. (1987). Learning by expanding . Helsinki, Finland: University of Helsinki, Department of Education.
  • Engeström, Y. (2000). Can people learn to master their future? Journal of the Learning Sciences , 9 , 525–534.
  • Engeström, Y. (2001). Expansive learning at work: Toward an activity theoretical reconceptualization. Journal of Education and Work , 14 (1), 133–156.
  • Engeström, Y. (2007). Enriching the theory of expansive learning: Lessons from journeys toward co-configuration. Mind, Culture, and Activity , 14 (1–2), 23–39.
  • Engeström, Y. (2008). Putting Vygotsky to work: The Change Laboratory as an application of double stimulation. In H. Daniels , M. Cole , & J. Wertsch (Eds.), Cambridge companion to Vygotsky (pp. 363–382). New York, NY: Cambridge University Press.
  • Engeström, Y. (2011). From design experiments to formative interventions. Theory & Psychology , 21 (5), 598–628.
  • Engeström, Y. , Engeström, R. , & Kärkkäinen, M. (1995). Polycontextuality and boundary crossing in expert cognition: Learning and problem solving in complex work activities. Learning and Instruction , 5 (4), 319–336.
  • Engeström, Y. , & Sannino, A. (2010). Studies of expansive learning: Foundations, findings and future challenges. Educational Research Review , 5 (1), 1–24.
  • Engeström, Y. , & Sannino, A. (2011). Discursive manifestations of contradictions in organizational change efforts: A methodological framework. Journal of Organizational Change Management , 24 (3), 368–387.
  • Engeström, Y. , Sannino, A. , & Virkkunen, J. (2014). On the methodological demands of formative interventions. Mind, Culture, and Activity , 21 (2), 118–128.
  • Erickson, F. , & Gutiérrez, K. (2002). Culture, rigor, and science in educational research. Educational Researcher , 31 (8), 21–24.
  • Espinoza, M. (2009). A case study of the production of educational sanctuary in one migrant classroom. Pedagogies: An International Journal , 4 (1), 44–62.
  • Espinoza, M. L. , & Vossoughi, S. (2014). Perceiving learning anew: Social interaction, dignity, and educational rights. Harvard Educational Review , 84 (3), 285–313.
  • Fine, M. (1994). Dis-tance and other stances: Negotiations of power inside feminist research. In A. Gitlin (Ed.), Power and method (pp. 13–25). New York, NY: Routledge.
  • Fishman, B. , Penuel, W. , Allen, A. , Cheng, B. , & Sabelli, N. (2013). Design-based implementation research: An emerging model for transforming the relationship of research and practice. National Society for the Study of Education , 112 (2), 136–156.
  • Gravemeijer, K. (1994). Educational development and developmental research in mathematics education. Journal for Research in Mathematics Education , 25 (5), 443–471.
  • Gutiérrez, K. (2005). Intersubjectivity and grammar in the third space . Scribner Award Lecture.
  • Gutiérrez, K. (2008). Developing a sociocritical literacy in the third space. Reading Research Quarterly , 43 (2), 148–164.
  • Gutiérrez, K. (2016). Designing resilient ecologies: Social design experiments and a new social imagination. Educational Researcher , 45 (3), 187–196.
  • Gutiérrez, K. , Bien, A. , Selland, M. , & Pierce, D. M. (2011). Polylingual and polycultural learning ecologies: Mediating emergent academic literacies for dual language learners. Journal of Early Childhood Literacy , 11 (2), 232–261.
  • Gutiérrez, K. , Engeström, Y. , & Sannino, A. (2016). Expanding educational research and interventionist methodologies. Cognition and Instruction , 34 (2), 275–284.
  • Gutiérrez, K. , & Jurow, A. S. (2016). Social design experiments: Toward equity by design. Journal of the Learning Sciences , 25 (4), 565–598.
  • Gutiérrez, K. , & Penuel, W. R. (2014). Relevance to practice as a criterion for rigor. Educational Researcher , 43 (1), 19–23.
  • Gutiérrez, K. , & Rogoff, B. (2003). Cultural ways of learning: Individual traits or repertoires of practice. Educational Researcher , 32 (5), 19–25.
  • Gutiérrez, K. , & Vossoughi, S. (2010). Lifting off the ground to return anew: Mediated praxis, transformative learning, and social design experiments. Journal of Teacher Education , 61 (1–2), 100–117.
  • Hall, R. , & Jurow, A. S. (2015). Changing concepts in activity: Descriptive and design studies of consequential learning in conceptual practices. Educational Psychologist , 50 (3), 173–189.
  • Harding, S. (1993). Rethinking standpoint epistemology: What is “strong objectivity”? In L. Alcoff & E. Potter (Eds.), Feminist epistemologies (pp. 49–82). New York, NY: Routledge.
  • Hoadley, C. (2002). Creating context: Design-based research in creating and understanding CSCL. In G. Stahl (Ed.), Computer support for collaborative learning 2002 (pp. 453–462). Mahwah, NJ: Lawrence Erlbaum.
  • Hoadley, C. (2004). Methodological alignment in design-based research. Educational Psychologist , 39 (4), 203–212.
  • Joseph, D. (2004). The practice of design-based research: Uncovering the interplay between design, research, and the real-world context. Educational Psychologist , 39 (4), 235–242.
  • Jurow, A. S. , & Shea, M. V. (2015). Learning in equity-oriented scale-making projects. Journal of the Learning Sciences , 24 (2), 286–307.
  • Jurow, S. , Tracy, R. , Hotchkiss, J. , & Kirshner, B. (2012). Designing for the future: How the learning sciences can inform the trajectories of preservice teachers. Journal of Teacher Education , 63 (2), 147–160.
  • Kärkkäinen, M. (1999). Teams as breakers of traditional work practices: A longitudinal study of planning and implementing curriculum units in elementary school teacher teams . Helsinki, Finland: University of Helsinki, Department of Education.
  • Kelly, A. (2004). Design research in education: Yes, but is it methodological? Journal of the Learning Sciences , 13 (1), 115–128.
  • Kelly, A. E. , & Sloane, F. C. (2003). Educational research and the problems of practice. Irish Educational Studies , 22 , 29–40.
  • Kirshner, B. (2015). Youth activism in an era of education inequality . New York: New York University Press.
  • Kirshner, B. , & Polman, J. L. (2013). Adaptation by design: A context-sensitive, dialogic approach to interventions. National Society for the Study of Education Yearbook , 112 (2), 215–236.
  • Leander, K. M. , Phillips, N. C. , & Taylor, K. H. (2010). The changing social spaces of learning: Mapping new mobilities. Review of Research in Education , 34 , 329–394.
  • Lesh, R. A. , & Kelly, A. E. (2000). Multi-tiered teaching experiments. In A. E. Kelly & R. A. Lesh (Eds.), Handbook of research design in mathematics and science education (pp. 197–230). Mahwah, NJ: Lawrence Erlbaum.
  • Matusov, E. (1996). Intersubjectivity without agreement. Mind, Culture, and Activity , 3 (1), 29–45.
  • Messick, S. (1992). The interplay of evidence and consequences in the validation of performance assessments. Educational Researcher , 23 (2), 13–23.
  • Mosteller, F. , & Boruch, R. F. (Eds.). (2002). Evidence matters: Randomized trials in education research . Washington, DC: Brookings Institution Press.
  • Newman, D. , Griffin, P. , & Cole, M. (1989). The construction zone: Working for cognitive change in school . London, U.K.: Cambridge University Press.
  • Penuel, W. R. , Fishman, B. J. , Cheng, B. H. , & Sabelli, N. (2011). Organizing research and development at the intersection of learning, implementation, and design. Educational Researcher , 40 (7), 331–337.
  • Polman, J. L. (2000). Designing project-based science: Connecting learners through guided inquiry . New York, NY: Teachers College Press.
  • Ravitch, D. (2010). The death and life of the great American school system: How testing and choice are undermining education . New York, NY: Basic Books.
  • Rogoff, B. (1990). Apprenticeship in thinking: Cognitive development in social context . New York, NY: Oxford University Press.
  • Rogoff, B. (1995). Observing sociocultural activity on three planes: Participatory appropriation, guided participation, and apprenticeship. In J. V. Wertsch , P. D. Rio , & A. Alvarez (Eds.), Sociocultural studies of mind (pp. 139–164). Cambridge U.K.: Cambridge University Press.
  • Saltman, K. J. (2007). Capitalizing on disaster: Taking and breaking public schools . Boulder, CO: Paradigm.
  • Salvador, T. , Bell, G. , & Anderson, K. (1999). Design ethnography. Design Management Journal , 10 (4), 35–41.
  • Sannino, A. (2011). Activity theory as an activist and interventionist theory. Theory & Psychology , 21 (5), 571–597.
  • Sannino, A. , & Engeström, Y. (2016). Relational agency, double stimulation and the object of activity: An intervention study in a primary school. In A. Edwards (Ed.), Working relationally in and across practices: Cultural-historical approaches to collaboration (pp. 58–77). Cambridge, U.K.: Cambridge University Press.
  • Scardamalia, M. , & Bereiter, C. (1991). Higher levels of agency for children in knowledge building: A challenge for the design of new knowledge media. Journal of the Learning Sciences , 1 , 37–68.
  • Schoenfeld, A. H. (1982). Measures of problem solving performance and of problem solving instruction. Journal for Research in Mathematics Education , 13 , 31–49.
  • Schoenfeld, A. H. (1985). Mathematical problem solving . Orlando, FL: Academic Press.
  • Schoenfeld, A. H. (1992). On paradigms and methods: What do you do when the ones you know don’t do what you want them to? Issues in the analysis of data in the form of videotapes. Journal of the Learning Sciences , 2 (2), 179–214.
  • Scribner, S. , & Cole, M. (1978). Literacy without schooling: Testing for intellectual effects. Harvard Educational Review , 48 (4), 448–461.
  • Shavelson, R. J. , Phillips, D. C. , Towne, L. , & Feuer, M. J. (2003). On the science of education design studies. Educational Researcher , 32 (1), 25–28.
  • Steffe, L. P. , & Thompson, P. W. (2000). Teaching experiment methodology: Underlying principles and essential elements. In A. Kelly & R. Lesh (Eds.), Handbook of research design in mathematics and science education (pp. 267–307). Mahwah, NJ: Erlbaum.
  • Stevens, R. (2000). Divisions of labor in school and in the workplace: Comparing computer and paper-supported activities across settings. Journal of the Learning Sciences , 9 (4), 373–401.
  • Suchman, L. (1995). Making work visible. Communications of the ACM , 38 (9), 57–64.
  • Vakil, S. , de Royston, M. M. , Nasir, N. , & Kirshner, B. (2016). Rethinking race and power in design-based research: Reflections from the field. Cognition and Instruction , 34 (3), 194–209.
  • van den Akker, J. (1999). Principles and methods of development research. In J. van den Akker , R. M. Branch , K. Gustafson , N. Nieveen , & T. Plomp (Eds.), Design approaches and tools in education and training (pp. 1–14). Boston, MA: Kluwer Academic.
  • Virkkunen, J. , & Newnham, D. (2013). The Change Laboratory: A tool for collaborative development of work and education . Rotterdam, The Netherlands: Sense.
  • White, B. Y. , & Frederiksen, J. R. (1998). Inquiry, modeling, and metacognition: Making science accessible to all students. Cognition and Instruction , 16 , 3–118.
  • Zavala, M. (2016). Design, participation, and social change: What design in grassroots spaces can teach learning scientists. Cognition and Instruction , 34 (3), 236–249.

1. The reader should note the emergence of critical ethnography (e.g., Carspecken, 1996 ; Fine, 1994 ), and other more participatory models of ethnography that deviated from this traditional paradigm during this same time period. These new forms of ethnography comprised part of the genealogy of the more critical approaches to DBR, described later in this article.

2. The reader will also note that the adjective “qualitative” largely drops away from the acronym “DBR.” This is because DBR, as an exploration of naturalistic ecologies with multitudes of variables and of social and learning dynamics, necessarily demands a move beyond what can be captured by quantitative measurement alone. The qualitative nature of the research is thus implied, embedded in what makes DBR a unique and distinct methodology.



Qualitative Research: Characteristics, Design, Methods & Examples


“Not everything that can be counted counts, and not everything that counts can be counted” (Albert Einstein)

Qualitative research is a process used for the systematic collection, analysis, and interpretation of non-numerical data (Punch, 2013). 

Qualitative research can be used to: (i) gain deep contextual understandings of the subjective social reality of individuals and (ii) answer questions about experience and meaning from the participant’s perspective (Hammarberg et al., 2016).

Unlike quantitative research, which focuses on gathering and analyzing numerical data for statistical analysis, qualitative research focuses on thematic and contextual information.

Characteristics of Qualitative Research 

Reality is socially constructed.

Qualitative research aims to understand how participants make meaning of their experiences – individually or in social contexts. It assumes that there is no single objective reality and that the social world is interpreted (Yilmaz, 2013).

The primacy of subject matter 

The primary aim of qualitative research is to understand the perspectives, experiences, and beliefs of individuals who have experienced the phenomenon selected for research rather than the average experiences of groups of people (Minichiello, 1990).

Variables are complex, interwoven, and difficult to measure

Factors such as experiences, behaviors, and attitudes are complex and interwoven, so they cannot be reduced to isolated variables, making them difficult to measure quantitatively.

However, a qualitative approach enables participants to describe what, why, or how they were thinking/ feeling during a phenomenon being studied (Yilmaz, 2013). 

Emic (insider’s point of view)

The phenomenon being studied is centered on the participants’ point of view (Minichiello, 1990).

Emic is used to describe how participants interact, communicate, and behave in the context of the research setting (Scarduzio, 2017).

Why Conduct Qualitative Research? 

In order to gain a deeper understanding of how people experience the world, individuals are studied in their natural setting. This enables the researcher to understand a phenomenon close to how participants experience it. 

Qualitative research allows researchers to gain an in-depth understanding, which is difficult to attain using quantitative methods. 

An in-depth understanding is attained since qualitative techniques allow participants to freely disclose their experiences, thoughts, and feelings without constraint (Tenny et al., 2022). 

This helps to further investigate and understand quantitative data by discovering reasons for the outcome of a study – answering the why question behind statistics. 

The exploratory nature of qualitative research helps to generate hypotheses that can then be tested quantitatively (Busetto et al., 2020).

Before hypotheses can be designed, qualitative methods are used to explore the topic and establish what is important enough to warrant research.

For example, researchers might conduct interviews or focus groups with key stakeholders to discover what is important to them.

Examples of qualitative research questions include: 

  • How does stress influence young adults’ behavior?
  • What factors influence students’ school attendance rates in developed countries?
  • How do adults interpret binge drinking in the UK?
  • What are the psychological impacts of cervical cancer screening in women?
  • How can mental health lessons be integrated into the school curriculum? 

Collecting Qualitative Data

There are four main research design methods used to collect qualitative data: observations, interviews, focus groups, and ethnography.

Observations

This method involves watching and recording phenomena as they occur in nature. Observation can be divided into two types: participant and non-participant observation.

In participant observation, the researcher actively participates in the situation/events being observed.

In non-participant observation, the researcher is not an active part of the observation and tries not to influence the behaviors they are observing (Busetto et al., 2020). 

Observations can be covert (participants are unaware that a researcher is observing them) or overt (participants are aware of the researcher’s presence and know they are being observed).

However, awareness of an observer’s presence may influence participants’ behavior. 

Interviews

Interviews give researchers a window into the world of a participant by seeking their account of an event, situation, or phenomenon. They are usually conducted on a one-to-one basis and can be distinguished according to the level at which they are structured (Punch, 2013).

Structured interviews involve predetermined questions and sequences to ensure replicability and comparability. However, they are unable to explore emerging issues.

Informal interviews consist of spontaneous, casual conversations that can come closer to participants’ unguarded accounts of a phenomenon. However, the information is gathered in quick notes made by the researcher and is therefore subject to recall bias.

Semi-structured interviews have a flexible structure, phrasing, and placement so emerging issues can be explored (Denny & Weckesser, 2022).

The use of probing questions and clarification can lead to a detailed understanding, but semi-structured interviews can be time-consuming and subject to interviewer bias. 

Focus groups 

Similar to interviews, focus groups elicit a rich and detailed account of an experience. However, focus groups are more dynamic since participants with shared characteristics construct this account together (Denny & Weckesser, 2022).

A shared narrative is built between participants to capture a group experience shaped by a shared context. 

The researcher takes on the role of a moderator, who will establish ground rules and guide the discussion by following a topic guide to focus the group discussions.

Typically, focus groups have 4–10 participants: a discussion is difficult to facilitate with more than this, and this size allows everyone time to speak.

Ethnography

Ethnography is a methodology used to study a group of people’s behaviors and social interactions in their environment (Reeves et al., 2008).

Data are collected using methods such as observations, field notes, or structured/ unstructured interviews.

The aim of ethnography is to provide detailed, holistic insights into people’s behavior and perspectives within their natural setting. In order to achieve this, researchers immerse themselves in a community or organization. 

Due to the flexibility and real-world focus of ethnography, researchers are able to gather an in-depth, nuanced understanding of people’s experiences, knowledge and perspectives that are influenced by culture and society.

In order to develop a representative picture of a particular culture/ context, researchers must conduct extensive field work. 

This can be time-consuming as researchers may need to immerse themselves into a community/ culture for a few days, or possibly a few years.

Qualitative Data Analysis Methods

Different methods can be used for analyzing qualitative data. The researcher chooses based on the objectives of their study. 

The researcher plays a key role in the interpretation of data, making decisions about the coding, theming, decontextualizing, and recontextualizing of data (Starks & Trinidad, 2007). 

Grounded theory

Grounded theory is a qualitative method specifically designed to inductively generate theory from data. It was developed by Glaser and Strauss in 1967 (Glaser & Strauss, 2017).

 This methodology aims to develop theories (rather than test hypotheses) that explain a social process, action, or interaction (Petty et al., 2012). To inform the developing theory, data collection and analysis run simultaneously. 

There are three key types of coding used in grounded theory: initial (open), intermediate (axial), and advanced (selective) coding. 

Throughout the analysis, memos should be created to document methodological and theoretical ideas about the data. Data should be collected and analyzed until data saturation is reached and a theory is developed. 

Content analysis

Content analysis was first used in the early twentieth century to analyze textual materials such as newspapers and political speeches.

Content analysis is a research method used to identify and analyze the presence and patterns of themes, concepts, or words in data (Vaismoradi et al., 2013). 

This research method can be used to analyze data in different formats, which can be written, oral, or visual. 

The goal of content analysis is to develop themes that capture the underlying meanings of data (Schreier, 2012). 

Qualitative content analysis can be used to validate existing theories, support the development of new models and theories, and provide in-depth descriptions of particular settings or experiences.

The following six steps provide a guideline for how to conduct qualitative content analysis; a brief code sketch illustrating the coding and tallying steps follows the list.
  • Define a Research Question : To start content analysis, a clear research question should be developed.
  • Identify and Collect Data : Establish the inclusion criteria for your data. Find the relevant sources to analyze.
  • Define the Unit or Theme of Analysis : Categorize the content into themes. Themes can be a word, phrase, or sentence.
  • Develop Rules for Coding your Data : Define a set of coding rules to ensure that all data are coded consistently.
  • Code the Data : Follow the coding rules to categorize data into themes.
  • Analyze the Results and Draw Conclusions : Examine the data to identify patterns and draw conclusions in relation to your research question.
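To make the coding and analysis steps concrete, here is a minimal Python sketch of rule-based coding and theme tallying. All segments, theme names, and keyword rules are hypothetical, and real coding rules would be far richer than simple keyword matching:

```python
from collections import Counter

# Hypothetical coding rules: each theme is triggered by a set of keywords.
CODING_RULES = {
    "access_to_care": {"appointment", "waiting list", "clinic"},
    "emotional_impact": {"anxious", "worried", "relieved"},
}

# Hypothetical units of analysis (e.g., sentences from interview transcripts).
segments = [
    "I felt anxious while I was on the waiting list.",
    "Getting an appointment at the clinic was easy.",
    "I was so relieved when the results came back.",
]

def code_segment(segment: str) -> list[str]:
    """Return every theme whose keywords appear in the segment."""
    text = segment.lower()
    return [theme for theme, keywords in CODING_RULES.items()
            if any(keyword in text for keyword in keywords)]

# Apply the coding rules consistently across all data, then tally themes.
theme_counts = Counter(theme for s in segments for theme in code_segment(s))
print(theme_counts)
# Counter({'access_to_care': 2, 'emotional_impact': 2})
```

In practice, counts like these would be read alongside the coded excerpts themselves, since content analysis aims at underlying meanings, not word frequencies alone.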

Discourse analysis

Discourse analysis is a research method used to study written/ spoken language in relation to its social context (Wood & Kroger, 2000).

In discourse analysis, the researcher interprets details of language materials and the context in which it is situated.

Discourse analysis aims to understand the functions of language (how language is used in real life) and how meaning is conveyed by language in different contexts. Researchers use discourse analysis to investigate social groups and how language is used to achieve specific communication goals.

Different methods of discourse analysis can be used depending on the aims and objectives of a study. However, the following steps provide a guideline on how to conduct discourse analysis.
  • Define the Research Question : Develop a relevant research question to frame the analysis.
  • Gather Data and Establish the Context : Collect research materials (e.g., interview transcripts, documents). Gather factual details and review the literature to construct a theory about the social and historical context of your study.
  • Analyze the Content : Closely examine various components of the text, such as the vocabulary, sentences, paragraphs, and structure of the text. Identify patterns relevant to the research question to create codes, then group these into themes.
  • Review the Results : Reflect on the findings to examine the function of the language, and the meaning and context of the discourse. 

Thematic analysis

Thematic analysis is a method used to identify, interpret, and report patterns in data, such as commonalities or contrasts. 

Although the origins of thematic analysis can be traced back to the early twentieth century, its contemporary formulation and clarity are widely attributed to Braun and Clarke (2006).

Thematic analysis aims to develop themes (patterns of meaning) across a dataset to address a research question. 

In thematic analysis, qualitative data is gathered using techniques such as interviews, focus groups, and questionnaires. Audio recordings are transcribed. The dataset is then explored and interpreted by a researcher to identify patterns. 

This occurs through the rigorous process of data familiarisation, coding, theme development, and revision. These identified patterns provide a summary of the dataset and can be used to address a research question.

Themes are developed by exploring the implicit and explicit meanings within the data. Two different approaches are used to generate themes: inductive and deductive. 

An inductive approach allows themes to emerge from the data. In contrast, a deductive approach uses existing theories or knowledge to apply preconceived ideas to the data.

Phases of Thematic Analysis

Braun and Clarke (2006) provide a guide to the six phases of thematic analysis: (1) familiarizing yourself with the data, (2) generating initial codes, (3) searching for themes, (4) reviewing themes, (5) defining and naming themes, and (6) producing the report. These phases can be applied flexibly to fit research questions and data.

Template analysis

Template analysis refers to a specific method of thematic analysis which uses hierarchical coding (Brooks et al., 2014).

Template analysis is used to analyze textual data, for example, interview transcripts or open-ended responses on a written questionnaire.

To conduct template analysis, a coding template must be developed (usually from a subset of the data) and subsequently revised and refined. This template represents the themes identified by researchers as important in the dataset. 

Codes are ordered hierarchically within the template, with the highest-level codes demonstrating overarching themes in the data and lower-level codes representing constituent themes with a narrower focus.

A guideline for the main procedural steps for conducting template analysis is outlined below; a minimal sketch of what a hierarchical template might look like in code follows the list.
  • Familiarization with the Data : Read (and reread) the dataset in full. Engage, reflect, and take notes on data that may be relevant to the research question.
  • Preliminary Coding : Identify initial codes using guidance from the a priori codes, identified before the analysis as likely to be beneficial and relevant to the analysis.
  • Organize Themes : Organize themes into meaningful clusters. Consider the relationships between the themes both within and between clusters.
  • Produce an Initial Template : Develop an initial template. This may be based on a subset of the data.
  • Apply and Develop the Template : Apply the initial template to further data and make any necessary modifications. Refinements of the template may include adding themes, removing themes, or changing the scope/title of themes. 
  • Finalize Template : Finalize the template, then apply it to the entire dataset. 
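As an illustration only, a hierarchical coding template can be represented as a nested structure, with top-level keys for overarching themes and nested keys for narrower constituent themes. Every theme and code name here is hypothetical:

```python
# A minimal sketch of a hierarchical coding template as a nested dictionary.
# Top-level keys are overarching themes; nested keys are narrower
# constituent themes; leaf lists hold lower-level codes.
initial_template = {
    "barriers_to_treatment": {
        "practical": ["cost", "travel distance"],
        "psychological": ["fear of diagnosis", "stigma"],
    },
    "sources_of_support": {
        "formal": ["GP", "counsellor"],
        "informal": ["family", "friends"],
    },
}

# Revising the template as further data are coded might mean adding,
# removing, or re-scoping themes, e.g.:
initial_template["barriers_to_treatment"]["informational"] = [
    "unclear referral process",
]
```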

Frame analysis

Frame analysis is a comparative form of thematic analysis which systematically analyzes data using a matrix output.

Ritchie and Spencer (1994) developed this set of techniques to analyze qualitative data in applied policy research. Frame analysis aims to generate theory from data.

Frame analysis encourages researchers to organize and manage their data using summarization.

This results in a flexible and unique matrix output, in which individual participants (or cases) are represented by rows and themes are represented by columns. 

Each intersecting cell is used to summarize findings relating to the corresponding participant and theme.
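To picture the matrix output, here is a minimal sketch using pandas (assuming it is available); the participants, themes, and cell summaries are all hypothetical:

```python
import pandas as pd

# Rows are participants (cases); columns are themes. Each cell holds a brief
# summary of that participant's account for that theme.
matrix = pd.DataFrame(
    {
        "diagnosis_experience": {
            "P01": "Shock; felt unprepared for the news.",
            "P02": "Suspected it for months; diagnosis was a relief.",
        },
        "family_support": {
            "P01": "Partner attended every appointment.",
            "P02": "Kept diagnosis private from relatives.",
        },
    }
)

# Reading down a column compares all participants on one theme; reading
# across a row summarizes one participant's whole account.
print(matrix.loc["P01", "family_support"])
# Partner attended every appointment.
```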

Frame analysis has five distinct phases which are interrelated, forming a methodical and rigorous framework.
  • Familiarization with the Data : Familiarize yourself with all the transcripts. Immerse yourself in the details of each transcript and start to note recurring themes.
  • Develop a Theoretical Framework : Identify recurrent/ important themes and add them to a chart. Provide a framework/ structure for the analysis.
  • Indexing : Apply the framework systematically to the entire study data.
  • Summarize Data in Analytical Framework : Reduce the data into brief summaries of participants’ accounts.
  • Mapping and Interpretation : Compare themes and subthemes and check against the original transcripts. Group the data into categories and provide an explanation for them.

Preventing Bias in Qualitative Research

To evaluate qualitative studies, the CASP (Critical Appraisal Skills Programme) checklist for qualitative studies can be used to ensure all aspects of a study have been considered (CASP, 2018).

The quality of research can be enhanced and assessed using strategies such as checklists, reflexivity, co-coding, and member checking.

Co-coding 

Relying on only one researcher to interpret rich and complex data may risk key insights and alternative viewpoints being missed. Therefore, coding is often performed by multiple researchers.

A common strategy must be defined at the beginning of the coding process  (Busetto et al., 2020). This includes establishing a useful coding list and finding a common definition of individual codes.

Transcripts are initially coded independently by researchers and then compared and consolidated to minimize error or bias and to bring confirmation of findings. 
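One simple way to compare independently coded transcripts before consolidation is to compute agreement statistics. The sketch below calculates percent agreement and Cohen's kappa from scratch; the two coders' labels are hypothetical:

```python
# Hypothetical codes assigned independently by two researchers to the
# same ten transcript segments.
coder_a = ["support", "barrier", "barrier", "support", "cost",
           "support", "barrier", "cost", "support", "barrier"]
coder_b = ["support", "barrier", "support", "support", "cost",
           "support", "barrier", "cost", "barrier", "barrier"]

n = len(coder_a)
observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n

# Expected chance agreement, from each coder's marginal proportions.
labels = set(coder_a) | set(coder_b)
expected = sum(
    (coder_a.count(label) / n) * (coder_b.count(label) / n)
    for label in labels
)

# Cohen's kappa corrects observed agreement for chance agreement.
kappa = (observed - expected) / (1 - expected)
print(f"Percent agreement: {observed:.0%}, Cohen's kappa: {kappa:.2f}")
# Percent agreement: 80%, Cohen's kappa: 0.69
```

Segments where the coders disagree (here, the third and ninth) would then be discussed until a consolidated code is agreed.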

Member checking

Member checking (or respondent validation) involves checking back with participants to see if the research resonates with their experiences (Russell & Gregory, 2003).

Data can be returned to participants after data collection or when results are first available. For example, participants may be provided with their interview transcript and asked to verify whether this is a complete and accurate representation of their views.

Participants may then clarify or elaborate on their responses to ensure they align with their views (Shenton, 2004).

This feedback becomes part of data collection and ensures accurate descriptions/ interpretations of phenomena (Mays & Pope, 2000). 

Reflexivity in qualitative research

Reflexivity typically involves examining your own judgments, practices, and belief systems during data collection and analysis. It aims to identify any personal beliefs which may affect the research. 

Reflexivity is essential in qualitative research to ensure methodological transparency and complete reporting. This enables readers to understand how the interaction between the researcher and participant shapes the data.

Depending on the research question and population being researched, factors that need to be considered include the experience of the researcher, how the contact was established and maintained, age, gender, and ethnicity.

These details are important because, in qualitative research, the researcher is a dynamic part of the research process and actively influences the outcome of the research (Boeije, 2014). 

Reflexivity Example

Who you are and your characteristics influence how you collect and analyze data. Here is an example of a reflexivity statement for research on smoking:

I am a 30-year-old white female from a middle-class background. I live in the southwest of England and have been educated to master’s level. I have been involved in two research projects on oral health. I have never smoked, but I have witnessed how smoking can cause ill health from my volunteering in a smoking cessation clinic. My research aspirations are to help to develop interventions to help smokers quit.

Establishing Trustworthiness in Qualitative Research

Trustworthiness is a concept used to assess the quality and rigor of qualitative research. Four criteria are used to assess a study’s trustworthiness: credibility, transferability, dependability, and confirmability.

Credibility in Qualitative Research

Credibility refers to how accurately the results represent the reality and viewpoints of the participants.

To establish credibility in research, participants’ views and the researcher’s representation of their views need to align (Tobin & Begley, 2004).

To increase the credibility of findings, researchers may use data source triangulation, investigator triangulation, peer debriefing, or member checking (Lincoln & Guba, 1985). 

Transferability in Qualitative Research

Transferability refers to how generalizable the findings are: whether the findings may be applied to another context, setting, or group (Tobin & Begley, 2004).

Transferability can be enhanced by giving thorough and in-depth descriptions of the research setting, sample, and methods (Nowell et al., 2017). 

Dependability in Qualitative Research

Dependability is the extent to which the study could be replicated under similar conditions and the findings would be consistent.

Researchers can establish dependability using methods such as audit trails so readers can see the research process is logical and traceable (Koch, 1994).

Confirmability in Qualitative Research

Confirmability is concerned with establishing that there is a clear link between the researcher’s interpretations/ findings and the data.

Researchers can achieve confirmability by demonstrating how conclusions and interpretations were arrived at (Nowell et al., 2017).

This enables readers to understand the reasoning behind the decisions made. 

Audit Trails in Qualitative Research

An audit trail provides evidence of the decisions made by the researcher regarding theory, research design, and data collection, as well as the steps they have chosen to manage, analyze, and report data. 

The researcher must provide a clear rationale to demonstrate how conclusions were reached in their study.

A clear description of the research path must be provided to enable readers to trace through the researcher’s logic (Halpern, 1983).

Researchers should maintain records of the raw data, field notes, transcripts, and a reflective journal in order to provide a clear audit trail. 

Advantages

Discovery of unexpected data

Open-ended questions in qualitative research mean the researcher can probe an interview topic and enable the participant to elaborate on responses in an unrestricted manner.

This allows unexpected data to emerge, which can lead to further research into that topic. 

Flexibility

Data collection and analysis can be modified and adapted to take the research in a different direction if new ideas or patterns emerge in the data.

This enables researchers to investigate new opportunities while firmly maintaining their research goals. 

Naturalistic settings

The behaviors of participants are recorded in real-world settings. Studies that use real-world settings have high ecological validity since participants behave more authentically. 

Limitations

Time-consuming

Qualitative research results in large amounts of data which often need to be transcribed and analyzed manually.

Even when software is used, transcription can be inaccurate, and using software for analysis can result in many codes which need to be condensed into themes. 

Subjectivity 

The researcher has an integral role in collecting and interpreting qualitative data. Therefore, the conclusions reached are from their perspective and experience.

Consequently, interpretations of data from another researcher may vary greatly. 

Limited generalizability

The aim of qualitative research is to provide a detailed, contextualized understanding of an aspect of the human experience from a relatively small sample size.

Despite rigorous analysis procedures, conclusions drawn cannot be generalized to the wider population since data may be biased or unrepresentative.

Therefore, results are only applicable to a small group of the population. 

Extraneous variables

Qualitative research is often conducted in real-world settings. This may cause results to be unreliable since extraneous variables may affect the data, for example:

  • Situational variables : different environmental conditions may influence participants’ behavior in a study. The random variation in factors (such as noise or lighting) may be difficult to control in real-world settings.
  • Participant characteristics : this includes any characteristics that may influence how a participant answers/ behaves in a study. This may include a participant’s mood, gender, age, ethnicity, sexual identity, IQ, etc.
  • Experimenter effect : experimenter effect refers to how a researcher’s unintentional influence can change the outcome of a study. This occurs when (i) their interactions with participants unintentionally change participants’ behaviors or (ii) due to errors in observation, interpretation, or analysis. 

What sample size should qualitative research be?

The sample size for qualitative studies has been recommended to include a minimum of 12 participants to reach data saturation (Clarke & Braun, 2013).

Are surveys qualitative or quantitative?

Surveys can be used to gather information from a sample qualitatively or quantitatively. Qualitative surveys use open-ended questions to gather detailed information from a large sample using free text responses.

The use of open-ended questions allows for unrestricted responses where participants use their own words, enabling the collection of more in-depth information than closed-ended questions.

In contrast, quantitative surveys consist of closed-ended questions with multiple-choice answer options. Quantitative surveys are ideal to gather a statistical representation of a population.

What are the ethical considerations of qualitative research?

Before conducting a study, you must think about any risks that could occur and take steps to prevent them.

  • Participant Protection : Researchers must protect participants from physical and mental harm. This means you must not embarrass, frighten, offend, or harm participants.
  • Transparency : Researchers are obligated to clearly communicate how they will collect, store, analyze, use, and share the data.
  • Confidentiality : You need to consider how to maintain the confidentiality and anonymity of participants’ data.

What is triangulation in qualitative research?

Triangulation refers to the use of several approaches in a study to comprehensively understand phenomena. This method helps to increase the validity and credibility of research findings. 

Types of triangulation include method triangulation (using multiple methods to gather data); investigator triangulation (multiple researchers for collecting/ analyzing data), theory triangulation (comparing several theoretical perspectives to explain a phenomenon), and data source triangulation (using data from various times, locations, and people; Carter et al., 2014).

Why is qualitative research important?

Qualitative research allows researchers to describe and explain the social world. The exploratory nature of qualitative research helps to generate hypotheses that can then be tested quantitatively.

In qualitative research, participants are able to express their thoughts, experiences, and feelings without constraint.

Additionally, researchers are able to follow up on participants’ answers in real-time, generating valuable discussion around a topic. This enables researchers to gain a nuanced understanding of phenomena which is difficult to attain using quantitative methods.

What is coding data in qualitative research?

Coding data is a qualitative data analysis strategy in which a section of text is assigned with a label that describes its content.

These labels may be words or phrases which represent important (and recurring) patterns in the data.

This process enables researchers to identify related content across the dataset. Codes can then be used to group similar types of data to generate themes.
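As a minimal illustration, coded segments can be grouped under themes via their code labels. This Python sketch uses entirely hypothetical codes, excerpts, and themes from a study of binge drinking:

```python
from collections import defaultdict

# Hypothetical coded segments: (code, excerpt) pairs produced during analysis.
coded_segments = [
    ("peer_pressure", "Everyone in my halls was going out, so I went too."),
    ("coping", "Drinking helped me switch off after exams."),
    ("peer_pressure", "I didn't want to be the only one saying no."),
    ("cost", "Cheap drinks offers made it easy to overdo it."),
]

# Hypothetical theme definitions grouping related codes.
themes = {
    "social_influences": {"peer_pressure"},
    "individual_motivations": {"coping", "cost"},
}

# Group excerpts under themes via their codes.
by_theme = defaultdict(list)
for code, excerpt in coded_segments:
    for theme, codes in themes.items():
        if code in codes:
            by_theme[theme].append(excerpt)

for theme, excerpts in by_theme.items():
    print(theme, len(excerpts))
# social_influences 2
# individual_motivations 2
```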

What is the difference between qualitative and quantitative research?

Qualitative research involves the collection and analysis of non-numerical data in order to understand experiences and meanings from the participant’s perspective.

This can provide rich, in-depth insights on complicated phenomena. Qualitative data may be collected using interviews, focus groups, or observations.

In contrast, quantitative research involves the collection and analysis of numerical data to measure the frequency, magnitude, or relationships of variables. This can provide objective and reliable evidence that can be generalized to the wider population.

Quantitative data may be collected using closed-ended questionnaires or experiments.

What is trustworthiness in qualitative research?

Trustworthiness is a concept used to assess the quality and rigor of qualitative research. Four criteria are used to assess a study’s trustworthiness: credibility, transferability, dependability, and confirmability. 

Credibility refers to how accurately the results represent the reality and viewpoints of the participants. Transferability refers to whether the findings may be applied to another context, setting, or group.

Dependability is the extent to which the findings are consistent and reliable. Confirmability refers to the objectivity of findings (not influenced by the bias or assumptions of researchers).

What is data saturation in qualitative research?

Data saturation is a methodological principle used to guide the sample size of a qualitative research study.

Data saturation is proposed as a necessary methodological component in qualitative research (Saunders et al., 2018) as it is a vital criterion for discontinuing data collection and/or analysis. 

The intention of data saturation is to find “no new data, no new themes, no new coding, and ability to replicate the study” (Guest et al., 2006). At that point, enough data has been gathered to support the study’s conclusions.
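One informal way to monitor saturation during analysis is to track how many previously unseen codes each successive interview contributes. The sketch below is purely illustrative; the codes and the two-interview stopping rule are hypothetical, not a prescribed threshold:

```python
# Hypothetical codes identified in each successive interview.
codes_per_interview = [
    {"stigma", "cost", "family"},          # interview 1
    {"cost", "waiting_times", "family"},   # interview 2
    {"stigma", "trust"},                   # interview 3
    {"family", "cost"},                    # interview 4
    {"trust", "stigma"},                   # interview 5
]

seen: set[str] = set()
for i, codes in enumerate(codes_per_interview, start=1):
    new = codes - seen          # codes not seen in any earlier interview
    seen |= codes
    print(f"Interview {i}: {len(new)} new code(s)")
# Interview 1: 3 new code(s)
# Interview 2: 1 new code(s)
# Interview 3: 1 new code(s)
# Interview 4: 0 new code(s)
# Interview 5: 0 new code(s)

# A (hypothetical) stopping heuristic: if the last two interviews yielded
# no new codes, treat the dataset as approaching saturation.
```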

Why is sampling in qualitative research important?

In quantitative research, large sample sizes are used to provide statistically significant quantitative estimates.

This is because quantitative research aims to provide generalizable conclusions that represent populations.

However, the aim of sampling in qualitative research is to gather data that will help the researcher understand the depth, complexity, variation, or context of a phenomenon. The small sample sizes in qualitative studies support the depth of case-oriented analysis.

Boeije, H. (2014). Analysis in qualitative research. Sage.

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative research in psychology , 3 (2), 77-101. https://doi.org/10.1191/1478088706qp063oa

Brooks, J., McCluskey, S., Turley, E., & King, N. (2014). The utility of template analysis in qualitative psychology research. Qualitative Research in Psychology , 12 (2), 202–222. https://doi.org/10.1080/14780887.2014.955224

Busetto, L., Wick, W., & Gumbinger, C. (2020). How to use and assess qualitative research methods. Neurological research and practice , 2 (1), 14-14. https://doi.org/10.1186/s42466-020-00059-z 

Carter, N., Bryant-Lukosius, D., DiCenso, A., Blythe, J., & Neville, A. J. (2014). The use of triangulation in qualitative research. Oncology nursing forum , 41 (5), 545–547. https://doi.org/10.1188/14.ONF.545-547

Critical Appraisal Skills Programme. (2018). CASP Checklist: 10 questions to help you make sense of a Qualitative research. https://casp-uk.net/images/checklist/documents/CASP-Qualitative-Studies-Checklist/CASP-Qualitative-Checklist-2018_fillable_form.pdf Accessed: March 15 2023

Clarke, V., & Braun, V. (2013). Successful qualitative research: A practical guide for beginners. Sage.

Denny, E., & Weckesser, A. (2022). How to do qualitative research?: Qualitative research methods. BJOG : an international journal of obstetrics and gynaecology , 129 (7), 1166-1167. https://doi.org/10.1111/1471-0528.17150 

Glaser, B. G., & Strauss, A. L. (2017). The discovery of grounded theory. The Discovery of Grounded Theory , 1–18. https://doi.org/10.4324/9780203793206-1

Guest, G., Bunce, A., & Johnson, L. (2006). How many interviews are enough? An experiment with data saturation and variability. Field Methods, 18 (1), 59-82. doi:10.1177/1525822X05279903

Halpern, E. S. (1983). Auditing naturalistic inquiries: The development and application of a model (Unpublished doctoral dissertation). Indiana University, Bloomington.

Hammarberg, K., Kirkman, M., & de Lacey, S. (2016). Qualitative research methods: When to use them and how to judge them. Human Reproduction , 31 (3), 498–501. https://doi.org/10.1093/humrep/dev334

Koch, T. (1994). Establishing rigour in qualitative research: The decision trail. Journal of Advanced Nursing, 19, 976–986. doi:10.1111/ j.1365-2648.1994.tb01177.x

Lincoln, Y., & Guba, E. G. (1985). Naturalistic inquiry. Newbury Park, CA: Sage.

Mays, N., & Pope, C. (2000). Assessing quality in qualitative research. BMJ, 320(7226), 50–52.

Minichiello, V. (1990). In-Depth Interviewing: Researching People. Longman Cheshire.

Nowell, L. S., Norris, J. M., White, D. E., & Moules, N. J. (2017). Thematic Analysis: Striving to Meet the Trustworthiness Criteria. International Journal of Qualitative Methods, 16 (1). https://doi.org/10.1177/1609406917733847

Petty, N. J., Thomson, O. P., & Stew, G. (2012). Ready for a paradigm shift? part 2: Introducing qualitative research methodologies and methods. Manual Therapy , 17 (5), 378–384. https://doi.org/10.1016/j.math.2012.03.004

Punch, K. F. (2013). Introduction to social research: Quantitative and qualitative approaches. London: Sage

Reeves, S., Kuper, A., & Hodges, B. D. (2008). Qualitative research methodologies: Ethnography. BMJ , 337 (aug07 3). https://doi.org/10.1136/bmj.a1020

Russell, C. K., & Gregory, D. M. (2003). Evaluation of qualitative research studies. Evidence Based Nursing, 6 (2), 36–40.

Saunders, B., Sim, J., Kingstone, T., Baker, S., Waterfield, J., Bartlam, B., Burroughs, H., & Jinks, C. (2018). Saturation in qualitative research: exploring its conceptualization and operationalization. Quality & quantity , 52 (4), 1893–1907. https://doi.org/10.1007/s11135-017-0574-8

Scarduzio, J. A. (2017). Emic approach to qualitative research. The International Encyclopedia of Communication Research Methods, 1–2 . https://doi.org/10.1002/9781118901731.iecrm0082

Schreier, M. (2012). Qualitative content analysis in practice. Sage.

Shenton, A. K. (2004). Strategies for ensuring trustworthiness in qualitative research projects. Education for Information, 22 , 63–75.

Starks, H., & Trinidad, S. B. (2007). Choose your method: a comparison of phenomenology, discourse analysis, and grounded theory. Qualitative health research , 17 (10), 1372–1380. https://doi.org/10.1177/1049732307307031

Tenny, S., Brannan, J. M., & Brannan, G. D. (2022). Qualitative Study. In StatPearls. StatPearls Publishing.

Tobin, G. A., & Begley, C. M. (2004). Methodological rigour within a qualitative framework. Journal of Advanced Nursing, 48, 388–396. doi:10.1111/j.1365-2648.2004.03207.x

Vaismoradi, M., Turunen, H., & Bondas, T. (2013). Content analysis and thematic analysis: Implications for conducting a qualitative descriptive study. Nursing & health sciences , 15 (3), 398-405. https://doi.org/10.1111/nhs.12048

Wood L. A., Kroger R. O. (2000). Doing discourse analysis: Methods for studying action in talk and text. Sage.

Yilmaz, K. (2013). Comparison of Quantitative and Qualitative Research Traditions: epistemological, theoretical, and methodological differences. European journal of education , 48 (2), 311-325. https://doi.org/10.1111/ejed.12014


Qualitative Research

What Is Qualitative Research?

Qualitative research is the methodology researchers use to gain deep contextual understandings of users via non-numerical means and direct observations. Researchers focus on smaller user samples—e.g., in interviews—to reveal data such as user attitudes, behaviors and hidden factors: insights which guide better designs.

“There are also unknown unknowns, things we don’t know we don’t know.” — Donald Rumsfeld, Former U.S. Secretary of Defense

See how you can use qualitative research to expose hidden truths about users and iteratively shape better products.

Qualitative Research Focuses on the “Why”

Qualitative research is a subset of user experience (UX) research and user research. By doing qualitative research, you aim to gain narrowly focused but rich information about why users feel and think the ways they do. Unlike its more statistics-oriented “counterpart”, quantitative research, qualitative research can help expose hidden truths about your users’ motivations, hopes, needs, pain points and more, and so keep your project’s focus on track throughout development. UX design professionals typically do qualitative research from early on in projects: because the insights it reveals can alter product development dramatically, it can prevent costly design errors from arising later. Compare and contrast qualitative with quantitative research below:

  • You aim to determine – Qualitative: the “why”, to get behind how users approach their problems in their world. Quantitative: the “what”, “where” and “when” of the users’ needs and problems, to help keep your project’s focus on track during development.

  • Structure – Qualitative: loosely structured (e.g., contextual inquiries), to learn why users behave how they do and explore their opinions. Quantitative: highly structured (e.g., surveys), to gather data about what users do and find patterns in large user groups.

  • Number of representative users – Qualitative: often around 5. Quantitative: ideally 30+.

  • Level of contact with users – Qualitative: more direct and less remote (e.g., usability testing to examine users’ stress levels when they use your design). Quantitative: less direct and more remote (e.g., analytics).

  • Statistical reliability – Qualitative: you need to take great care with handling non-numerical data (e.g., opinions), as your own opinions might influence findings. Quantitative: reliable, given enough test users.

Regarding care with opinions, it’s easy to be subjective about qualitative data, which isn’t as comprehensively analyzable as quantitative data. That’s why design teams also apply quantitative research methods, to reinforce the “why” with the “what”.

Qualitative Research Methods You Can Use to Get Behind Your Users

You have a choice of many methods to help gain the clearest insights into your users’ world – which you might want to complement with quantitative research methods. In iterative processes such as user-centered design , you/your design team would use quantitative research to spot design problems, discover the reasons for these with qualitative research, make changes and then test your improved design on users again. The best method/s to pick will depend on the stage of your project and your objectives. Here are some:

Diary studies – You ask users to document their activities, interactions, etc. over a defined period. This empowers users to deliver context-rich information. Although such studies can be subjective—since users will inevitably be influenced by in-the-moment human issues and their emotions—they’re helpful tools to access generally authentic information.

Interviews – You talk with users directly; common forms include:

Structured – You ask users specific questions and analyze their responses alongside those of other users.

Semi-structured – You have a more free-flowing conversation with users, but still follow a prepared script loosely.

Ethnographic – You interview users in their own environment to appreciate how they perform tasks and view aspects of tasks.


Usability testing – You watch users attempt tasks with your design; common forms include:

Moderated – In-person testing in, e.g., a lab.

Unmoderated – Users complete tests remotely without a facilitator: e.g., through an online testing platform.

Guerrilla – “Down-the-hall”/“down-and-dirty” testing on a small group of random users or colleagues.


User observation – You watch users get to grips with your design and note their actions, words and reactions as they attempt to perform tasks.


Qualitative research can be more or less structured depending on the method.

Qualitative Research – How to Get Reliable Results

Some helpful points to remember are:

Participants – Select a number of test users carefully (typically around 5). Observe the finer points such as body language. Remember the difference between what they do and what they say they do.

Moderated vs. unmoderated – You can obtain the richest data from moderated studies, but these can involve considerable time and practice. You can usually conduct unmoderated studies more quickly and cheaply, but you should plan these carefully to ensure instructions are clear, etc.

Types of questions – You’ll learn far more by asking open-ended questions. Avoid leading users’ answers – ask about their experience during, say, the “search for deals” process rather than how easy it was. Try to frame questions so users respond honestly: i.e., so they don’t withhold grievances about their experience because they don’t want to seem impolite. Distorted feedback may also arise in guerrilla testing, as test users may be reluctant to sound negative or to discuss fine details if they lack time.

Location – Think how where users are might affect their performance and responses. If, for example, users’ tasks involve running or traveling on a train, select the appropriate method (e.g., diary studies for them to record aspects of their experience in the environment of a train carriage and the many factors impacting it).

Overall, no single research method can help you answer all your questions. Nevertheless, the Nielsen Norman Group advises that if you only conduct one kind of user research, you should pick qualitative usability testing, since a small sample size can yield many cost- and project-saving insights. Always treat users and their data ethically. Finally, remember the importance of complementing qualitative methods with quantitative ones: you gain insights from the former; you test those using the latter.
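The “often around 5” sample size above, and the Nielsen Norman Group’s advice, trace back to the problem-discovery model popularised by Nielsen and Landauer: the share of usability problems surfaced by n test users is estimated as 1 − (1 − λ)^n, where λ is the chance that a single user exposes a given problem (about 0.31 on average in their data). Here is a minimal Python sketch of that arithmetic, with λ treated as an assumption you would calibrate to your own product:

```python
# Problem-discovery model (after Nielsen & Landauer): expected share of
# usability problems surfaced by n test users, assuming each user
# independently exposes any given problem with probability lam.
def problems_found(n: int, lam: float = 0.31) -> float:
    return 1 - (1 - lam) ** n

if __name__ == "__main__":
    for n in (1, 3, 5, 10, 15):
        print(f"{n:2d} users -> ~{problems_found(n):.0%} of problems found")
```

With λ = 0.31, five users already surface roughly 84% of problems, which is why small qualitative samples are so cost-effective; returns diminish quickly beyond that.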



Chapter 5: Qualitative descriptive research

Darshini Ayton

Learning outcomes

Upon completion of this chapter, you should be able to:

  • Identify the key terms and concepts used in qualitative descriptive research.
  • Discuss the advantages and disadvantages of qualitative descriptive research.

What is a qualitative descriptive study?

The key concept of the qualitative descriptive study is description.

Qualitative descriptive studies (also known as ‘exploratory studies’ and ‘qualitative description approaches’) are relatively new in the qualitative research landscape. They emerged predominantly in the field of nursing and midwifery over the past two decades. 1 The design evolved as a way to classify studies that did not fit neatly within the established qualitative research designs, even though they borrowed elements from those designs. 2

Qualitative descriptive studies describe phenomena rather than explain them. Phenomenological studies, ethnographic studies and those using grounded theory seek to explain a phenomenon. Qualitative descriptive studies aim to provide a comprehensive summary of events. The approach to this study design is journalistic, with the aim being to answer the questions who, what, where and how. 3

A qualitative descriptive study is an important and appropriate design for research questions that are focused on gaining insights about a poorly understood research area, rather than on a specific phenomenon. Since qualitative descriptive study design seeks to describe rather than explain, explanatory frameworks and theories are not required to explain or ‘ground’ a study and its results. 4 The researcher may decide that a framework or theory adds value to their interpretations, and in that case, it is perfectly acceptable to use them. However, the hallmark of genuine curiosity (naturalistic enquiry) is that the researcher does not know in advance what they will be observing or describing. 4 Because a phenomenon is being described, the qualitative descriptive analysis is more categorical and less conceptual than other methods. Qualitative content analysis is usually the main approach to data analysis in qualitative descriptive studies. 4 This has led to criticism of descriptive research being less sophisticated because less interpretation is required than with other qualitative study designs in which interpretation and explanation are key characteristics (e.g. phenomenology, grounded theory, case studies).

Diverse approaches to data collection can be utilised in qualitative description studies. However, most qualitative descriptive studies use semi-structured interviews (see Chapter 13) because they provide a reliable way to collect data. 3 The technique applied to data analysis is generally categorical and less conceptual when compared to other qualitative research designs (see Section 4). 2,3 Hence, this study design is well suited to research by practitioners, student researchers and policymakers. Its straightforward approach enables these studies to be conducted in shorter timeframes than other study designs. 3 Descriptive studies are common as the qualitative component in mixed-methods research ( see Chapter 11 ) and evaluations ( see Chapter 12 ), 1 because qualitative descriptive studies can provide information to help develop and refine questionnaires or interventions.

For example, in our research to develop a patient-reported outcome measure for people who had undergone a percutaneous coronary intervention (PCI), which is a common cardiac procedure to treat heart disease, we started by conducting a qualitative descriptive study. 5 This project was a large, mixed-methods study funded by a private health insurer. The entire research process needed to be straightforward and achievable within a year, as we had engaged an undergraduate student to undertake the research tasks. The aim of the qualitative component of the mixed-methods study was to identify and explore patients’ perceptions following PCI. We used inductive approaches to collect and analyse the data. The study was guided by the following domains for the development of patient-reported outcomes, set out in US Food and Drug Administration (FDA) guidelines:

  • Feeling: How the patient feels physically and psychologically after medical intervention
  • Function: The patient’s mobility and ability to maintain their regular routine
  • Evaluation: The patient’s overall perception of the success or failure of their procedure and their perception of what contributed to it. 5(p458)

We conducted focus groups and interviews, and asked participants questions related to each of the FDA outcome domains (a sketch of one way to represent such an interview guide follows the list):

  • From your perspective, what would be considered a successful outcome of the procedure?

Probing questions: Did the procedure meet your expectations? How do you define whether the procedure was successful?

  • How did you feel after the procedure?

Probing question: How did you feel one week after and how does that compare with how you feel now?

  • After your procedure, tell me about your ability to do your daily activities.

Prompt for activities including gardening, housework, personal care, work-related and family-related tasks.

Probing questions: Did you attend cardiac rehabilitation? Can you tell us about your experience of cardiac rehabilitation? What impact has medication had on your recovery?

  • What, if any, lifestyle changes have you made since your procedure? 5(p459)
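The guide above pairs each main question with follow-up probes under an FDA outcome domain. Purely as an illustration, such a guide can be kept in a small data structure so each question stays mapped to the domain that motivated it. This is a hypothetical sketch: the GuideItem class and the abbreviated wording are ours, not part of the published study.

```python
# A lightweight representation of a semi-structured interview guide:
# each entry maps an outcome domain to a main question and its probes.
from dataclasses import dataclass, field

@dataclass
class GuideItem:
    domain: str                 # e.g. "Evaluation", "Feeling", "Function"
    question: str               # main open-ended question
    probes: list[str] = field(default_factory=list)

guide = [
    GuideItem("Evaluation",
              "From your perspective, what would be considered a "
              "successful outcome of the procedure?",
              ["Did the procedure meet your expectations?",
               "How do you define whether the procedure was successful?"]),
    GuideItem("Feeling",
              "How did you feel after the procedure?",
              ["How did you feel one week after, and how does that "
               "compare with how you feel now?"]),
    GuideItem("Function",
              "After your procedure, tell me about your ability to do "
              "your daily activities.",
              ["Did you attend cardiac rehabilitation?",
               "What impact has medication had on your recovery?"]),
]

for item in guide:
    print(f"[{item.domain}] {item.question}")
    for probe in item.probes:
        print(f"    probe: {probe}")
```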

Data collection was conducted with 32 participants. The themes were mapped to the FDA patient-reported outcome domains, with the results confirming previous research and also highlighting new areas for exploration in the development of a new patient-reported outcome measure. For example, participants reported a lack of confidence following PCI and the importance of patient and doctor communication. Women, in particular, reported that they wanted doctors to recognise how their experiences of cardiac symptoms were different to those of men.

The study described phenomena and resulted in the development of a patient-reported outcome measure that was then tested and refined using a discrete-choice experiment survey, 6 a pilot of the measure in the Victorian Cardiac Outcomes Registry and a Rasch analysis to validate the measure’s properties. 7

Advantages and disadvantages of qualitative descriptive studies

Qualitative descriptive studies are an effective design for research by practitioners, policymakers and students, due to their relatively short timeframes and low costs. The researchers can remain close to the data and the events described, which keeps the process of analysis relatively simple. Qualitative descriptive studies are also useful in mixed-methods research studies. However, some of the same features that make the design attractive have drawn criticism, namely its limited engagement with theory and the low level of interpretation and explanation applied to the data. 2

Table 5.1. Examples of qualitative descriptive studies

Qualitative descriptive studies are gaining popularity in health and social care due to their utility, from a resource and time perspective, for research by practitioners, policymakers and researchers. Descriptive studies can be conducted as stand-alone studies or as part of larger, mixed-methods studies.

1. Bradshaw C, Atkinson S, Doody O. Employing a qualitative description approach in health care research. Glob Qual Nurs Res. 2017;4. doi:10.1177/2333393617742282
2. Lambert VA, Lambert CE. Qualitative descriptive research: an acceptable design. Pac Rim Int J Nurs Res Thail. 2012;16(4):255-256. Accessed June 6, 2023. https://he02.tci-thaijo.org/index.php/PRIJNR/article/download/5805/5064
3. Doyle L et al. An overview of the qualitative descriptive design within nursing research. J Res Nurs. 2020;25(5):443-455. doi:10.1177/1744987119880234
4. Kim H, Sefcik JS, Bradway C. Characteristics of qualitative descriptive studies: a systematic review. Res Nurs Health. 2017;40(1):23-42. doi:10.1002/nur.21768
5. Ayton DR et al. Exploring patient-reported outcomes following percutaneous coronary intervention: a qualitative study. Health Expect. 2018;21(2):457-465. doi:10.1111/hex.12636
6. Barker AL et al. Symptoms and feelings valued by patients after a percutaneous coronary intervention: a discrete-choice experiment to inform development of a new patient-reported outcome. BMJ Open. 2018;8:e023141. doi:10.1136/bmjopen-2018-023141
7. Soh SE et al. What matters most to patients following percutaneous coronary interventions? a new patient-reported outcome measure developed using Rasch analysis. PLoS One. 2019;14(9):e0222185. doi:10.1371/journal.pone.0222185
8. Hiller RM et al. Coping and support-seeking in out-of-home care: a qualitative study of the views of young people in care in England. BMJ Open. 2021;11:e038461. doi:10.1136/bmjopen-2020-038461
9. Backman C, Cho-Young D. Engaging patients and informal caregivers to improve safety and facilitate person- and family-centered care during transitions from hospital to home – a qualitative descriptive study. Patient Prefer Adherence. 2019;13:617-626. doi:10.2147/PPA.S201054

Qualitative Research – a practical guide for health and social care researchers and practitioners Copyright © 2023 by Darshini Ayton is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License , except where otherwise noted.


Overview of Descriptive Design


A descriptive design is a flexible, exploratory approach to qualitative research. Descriptive design is referred to in the literature by other labels, including generic, general, basic, traditional, interpretive, and pragmatic. As an acceptable research design for dissertations and other robust scholarly research, descriptive design has received varying degrees of acceptance within the academic community. However, it has been gaining momentum since the early 2000s as a suitable design for studies that do not fall into the more mainstream genres of qualitative research (i.e. case study, phenomenology, ethnography, narrative inquiry and grounded theory). In contrast to other qualitative designs, descriptive design is not aligned to specific methods (for example, bracketing in phenomenology, bounded systems in case study, or constant comparative analysis in grounded theory). Rather, descriptive design “borrows” methods appropriate to the proposed study from other designs.

Arguments supporting the flexible nature of descriptive design hold that this flexibility is preferable to forcing a research approach into a design that is not quite appropriate for the nature of the intended study. However, descriptive design has also been criticized for this mixing of methods, as well as for the limited literature describing it. The descriptive design can nevertheless be the foundation for a rigorous study within the ADE program. Because of the flexibility of the methods used, a descriptive design provides the researcher with the opportunity to choose methods best suited to a practice-based research purpose.


Sources of Data in Descriptive Design

Because of the exploratory nature of descriptive design, the triangulation of multiple sources of data is often used to gain additional insight into the phenomenon. Sources of data that can be used in descriptive studies are similar to those that may be used in other qualitative designs and include interviews, focus groups, documents, artifacts, and observations.

The following video provides additional considerations for triangulation in qualitative designs including descriptive design: Triangulation: Pairing Thematic and Content Analysis



An overview of the qualitative descriptive design within nursing research

Louise Doyle

Associate Professor in Mental Health Nursing, School of Nursing and Midwifery, Trinity College Dublin, Ireland

Catherine McCabe

Associate Professor in General Nursing, School of Nursing and Midwifery, Trinity College Dublin, Ireland

Brian Keogh

Assistant Professor in Mental Health Nursing, School of Nursing and Midwifery, Trinity College Dublin, Ireland

Annemarie Brady

Chair of Nursing and Chronic Illness, School of Nursing and Midwifery, Trinity College Dublin, Ireland

Abstract

Qualitative descriptive designs are common in nursing and healthcare research due to their inherent simplicity, flexibility and utility in diverse healthcare contexts. However, the application of descriptive research is sometimes critiqued in terms of scientific rigor. Inconsistency in decision making within the research process, coupled with a lack of transparency, has created issues of credibility for this type of approach. It can be difficult to clearly differentiate what constitutes a descriptive research design from the range of other methodologies at the disposal of qualitative researchers.

This paper provides an overview of qualitative descriptive research, orientates to the underlying philosophical perspectives and key characteristics that define this approach and identifies the implications for healthcare practice and policy.

Methods and results

Using real-world examples from healthcare research, the paper provides insight to the practical application of descriptive research at all stages of the design process and identifies the critical elements that should be explicit when applying this approach.

Conclusions

By adding to the existing knowledge base, this paper enhances the information available to researchers who wish to use the qualitative descriptive approach, influencing the standard of how this approach is employed in healthcare research.

Introduction

Qualitative descriptive approaches to nursing and healthcare research provide a broad insight into particular phenomena and can be used in a variety of ways including as a standalone research design, as a precursor to larger qualitative studies and commonly as the qualitative component in mixed-methods studies. Despite the widespread use of descriptive approaches within nursing research, there is limited methodological guidance about this type of design in research texts or papers. The lack of adequate representation in research texts has at times resulted in novice researchers using other more complex qualitative designs including grounded theory or phenomenology without meeting the requirements of these approaches ( Lambert and Lambert, 2012 ), or having an appropriate rationale for use of these approaches. This suggests there is a need to have more discussion about how and why descriptive approaches to qualitative research are used. This serves to not only provide information and guidance for researchers, but to ensure acceptable standards in how this approach is applied in healthcare research.

Rationale for qualitative descriptive research

The selection of an appropriate approach to answer research questions is one of the most important stages of the research process; consequently, there is a requirement that researchers can clearly articulate and defend their selection. Those who wish to undertake qualitative research have a range of approaches available to them including grounded theory, phenomenology and ethnography. However, these designs may not be the most suitable for studies that do not require a deeply theoretical context and aim to stay close to and describe participants’ experiences. The most frequently proposed rationale for the use of a descriptive approach is to provide straightforward descriptions of experiences and perceptions (Sandelowski, 2010), particularly in areas where little is known about the topic under investigation. A qualitative descriptive design may be deemed most appropriate as it recognises the subjective nature of the problem and the different experiences participants have, and will present the findings in a way that directly reflects or closely resembles the terminology used in the initial research question (Bradshaw et al., 2017). This is particularly relevant in nursing and healthcare research, which is commonly concerned with how patients experience illness and associated healthcare interventions. The utilisation of a qualitative descriptive approach is often encouraged in Master’s level nurse education programmes as it enables novice clinical nurse researchers to explore important healthcare questions that have direct implications and impact for their specific healthcare setting (Colorafi and Evans, 2016). As a Master’s level project is often the first piece of primary research undertaken by nurses, the use of a qualitative descriptive design provides an excellent method to address important clinical issues where the focus is not on increasing theoretical or conceptual understanding, but rather contributing to change and quality improvement in the practice setting (Chafe, 2017).

This design is also frequently used within mixed-methods studies where qualitative data can explain quantitative findings in explanatory studies, be used for questionnaire development in exploratory studies and validate and corroborate findings in convergent studies ( Doyle et al., 2016 ). There has also been an increase in the use of qualitative descriptive research embedded in large-scale healthcare intervention studies, which can serve a number of purposes including identifying participants’ perceptions of why an intervention worked or, just as importantly, did not work and how the intervention might be improved ( Doyle et al., 2016 ). Using qualitative descriptive research in this manner can help to make the findings of intervention studies more clinically meaningful.

Philosophical and theoretical influences

Qualitative descriptive research generates data that describe the ‘who, what, and where of events or experiences’ from a subjective perspective (Kim et al., 2017, p. 23). From a philosophical perspective, this approach to research is best aligned with constructionism and critical theories that use interpretative and naturalistic methods (Lincoln et al., 2017). These philosophical perspectives represent the view that reality exists within various contexts that are dynamic and perceived differently depending on the subject; therefore, reality is multiple and subjective (Lincoln et al., 2017). In qualitative descriptive research, this translates into researchers being concerned with understanding the individual human experience in its unique context. This type of inquiry requires flexible research processes that are inductive and dynamic but do not transform the data beyond recognition from the phenomenon being studied (Ormston et al., 2014; Sandelowski, 2010). Descriptive qualitative research has also been aligned with pragmatism (Neergaard et al., 2009), where decisions are made about how the research should be conducted based on the aims or objectives and context of the study (Ormston et al., 2014). The pragmatist researcher is not aligned to one particular view of knowledge generation or one particular methodology. Instead, they look to the concepts or phenomena being studied to guide decision making in the research process, facilitating the selection of the most appropriate methods to answer the research question (Bishop, 2015).

Perhaps linked to the practical application of pragmatism to research, that is, applying the best methods to answer the research question, is the classification of qualitative descriptive research by Sandelowski ( 2010 , p. 82) into a ‘distributed residual category’. This recognises and incorporates uncertainty about the phenomena being studied and the research methods used to study them. For researchers, it permits the use of one or more different types of inquiry, which is essential when acknowledging and exploring different realities and subjective experiences in relation to phenomena ( Long et al., 2018 ). Clarity, in terms of the rationale for the phenomenon being studied and the methods used by the researcher, emerges from the qualitative descriptive approach because the data gathered continue to remain close to the phenomenon throughout the study ( Sandelowski, 2010 ). For this to happen a flexible approach is required and this is evident in the practice of ‘borrowing’ elements of other qualitative methodologies such as grounded theory, phenomenology and ethnography ( Vaismoradi et al., 2013 ).

Although many researchers who are interested in studying human nature and phenomena regard this flexibility as a strength, others believe it leads to inconsistency across studies and, in some cases, complacency on the part of researchers. This can result in vague or unexplained decision making around the research process and a subsequent lack of credibility. Accordingly, nurse researchers need to be reflexive, that is, clear about their role and position in terms of the phenomena being studied, the context, the theoretical framework and all decision-making processes used in a qualitative descriptive study. This adds credibility to both the study and qualitative descriptive research.

Methods in qualitative descriptive research

As with any research study, the application of descriptive methods will emerge in response to the aims and objectives, which will influence the sampling, data collection and analysis phases of the study.

Most qualitative research aligns itself with non-probability sampling and descriptive research is no different. Descriptive research generally uses purposive sampling and a range of purposive sampling techniques have been described ( Palinkas et al., 2015 ). Many researchers use a combination of approaches such as convenience, opportunistic or snowball sampling as part of the sampling framework, which is determined by the desired sample and the phenomena being studied.

Purposive sampling refers to selecting research participants who can speak to the research aims and who have knowledge and experience of the phenomenon under scrutiny (Ritchie et al., 2014). When purposive sampling is used in a study it delimits and narrows the study population; however, researchers need to remember that other characteristics of the sample will also affect the population, such as the location of the researcher and their flexibility to recruit participants from beyond their base. In addition, the heterogeneity of the population will need to be considered, and how this might influence sampling and subsequent data collection and analysis (Palinkas et al., 2015). Take, for example, conducting research on the experience of caring for people with Alzheimer’s disease (AD). For the most part, AD is a condition that affects older people, and the experiences of participants caring for older people will ultimately dominate the sample. However, AD also affects younger people, and how this will impact on sampling needs to be considered before recruitment, as both groups will have very different experiences, although there will be overlap. Teddlie and Yu (2007) suggest that although some purposive sampling techniques generate representative cases, most result in describing contrasting cases, which they argue are at the heart of qualitative analysis. To achieve this, Sandelowski (2010) suggests that maximum variation sampling is particularly useful in qualitative descriptive research, which may acknowledge the range of experiences that exist, especially in healthcare research. Palinkas et al. (2015) describe maximum variation sampling as identifying shared patterns that emerge from heterogeneity. In other words, researchers attempt to include a wide range of participants and experiences when collecting data. This may be more difficult to achieve in areas where little is known about the substantive area and may depend on the researcher’s knowledge of and immersion within the subject area.
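To make the idea of maximum variation sampling concrete, consider a toy selection routine: from a pool of candidate carers described by a few attributes, repeatedly pick whoever adds the most attribute values not yet represented in the sample. Everything below (the pool, the attributes and the greedy rule) is invented for illustration; in practice, purposive sampling is a researcher’s judgement call informed by the study aims, not an algorithm.

```python
# Greedy illustration of maximum variation sampling: repeatedly choose
# the candidate whose attribute values are least represented so far.
pool = [
    {"id": "P1", "age_band": "65+",   "carer_role": "spouse",  "setting": "urban"},
    {"id": "P2", "age_band": "65+",   "carer_role": "child",   "setting": "rural"},
    {"id": "P3", "age_band": "<50",   "carer_role": "spouse",  "setting": "rural"},
    {"id": "P4", "age_band": "50-64", "carer_role": "sibling", "setting": "urban"},
    {"id": "P5", "age_band": "65+",   "carer_role": "spouse",  "setting": "urban"},
]

def novelty(candidate, covered):
    # Count attribute values this candidate adds that are not yet covered.
    return sum(1 for k, v in candidate.items()
               if k != "id" and (k, v) not in covered)

sample, covered = [], set()
for _ in range(3):                       # target sample size of 3
    best = max(pool, key=lambda c: novelty(c, covered))
    pool.remove(best)
    sample.append(best["id"])
    covered.update((k, v) for k, v in best.items() if k != "id")

print(sample)   # ['P1', 'P2', 'P4'] – spans age bands, carer roles and settings
```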

Sample size will also need to be considered and although small sample sizes are common in qualitative descriptive research, researchers need to be careful they have enough data collected to meet the study aims ( Ritchie et al., 2014 ). Pre-determining the sample size prior to data collection may stifle the analytic process, resulting in too much or too little data. Traditionally, the gold standard for sample size in qualitative research is data saturation, which differs depending on the research design and the size of the population ( Fusch and Ness, 2015 ). Data saturation is reached ‘when there is enough information to replicate the study, when the ability to obtain additional new information has been attained, and when further coding is no longer feasible’ ( Fusch and Ness, 2015 , p. 1408). However, some argue that although saturation is often reported, it is rarely demonstrated in qualitative descriptive research reports ( Caelli et al., 2003 ; Malterud et al., 2016 ). If data saturation is used to determine sample size, it is suggested that greater emphasis be placed on demonstrating how saturation was reached and at what level to provide more credibility to sample sizes ( Caelli et al., 2003 ). Sample size calculation should be an estimate until saturation has been achieved through the concurrent processes of data collection and analysis. Where saturation has not been achieved, or where sample size has been predetermined for resource reasons, this should be clearly acknowledged. However, there is also a movement away from the reliance on data saturation as a measure of sample size in qualitative research ( Malterud et al., 2016 ). O’Reilly and Parker (2012) question the appropriateness of the rigid application of saturation as a sample size measure arguing that outside of Grounded Theory, its use is inconsistent and at times questionable. Malterud et al. (2016) focus instead on the concept of ‘information power’ to determine sample size. Here, they suggest sample size is determined by the amount of information the sample holds relevant to the actual study rather than the number of participants ( Malterud et al., 2016 ). Some guidance on specific sample size depending on research design has been provided in the literature; however, these are sometimes conflicting and in some cases lack evidence to support their claims ( Guest et al., 2006 ). This is further complicated by the range of qualitative designs and data collection approaches available.
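One way of demonstrating how saturation was reached, as Caelli et al. (2003) request, is simply to log how many previously unseen codes each successive interview contributes and to stop once several interviews in a row add nothing. Below is a minimal sketch with made-up code sets; the stopping rule of two consecutive “quiet” interviews is an illustrative assumption, not a standard.

```python
# Track new codes per interview and flag saturation once `patience`
# consecutive interviews contribute no previously unseen codes.
def saturation_point(interview_codes, patience=2):
    seen, quiet = set(), 0
    for i, codes in enumerate(interview_codes, start=1):
        new = set(codes) - seen
        seen |= new
        quiet = 0 if new else quiet + 1
        print(f"interview {i}: {len(new)} new code(s)")
        if quiet >= patience:
            return i
    return None  # saturation not demonstrated under this rule

codes_per_interview = [
    {"stigma", "cost"}, {"cost", "travel"}, {"stigma", "family"},
    {"family"}, {"cost"}, {"travel"},
]
print("saturation at interview:", saturation_point(codes_per_interview))
```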

Data collection

Data collection methods in qualitative descriptive research are diverse and aim to discover the who, what and where of phenomena ( Sandelowski, 2000 ). Although semi-structured individual face-to-face interviews are the most commonly used data collection approaches ( Kim et al ., 2017 ), focus groups, telephone interviews and online approaches are also used.

Focus groups involve people with similar characteristics coming together in a relaxed and permissive environment to share their thoughts, experiences and insights ( Krueger and Casey, 2009 ). Participants share their own views and experiences, but also listen to and reflect on the experiences of other group members. It is this synergistic process of interacting with other group members that refines individuals’ viewpoints to a deeper and more considered level and produces data and insights that would not be accessible without the interaction found in a group (Finch et al., 2014). Telephone interviews and online approaches are gaining more traction as they offer greater flexibility and reduced costs for researchers and ease of access for participants. In addition, they may help to achieve maximum variation sampling or examine experiences from a national or international perspective. Face-to-face interviews are often perceived as more appropriate than telephone interviews; however, this assumption has been challenged as evidence to support the use of telephone interviews emerges ( Ward et al., 2015 ). Online data collection also offers the opportunity to collect synchronous and asynchronous data using instant messaging and other online media ( Hooley et al., 2011 ). Online interviews or focus groups conducted via Skype or other media may overcome some of the limitations of telephone interviews, although observation of non-verbal communication may be more difficult to achieve ( Janghorban et al., 2014 ). Open-ended free-text responses in surveys have also been identified as useful data sources in qualitative descriptive studies ( Kim et al . , 2017 ) and in particular the use of online open-ended questions, which can have a large geographical reach ( Seixas et al., 2018 ). Observation is also cited as an approach to data collection in qualitative descriptive research ( Sandelowski, 2000 ; Lambert and Lambert, 2012 ); however, in a systematic review examining the characteristics of qualitative research studies, observation was cited as an additional source of data and was not used as a primary source of data collection ( Kim et al. , 2017 ).

Data analysis and interpretation

According to Lambert and Lambert (2012), data analysis in qualitative descriptive research is data driven and does not use an approach that has emerged from a pre-existing philosophical or epistemological perspective. Within qualitative descriptive research, it is important that analysis is kept at a level that those to whom the research pertains can easily understand, so they can use the findings in healthcare practice (Chafe, 2017). The approach to analysis is dictated by the aims of the research, and as qualitative descriptive research is generally explorative, inductive approaches will commonly be applied, although deductive approaches can also be used (Kim et al., 2017).

Content and thematic analyses are the most commonly used data analysis techniques in qualitative descriptive research. Vaismoradi et al. (2013) argue that content and thematic analysis, although poorly understood and unevenly applied, offer legitimate ways of achieving the lower level of interpretation that is often required in qualitative descriptive research. Sandelowski (2000) indicated that qualitative content analysis is the approach of choice in descriptive research; however, confusion exists between content and thematic analysis, which sometimes means researchers use a combination of the two. Vaismoradi et al. (2013) argue there are differences between the two: content analysis allows researchers to analyse the data qualitatively as well as to quantify the data, whereas thematic analysis provides a purely qualitative account of the data that is richer and more detailed. Decisions to use one over the other will depend on the aims of the study, which will dictate the depth of analysis required. Although there is a range of analysis guidelines available, they share some characteristics, and an overview of these, derived from some key texts (Sandelowski, 2010; Braun and Clarke, 2006; Newell and Burnard, 2006), is presented in Table 1. Central to these guidelines is an attempt by the researcher to immerse themselves in the data and the ability to demonstrate a consistent and systematic approach to the analysis.

Table 1. Common characteristics of descriptive qualitative analysis.

Coding in qualitative descriptive research can be inductive and emerge from the data, or a priori, where codes are based on a pre-determined template, as in template analysis. Inductive codes can be ‘in vivo’, where the researcher uses the words or concepts as stated by the participants (Howitt, 2019), or can be named by the researcher and grouped together to form emerging themes or categories through an iterative, systematic process until the final themes emerge. Template analysis involves designing a coding template, which is developed inductively from a subset of the data and then applied to all the data and refined as appropriate (King, 2012). It offers a standardised approach that may be useful when several researchers are involved in the analysis process.
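Template analysis as described here lends itself to a mechanical illustration: an a priori template maps code names to indicator phrases, the template is applied to every transcript segment, and the resulting code counts can be quantified in the way content analysis permits. The template, phrases and segments below are invented, and a real template would be refined iteratively against the data rather than applied verbatim.

```python
# Apply a (toy) coding template to transcript segments and tally codes.
from collections import Counter

template = {                      # code -> indicator phrases (a priori)
    "isolation":  ["alone", "no one", "isolated"],
    "navigation": ["appointments", "referral", "paperwork"],
    "support":    ["nurse helped", "family support", "listened"],
}

segments = [
    "I felt completely alone after the diagnosis.",
    "The paperwork and referral process was overwhelming.",
    "The nurse helped me understand what would happen next.",
    "Honestly, no one explained the appointments to us.",
]

tally = Counter()
for seg in segments:
    hits = [code for code, phrases in template.items()
            if any(p in seg.lower() for p in phrases)]
    tally.update(hits)
    print(f"{hits or ['uncoded']}: {seg}")

print("code frequencies:", dict(tally))
```

Real coding is, of course, interpretive; keyword matching only mimics the bookkeeping, not the researcher’s judgement.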

Within qualitative research studies generally, the analysis of data and subsequent presentation of research findings can range from studies with a relatively minimal amount of interpretation to those with high levels of interpretation (Sandelowski and Barroso, 2003). The degree of interpretation required in qualitative descriptive research is contentious. Sandelowski (2010) argues that although descriptive research produces findings that are ‘data-near’, they are nevertheless interpretative. Sandelowski (2010) reports that a common misconception in qualitative descriptive designs is that researchers do not need to include any level of analysis and interpretation and can rely solely on indiscriminately selecting direct quotations from participants to answer the research question(s). Although it is important to ensure those familiar with the topic under investigation can recognise their experiences in the description of it (Kim et al., 2017), this is not to say that there should be no transformation of data. Researchers using a qualitative descriptive design need to, through data analysis, move from un-interpreted participant quotations to interpreted research findings, which can still remain ‘data-near’ (Sandelowski, 2010). Willis et al. (2016) suggest that researchers using the qualitative descriptive method might report a comprehensive thematic summary as findings, which moves beyond individual participant reports by developing an interpretation of a common theme. The extent of description and/or interpretation in a qualitative descriptive study is ultimately determined by the focus of the study (Neergaard et al., 2009).

As with any research design, ensuring the rigor or trustworthiness of findings from a qualitative descriptive study is crucial. For a more detailed consideration of the quality criteria in qualitative studies, readers are referred to the seminal work of Lincoln and Guba (1985) in which the four key criteria of credibility, dependability, confirmability and transferability are discussed. At the very least, researchers need to be clear about the methodological decisions taken during the study so readers can judge the trustworthiness of the study and ultimately the findings ( Hallberg, 2013 ). Being aware of personal assumptions and the role they play in the research process is also an important quality criterion (Colorafi and Evans, 2016) and these assumptions can be made explicit through the use of researcher reflexivity in the study ( Bradshaw et al., 2017 ).

Challenges in using a qualitative descriptive design

One of the challenges of utilising a qualitative descriptive design is responding to the charge that many qualitative designs have historically encountered, which is that qualitative designs lack the scientific rigor associated with quantitative approaches ( Vaismoradi et al . , 2013 ). The descriptive design faces further critique in this regard as, unlike other qualitative approaches such as phenomenology or grounded theory, it is not theory driven or oriented ( Neergaard et al ., 2009 ). However, it is suggested that this perceived limitation of qualitative descriptive research only holds true if it is used for the wrong purposes and not primarily for describing the phenomenon ( Neergaard et al ., 2009 ). Kahlke (2014) argues that rather than being atheoretical, qualitative descriptive approaches require researchers to consider to what extent theory will inform the study and are sufficiently flexible to leave space for researchers to utilise theoretical frameworks that are relevant and inform individual research studies. Kim et al. (2017) reported that most descriptive studies reviewed did not identify a theoretical or philosophical framework, but those that did used it to inform the development of either the interview guide or the data analysis framework, thereby identifying the potential use of theory in descriptive designs.

Another challenge around the use of qualitative descriptive research is that it can erroneously be seen as a ‘quick fix’ for researchers who want to employ qualitative methods, but perhaps lack the expertise or familiarity with qualitative research (Sandelowski, 2010). Kim et al. (2017) report how in their review fewer than half of qualitative descriptive papers explicitly identified a rationale for choosing this design, suggesting that in some cases the rationale behind its use was ill considered. Providing a justification for choosing a particular research design is an important part of the research process and, in the case of qualitative descriptive research, a clear justification can offset concerns that a descriptive design was an expedient rather than a measured choice. For studies exploring participants’ experiences, which could be addressed using other qualitative designs, it also helps to clearly make a distinction as to why a descriptive design was the best choice for the research study (Kim et al., 2017). Similarly, there is a perception that the data analysis techniques most commonly associated with descriptive research – thematic and content analysis – are the ‘easiest’ approaches to qualitative analysis; however, as Vaismoradi et al. (2013) suggest, this does not mean they produce low-quality research findings.

As previously identified, a further challenge with the use of qualitative descriptive methods is that as a research design it has limited visibility in research texts and methodological papers ( Kim et al ., 2017 ). This means that novice qualitative researchers have little guidance on how to design and implement a descriptive study as there is a lack of a ‘methodological rulebook’ to guide researchers ( Kahlke, 2014 ). It is also suggested that this lack of strict boundaries and rules around qualitative descriptive research also offers researchers flexibility to design a study using a variety of data collection and analysis approaches that best answer the research question ( Kahlke, 2014 ; Kim et al . , 2017 ). However, should researchers choose to integrate methods ‘borrowed’ from other qualitative designs such as phenomenology or grounded theory, they should do so with the caveat that they do not claim they are using designs they are not actually using ( Neergaard et al . , 2009 ).

Examples of the use of qualitative descriptive research in healthcare

Findings from qualitative descriptive studies within healthcare have the potential to describe the experiences of patients, families and health providers, inform the development of health interventions and policy and promote health and quality of life ( Neergaard et al ., 2009 ; Willis et al ., 2016 ). The examples provided here demonstrate different ways qualitative descriptive methods can be used in a range of healthcare settings.

Simon et al. (2015) used a qualitative descriptive design to identify the perspectives of seriously ill, older patients and their families on the barriers and facilitators to advance care planning. The authors provided a rationale for using a descriptive design, which was to gain a deeper understanding of the phenomenon under investigation. Data were gathered through nine open-ended questions on a researcher-administered questionnaire. Responses to all questions were recorded verbatim and transcribed. Using descriptive, interpretative and explanatory coding that transformed raw data recorded from 278 patients and 225 family members to more abstract ideas and concepts ( Simon et al. , 2015 ), a deeper understanding of the barriers and facilitators to advance care planning was developed. Three categories were developed that identified personal beliefs, access to doctors and interaction with doctors as the central barriers and facilitators to advance care planning. The use of a qualitative descriptive design facilitated the development of a schematic based on these three themes, which provides a framework for use by clinicians to guide improvement in advance care planning.

Focus group interviews are a common data collection method in qualitative descriptive studies and were the method of choice in a study by Pelentsov et al. (2015), which sought to identify the supportive care needs of parents whose child has a rare disease. The rationale provided for using a qualitative descriptive design was to obtain a ‘straight description of the phenomena’ and to provide analysis and interpretation of the findings that remained data-near and representative of the responses of participants. In this study, four semi-structured focus group interviews were conducted with 23 parents. The data from these focus groups were then subjected to a form of thematic analysis during which emerging theories and inferences were identified and organised into a series of thematic networks and ultimately into three global themes. These themes identified that a number of factors including social isolation and lack of knowledge on behalf of healthcare professionals significantly affected how supported parents felt. Identifying key areas of the supportive needs of parents using qualitative description provides direction to health professionals on how best to respond to and support parents of children with a rare disease.

The potential for findings from a qualitative descriptive study to influence policy was identified in a study by Syme et al. (2016), who noted a lack of guidance and policies around the management of residents’ sexual expression in long-term care settings. In this study, 20 directors of nursing from long-term care settings were interviewed with a view to identifying challenges in addressing sexual expression in these settings and eliciting their recommendations for addressing these challenges in practice and policy. Following thematic analysis, findings relating to what directors of nursing believed to be important components of a policy addressing sexual expression were identified. These included providing educational resources, adopting a person-centred care delivery model when responding to sexual expression and providing guidance for working with families. Findings from this qualitative descriptive study provide recommendations that can then feed into a broader policy on sexual expression in long-term care settings.

The final example of the use of a qualitative descriptive study comes from a mixed-methods study comprising a randomised controlled trial and a qualitative process evaluation. He et al. (2015) sought to determine the effects of a play intervention for children on parental perioperative anxiety and to explore parents’ perceptions of the intervention. Parents whose children were undergoing surgery were assigned to a control group or an intervention group; the intervention group took part in a 1-hour play therapy session with their child, whereas the control group received usual care. The quantitative findings identified no difference in anxiety levels between the intervention and control groups. However, the qualitative findings identified that parents found the intervention helpful in preparing both themselves and their child for surgery and perceived a reduction in their anxiety about the procedure, thereby capturing insights the quantitative measures had missed. In addition, in the qualitative interviews, parents made suggestions about how the play intervention could be improved, providing important data for its further development.

These examples, drawn from a range of healthcare settings, demonstrate how findings from qualitative descriptive research can be used directly to understand more fully the experiences and perspectives of patients, their families and healthcare providers, in addition to guiding future healthcare practice and informing further research.

Qualitative research designs have made significant contributions to the development of nursing and healthcare practice and policy. Qualitative descriptive research is common within nursing research and is gaining popularity among other healthcare professions. This paper has identified that this design can be particularly relevant to nursing and healthcare professionals undertaking a primary piece of research, and that it provides an excellent method for addressing issues of real clinical significance to them and their practice setting. However, the conundrum facing researchers who wish to use this approach is its lack of visibility and transparency within methodological papers and texts, resulting in a deficit of information available to researchers when designing such studies. By adding to the existing knowledge base, this paper enhances the information available to researchers who wish to use the qualitative descriptive approach, thus influencing the standard to which this approach is employed in healthcare research. We highlight the need for researchers using this approach to clearly outline the context, theoretical framework and concepts underpinning their study; the decision-making process that informed its design, including the chosen research methods; and how these contribute to the achievement of the study’s aims and objectives. Failure to describe these issues may have a negative impact on study credibility. As seen in our paper, qualitative descriptive studies have a role in healthcare research in providing insight into service users’ and providers’ perceptions and experiences of a particular phenomenon, which can inform healthcare service provision.

Key points for policy, practice and/or research

  • Despite its widespread use, there is little methodological guidance to orientate novice nurse researchers using the qualitative descriptive design. This paper provides such guidance and champions the qualitative descriptive design as appropriate for exploring research questions that require accessible and understandable findings directly relevant to healthcare practice and policy.
  • This paper identifies how the qualitative descriptive design gives direct voice to participants, including patients and healthcare staff, allowing exploration of issues of real and immediate importance in the practice area.
  • This paper reports how, within qualitative descriptive research, analysing data and presenting findings in a way that is easily understood and recognised is important in promoting the utilisation of research findings in nursing practice.
  • As this design is often overlooked in research texts despite its suitability for exploring many healthcare questions, this paper adds to the limited methodological guidance available and has utility for researchers who wish to defend their rationale for using the qualitative descriptive design in nursing and healthcare research.

Louise Doyle (PhD, MSc, BNS, RNT, RPN) is an Associate Professor in Mental Health Nursing at the School of Nursing and Midwifery, Trinity College Dublin. Her research interests are in the area of self-harm and suicide and she has a particular interest and expertise in mixed-methods and qualitative research designs.

Catherine McCabe (PhD, MSc, BNS, RNT, RGN) is an Associate Professor in General Nursing at the School of Nursing and Midwifery, Trinity College Dublin. Her research interests and expertise are in the areas of digital health (chronic disease self-management and social/cultural wellbeing), cancer, dementia, arts and health, and systematic reviews.

Brian Keogh (PhD, MSc, BNS, RNT, RPN) is an Assistant Professor in Mental Health Nursing at the School of Nursing and Midwifery, Trinity College Dublin. His main area of research interest is mental health recovery and he specialises in qualitative research approaches with a particular emphasis on grounded theory.

Annemarie Brady (PhD, MSc, BNS, RNT, RPN) is Chair of Nursing and Chronic Illness and Head of School of Nursing and Midwifery at Trinity College Dublin. Her research work has focused on the development of healthcare systems and workforce solutions to respond to increased chronic illness demands within healthcare. She has conducted a range of mixed-method research studies in collaboration with health service providers to examine issues around patient-related outcomes measures, workload measurement, work conditions, practice development, patient safety and competency among healthcare workers.

Margaret McCann (PhD, MSc, BNS, RNT, RGN) is an Assistant Professor in General Nursing at the School of Nursing and Midwifery, Trinity College Dublin. Her research interests are focused on chronic illness management and the use of digital health and smart technology in supporting patient/client education, self-management and independence. Other research interests include conducting systematic reviews, infection prevention and control, and exploring patient outcomes linked to chronic kidney disease.

Declaration of conflicting interests

The author(s) declared no potential conflicts of interest with respect to the research, authorship and/or publication of this article.

Ethical approval

Ethical approval was not required for this paper as it is a methodological paper and does not report on participant data.

Funding

The author(s) received no financial support for the research, authorship and/or publication of this article.

ORCID iDs

Louise Doyle https://orcid.org/0000-0002-0153-8326

Margaret McCann https://orcid.org/0000-0002-7925-6396

References

  • Bishop FL. (2015) Using mixed methods in health research: Benefits and challenges. British Journal of Health Psychology 20: 1–4.
  • Bradshaw C, Atkinson S, Doody O. (2017) Employing a qualitative description approach in health care research. Global Qualitative Nursing Research 4: 1–8.
  • Braun V, Clarke V. (2006) Using thematic analysis in psychology. Qualitative Research in Psychology 3: 77–101.
  • Caelli K, Ray L, Mill J. (2003) ‘Clear as mud’: Toward greater clarity in generic qualitative research. International Journal of Qualitative Methods 2: 1–13.
  • Chafe R. (2017) The value of qualitative description in health services and policy research. Healthcare Policy 12: 12–18.
  • Doyle L, Brady AM, Byrne G. (2016) An overview of mixed methods research – revisited. Journal of Research in Nursing 21: 623–635.
  • Finch H, Lewis J, Turley C. (2014) Focus groups. In: Ritchie J, Lewis J, McNaughton Nicholls C, Ormston R. (eds) Qualitative Research Practice: A Guide for Social Science Students and Researchers, London: Sage, pp. 211–242.
  • Fusch PI, Ness LR. (2015) Are we there yet? Data saturation in qualitative research. The Qualitative Report 20: 1408–1416.
  • Guest G, Bunce A, Johnson L. (2006) How many interviews are enough? An experiment with data saturation and variability. Field Methods 18: 59–82.
  • Hallberg L. (2013) Quality criteria and generalization of results from qualitative studies. International Journal of Qualitative Studies in Health and Well-being 8: 1.
  • He HG, Zhu LX, Chan WCS, et al. (2015) A mixed-method study of effects of a therapeutic play intervention for children on parental anxiety and parents’ perceptions of the intervention. Journal of Advanced Nursing 71(7): 1539–1551.
  • Hooley T, Wellens J, Marriott J. (2011) What is Online Research? Using the Internet for Social Science Research, London: Bloomsbury Academic.
  • Howitt D. (2019) Introduction to Qualitative Methods in Psychology: Putting Theory into Practice, 4th ed. Harlow: Pearson Education Limited.
  • Janghorban R, Roudsari RL, Taghipour A. (2014) Skype interviewing: The new generation of online synchronous interview in qualitative research. International Journal of Qualitative Studies on Health and Well-being 9.
  • Kahlke RM. (2014) Generic qualitative approaches: Pitfalls and benefits of methodological mixology. International Journal of Qualitative Methods 13: 37–52.
  • Kim H, Sefcik JS, Bradway C. (2017) Characteristics of qualitative descriptive studies: A systematic review. Research in Nursing & Health 40: 23–42.
  • King N. (2012) Doing template analysis. In: Symon G, Cassell C. (eds) Qualitative Organizational Research: Core Methods and Current Challenges, Los Angeles, CA: Sage.
  • Krueger RA, Casey MA. (2009) Focus Groups: A Practical Guide for Applied Research, 4th ed. Thousand Oaks, CA: Sage.
  • Lambert VA, Lambert CE. (2012) Qualitative descriptive research: An acceptable design. Pacific Rim International Journal of Nursing Research 16: 255–256.
  • Lincoln YS, Guba EG. (1985) Naturalistic Inquiry, Newbury Park, CA: Sage.
  • Lincoln YS, Lynham SA, Guba EG. (2017) Paradigmatic controversies, contradictions and emerging confluences. In: Denzin NK, Lincoln YS. (eds) The Sage Handbook of Qualitative Research, 5th ed. Thousand Oaks, CA: Sage.
  • Long KM, McDermott F, Meadows GN. (2018) Being pragmatic about healthcare complexity: Our experiences applying complexity theory and pragmatism to health services research. BMC Medicine 16: 94.
  • Malterud K, Siersma VD, Guassora AD. (2016) Sample size in qualitative interview studies: Guided by information power. Qualitative Health Research 26(13): 1753–1760.
  • Neergaard MA, Olesen F, Andersen RS, et al. (2009) Qualitative description – the poor cousin of health research? BMC Medical Research Methodology 9.
  • Newell R, Burnard P. (2011) Research for Evidence Based Practice, Oxford: Blackwell Publishing.
  • O’Reilly M, Parker N. (2012) ‘Unsatisfactory saturation’: A critical exploration of the notion of saturated sample sizes in qualitative research. Qualitative Research 13(2): 190–197.
  • Ormston R, Spencer L, Barnard M, et al. (2014) The foundations of qualitative research. In: Ritchie J, Lewis J, McNaughton Nicholls C, Ormston R. (eds) Qualitative Research Practice: A Guide for Social Science Students and Researchers, London: Sage, pp. 1–25.
  • Palinkas LA, Horwitz SM, Green CA, et al. (2015) Purposeful sampling for qualitative data collection and analysis in mixed method implementation research. Administration and Policy in Mental Health and Mental Health Services Research 42: 533–544.
  • Pelentsov LL, Fielder AL, Esterman AJ. (2016) The supportive care needs of parents with a child with a rare disease: A qualitative descriptive study. Journal of Pediatric Nursing 31(3): e207–e218.
  • Ritchie J, Lewis J, Elam G, et al. (2014) Designing and selecting samples. In: Ritchie J, Lewis J, McNaughton Nicholls C, Ormston R. (eds) Qualitative Research Practice: A Guide for Social Science Students and Researchers, London: Sage, pp. 111–145.
  • Sandelowski M. (2000) Whatever happened to qualitative description? Research in Nursing & Health 23: 334–340.
  • Sandelowski M. (2010) What’s in a name? Qualitative description revisited. Research in Nursing & Health 33: 77–84.
  • Sandelowski M, Barroso J. (2003) Classifying the findings in qualitative studies. Qualitative Health Research 13: 905–923.
  • Seixas BV, Smith N, Mitton C. (2018) The qualitative descriptive approach in international comparative studies: Using online qualitative surveys. International Journal of Health Policy and Management 7(9): 778–781.
  • Simon J, Porterfield P, Bouchal SR, et al. (2015) ‘Not yet’ and ‘just ask’: Barriers and facilitators to advance care planning – a qualitative descriptive study of the perspectives of seriously ill, older patients and their families. BMJ Supportive & Palliative Care 5: 54–62.
  • Syme ML, Lichtenberg P, Moye J. (2016) Recommendations for sexual expression management in long-term care: A qualitative needs assessment. Journal of Advanced Nursing 72(10): 2457–2467.
  • Teddlie C, Yu F. (2007) Mixed methods sampling: A typology with examples. Journal of Mixed Methods Research 1: 77–100.
  • Vaismoradi M, Turunen H, Bondas T. (2013) Content analysis and thematic analysis: Implications for conducting a qualitative descriptive study. Nursing & Health Sciences 15: 398–405.
  • Ward K, Gott M, Hoare K. (2015) Participants’ views of telephone interviews within a grounded theory study. Journal of Advanced Nursing 71: 2775–2785.
  • Willis DG, Sullivan-Bolyai S, Knafl K, et al. (2016) Distinguishing features and similarities between descriptive phenomenological and qualitative descriptive research. Western Journal of Nursing Research 38: 1185–1204.
