Grad Coach

Research Design 101

Everything You Need To Get Started (With Examples)

By: Derek Jansen (MBA) | Reviewers: Eunice Rautenbach (DTech) & Kerryn Warren (PhD) | April 2023

Research design for qualitative and quantitative studies

Navigating the world of research can be daunting, especially if you’re a first-time researcher. One concept you’re bound to run into fairly early in your research journey is that of “research design”. Here, we’ll guide you through the basics using practical examples, so that you can approach your research with confidence.

Overview: Research Design 101

  • What is research design?
  • Research design types for quantitative studies
  • Video explainer: quantitative research design
  • Research design types for qualitative studies
  • Video explainer: qualitative research design
  • How to choose a research design
  • Key takeaways

Research design refers to the overall plan, structure or strategy that guides a research project, from its conception to the final data analysis. A good research design serves as the blueprint for how you, as the researcher, will collect and analyse data while ensuring consistency, reliability and validity throughout your study.

Understanding the different types of research design is essential, as it helps ensure that your approach suits your research aims, objectives and questions, as well as the resources you have available to you. Without a clear big-picture view of how you’ll design your research, you risk making misaligned methodological choices – especially around your sampling, data collection and data analysis decisions.

The problem with defining research design…

One reason students struggle to pin down a clear definition of research design is that the term is used very loosely across the internet, and even within academia.

Some sources claim that the three research design types are qualitative, quantitative and mixed methods, which isn’t quite accurate (these just refer to the type of data that you’ll collect and analyse). Other sources state that research design refers to the sum of all your design choices, suggesting it’s more like a research methodology. Others run off on less common tangents. No wonder there’s confusion!

In this article, we’ll clear up the confusion. We’ll explain the most common research design types for both qualitative and quantitative research projects, whether that is for a full dissertation or thesis, or a smaller research paper or article.

Free Webinar: Research Methodology 101

Research Design: Quantitative Studies

Quantitative research involves collecting and analysing data in a numerical form. Broadly speaking, there are four types of quantitative research designs: descriptive, correlational, experimental, and quasi-experimental.

Descriptive Research Design

As the name suggests, descriptive research design focuses on describing existing conditions, behaviours, or characteristics by systematically gathering information without manipulating any variables. In other words, there is no intervention on the researcher’s part – only data collection.

For example, if you’re studying smartphone addiction among adolescents in your community, you could deploy a survey to a sample of teens asking them to rate their agreement with certain statements that relate to smartphone addiction. The collected data would then provide insight regarding how widespread the issue may be – in other words, it would describe the situation.
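To make the survey example concrete, here’s a minimal sketch of how such descriptive data might be summarised, assuming hypothetical 1–5 Likert-style agreement ratings (all values and the 4+ agreement threshold are illustrative):

```python
from statistics import mean

# Hypothetical Likert responses (1 = strongly disagree, 5 = strongly agree)
# to a statement such as "I feel anxious without my phone"
ratings = [4, 5, 3, 5, 4, 2, 5, 4, 3, 5]

average = mean(ratings)  # average level of agreement in the sample
high_agreement = sum(r >= 4 for r in ratings) / len(ratings)  # share rating 4+

print(f"Mean agreement: {average:.1f}")
print(f"Proportion agreeing: {high_agreement:.0%}")
```

Note that this only describes the sample: no variables are manipulated and no relationships are tested.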

The key defining attribute of this type of research design is that it purely describes the situation. In other words, descriptive research design does not explore potential relationships between different variables or the causes that may underlie those relationships. Descriptive research is therefore useful for generating insight into a research problem by describing its characteristics, and it’s often used as a precursor to other research design types.

Correlational Research Design

Correlational design is a popular choice for researchers aiming to identify and measure the relationship between two or more variables without manipulating them. In other words, this type of research design is useful when you want to know whether a change in one thing tends to be accompanied by a change in another thing.

For example, if you wanted to explore the relationship between exercise frequency and overall health, you could use a correlational design to help you achieve this. In this case, you might gather data on participants’ exercise habits, as well as records of their health indicators like blood pressure, heart rate, or body mass index. Thereafter, you’d use a statistical test to assess whether there’s a relationship between the two variables (exercise frequency and health).

As you can see, correlational research design is useful when you want to explore potential relationships between variables that cannot be manipulated or controlled for ethical, practical, or logistical reasons. It is particularly helpful for developing predictions and, given that it doesn’t involve the manipulation of variables, it can be implemented at a large scale more easily than experimental designs (which we’ll look at next).

That said, it’s important to keep in mind that correlational research design has limitations – most notably that it cannot be used to establish causality. In other words, correlation does not equal causation. To establish causality, you’ll need to move into the realm of experimental design, coming up next…


Experimental Research Design

Experimental research design is used to determine whether there is a causal relationship between two or more variables. With this type of research design, you, as the researcher, manipulate one variable (the independent variable) and measure its effect on another (the dependent variable), while controlling for other variables that could influence the outcome. Doing so allows you to observe the effect of the former on the latter and draw conclusions about potential causality.

For example, if you wanted to measure if/how different types of fertiliser affect plant growth, you could set up several groups of plants, with each group receiving a different type of fertiliser, as well as one with no fertiliser at all. You could then measure how much each plant group grew (on average) over time and compare the results from the different groups to see which fertiliser was most effective.
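A sketch of how the fertiliser comparison above might look once the data is in, simply comparing average growth across hypothetical groups (a real analysis would typically follow this with a significance test such as an ANOVA):

```python
from statistics import mean

# Hypothetical growth (cm) after four weeks under each condition
growth = {
    "no fertiliser": [4.1, 3.8, 4.3, 4.0],
    "fertiliser A": [6.2, 5.9, 6.5, 6.1],
    "fertiliser B": [5.1, 5.4, 4.9, 5.2],
}

# Compare mean growth per condition against the no-fertiliser control group
for condition, heights in growth.items():
    print(f"{condition}: mean growth {mean(heights):.2f} cm")
```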

Overall, experimental research design provides researchers with a powerful way to identify and measure causal relationships (and the direction of causality) between variables. However, developing a rigorous experimental design can be challenging, as it’s not always easy to control all the variables in a study. This often results in smaller sample sizes, which can reduce the statistical power and generalisability of the results.

Moreover, experimental research design requires random assignment. This means that the researcher needs to assign participants to different groups or conditions in a way that gives each participant an equal chance of being assigned to any group (note that this is not the same as random sampling). Doing so helps reduce the potential for bias and confounding variables. This need for random assignment can lead to ethics-related issues. For example, withholding a potentially beneficial medical treatment from a control group may be considered unethical in certain situations.
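Random assignment itself is straightforward to implement. A minimal sketch with hypothetical participant IDs – shuffling and then splitting the list gives every participant an equal chance of landing in either group:

```python
import random

random.seed(42)  # fixed seed purely so the illustration is reproducible

participants = [f"P{i:02d}" for i in range(1, 21)]  # 20 hypothetical participants
random.shuffle(participants)  # every ordering is equally likely

# Split the shuffled list into equal-sized treatment and control groups
treatment = participants[:10]
control = participants[10:]

print("Treatment:", treatment)
print("Control:  ", control)
```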

Quasi-Experimental Research Design

Quasi-experimental research design is used when the research aims involve identifying causal relations, but one cannot (or doesn’t want to) randomly assign participants to different groups (for practical or ethical reasons). Instead, with a quasi-experimental research design, the researcher relies on existing groups or pre-existing conditions to form groups for comparison.

For example, if you were studying the effects of a new teaching method on student achievement in a particular school district, you may be unable to randomly assign students to either group and instead have to choose classes or schools that already use different teaching methods. This way, you still achieve separate groups, without having to assign participants to specific groups yourself.

Naturally, quasi-experimental research designs have limitations when compared to experimental designs. Given that participant assignment is not random, it’s more difficult to confidently establish causality between variables, and, as a researcher, you have less control over other variables that may impact findings.

All that said, quasi-experimental designs can still be valuable in research contexts where random assignment is not possible and can often be undertaken on a much larger scale than experimental research, thus increasing the statistical power of the results. What’s important is that you, as the researcher, understand the limitations of the design and conduct your quasi-experiment as rigorously as possible, paying careful attention to any potential confounding variables.

The four most common quantitative research design types are descriptive, correlational, experimental and quasi-experimental.

Research Design: Qualitative Studies

There are many different research design types when it comes to qualitative studies, but here we’ll narrow our focus to explore the “Big 4”. Specifically, we’ll look at phenomenological design, grounded theory design, ethnographic design, and case study design.

Phenomenological Research Design

Phenomenological design involves exploring the meaning of lived experiences and how they are perceived by individuals. This type of research design seeks to understand people’s perspectives, emotions, and behaviours in specific situations. Here, the aim for researchers is to uncover the essence of human experience without making any assumptions or imposing preconceived ideas on their subjects.

For example, you could adopt a phenomenological design to study why cancer survivors have such varied perceptions of their lives after overcoming their disease. This could be achieved by interviewing survivors and then analysing the data using a qualitative analysis method such as thematic analysis to identify commonalities and differences.

Phenomenological research design typically involves in-depth interviews or open-ended questionnaires to collect rich, detailed data about participants’ subjective experiences. This richness is one of the key strengths of phenomenological research design but, naturally, it also has limitations. These include potential biases in data collection and interpretation and the lack of generalisability of findings to broader populations.

Grounded Theory Research Design

Grounded theory (also referred to as “GT”) aims to develop theories by continuously and iteratively analysing and comparing data collected from a relatively large number of participants in a study. It takes an inductive (bottom-up) approach, with a focus on letting the data “speak for itself”, without being influenced by preexisting theories or the researcher’s preconceptions.

As an example, let’s assume your research aims involved understanding how people cope with chronic pain from a specific medical condition, with a view to developing a theory around this. In this case, grounded theory design would allow you to explore this concept thoroughly without preconceptions about what coping mechanisms might exist. You may find that some patients prefer cognitive-behavioural therapy (CBT) while others prefer to rely on herbal remedies. Based on multiple, iterative rounds of analysis, you could then develop a theory in this regard, derived directly from the data (as opposed to other preexisting theories and models).

Grounded theory typically involves collecting data through interviews or observations and then analysing it to identify patterns and themes that emerge from the data. These emerging ideas are then validated by collecting more data until a saturation point is reached (i.e., no new information can be squeezed from the data). From that base, a theory can then be developed.

As you can see, grounded theory is ideally suited to studies where the research aims involve theory generation, especially in under-researched areas. Keep in mind, though, that this type of research design can be quite time-intensive, given the need for multiple rounds of data collection and analysis.


Ethnographic Research Design

Ethnographic design involves observing and studying a culture-sharing group of people in their natural setting to gain insight into their behaviours, beliefs, and values. The focus here is on observing participants in their natural environment (as opposed to a controlled environment). This typically involves the researcher spending an extended period of time with the participants in their environment, carefully observing and taking field notes.

All of this is not to say that ethnographic research design relies purely on observation. On the contrary, this design typically also involves in-depth interviews to explore participants’ views, beliefs, etc. However, unobtrusive observation is a core component of the ethnographic approach.

As an example, an ethnographer may study how different communities celebrate traditional festivals or how individuals from different generations interact with technology differently. This may involve a lengthy period of observation, combined with in-depth interviews to further explore specific areas of interest that emerge as a result of the observations that the researcher has made.

As you can probably imagine, ethnographic research design has the ability to provide rich, contextually embedded insights into the socio-cultural dynamics of human behaviour within a natural, uncontrived setting. Naturally, however, it does come with its own set of challenges, including researcher bias (since the researcher can become quite immersed in the group), participant confidentiality and, predictably, ethical complexities. All of these need to be carefully managed if you choose to adopt this type of research design.

Case Study Design

With case study research design, you, as the researcher, investigate a single individual (or a single group of individuals) to gain an in-depth understanding of their experiences, behaviours or outcomes. Unlike other research designs that are aimed at larger sample sizes, case studies offer a deep dive into the specific circumstances surrounding a person, group of people, event or phenomenon, generally within a bounded setting or context.

As an example, a case study design could be used to explore the factors influencing the success of a specific small business. This would involve diving deeply into the organisation to explore and understand what makes it tick – from marketing to HR to finance. In terms of data collection, this could include interviews with staff and management, review of policy documents and financial statements, surveying customers, etc.

While the above example is focused squarely on one organisation, it’s worth noting that case study research designs can have different variations, including single-case, multiple-case and longitudinal designs. As you can see in the example, a single-case design involves intensely examining a single entity to understand its unique characteristics and complexities. Conversely, in a multiple-case design, multiple cases are compared and contrasted to identify patterns and commonalities. Lastly, in a longitudinal case design, a single case or multiple cases are studied over an extended period of time to understand how factors develop over time.

As you can see, a case study research design is particularly useful where a deep and contextualised understanding of a specific phenomenon or issue is desired. However, this strength is also its weakness. In other words, you can’t generalise the findings from a case study to the broader population. So, keep this in mind if you’re considering going the case study route.

Case study design often involves investigating an individual to gain an in-depth understanding of their experiences, behaviours or outcomes.

How To Choose A Research Design

Having worked through all of these potential research designs, you’d be forgiven for feeling a little overwhelmed and wondering, “But how do I decide which research design to use?”. While we could write an entire post covering that alone, here are a few factors to consider that will help you choose a suitable research design for your study.

Data type: The first determining factor is naturally the type of data you plan to be collecting – i.e., qualitative or quantitative. This may sound obvious, but we have to be clear about this – don’t try to use a quantitative research design on qualitative data (or vice versa)!

Research aim(s) and question(s): As with all methodological decisions, your research aim and research questions will heavily influence your research design. For example, if your research aims involve developing a theory from qualitative data, grounded theory would be a strong option. Similarly, if your research aims involve identifying and measuring relationships between variables, one of the experimental designs would likely be a better option.

Time: It’s essential that you consider any time constraints you have, as this will impact the type of research design you can choose. For example, if you’ve only got a month to complete your project, a lengthy design such as ethnography wouldn’t be a good fit.

Resources: Take into account the resources realistically available to you, as these need to factor into your research design choice. For example, if you require highly specialised lab equipment to execute an experimental design, you need to be sure that you’ll have access to that before you make a decision.

Keep in mind that when it comes to research, it’s important to manage your risks and play as conservatively as possible. If your entire project relies on you achieving a huge sample, having access to niche equipment or holding interviews with very difficult-to-reach participants, you’re creating risks that could kill your project. So, be sure to think through your choices carefully and make sure that you have backup plans for any existential risks. Remember that a relatively simple methodology executed well will typically earn better marks than a highly complex methodology executed poorly.


Recap: Key Takeaways

We’ve covered a lot of ground here. Let’s recap by looking at the key takeaways:

  • Research design refers to the overall plan, structure or strategy that guides a research project, from its conception to the final analysis of data.
  • Research designs for quantitative studies include descriptive, correlational, experimental and quasi-experimental designs.
  • Research designs for qualitative studies include phenomenological, grounded theory, ethnographic and case study designs.
  • When choosing a research design, you need to consider a variety of factors, including the type of data you’ll be working with, your research aims and questions, your time and the resources available to you.

If you need a helping hand with your research design (or any other aspect of your research), check out our private coaching services .




Research Design | Step-by-Step Guide with Examples

Published on 5 May 2022 by Shona McCombes . Revised on 20 March 2023.

A research design is a strategy for answering your research question using empirical data. Creating a research design means making decisions about:

  • Your overall aims and approach
  • The type of research design you’ll use
  • Your sampling methods or criteria for selecting subjects
  • Your data collection methods
  • The procedures you’ll follow to collect data
  • Your data analysis methods

A well-planned research design helps ensure that your methods match your research aims and that you use the right kind of analysis for your data.

Table of contents

  • Introduction
  • Step 1: Consider your aims and approach
  • Step 2: Choose a type of research design
  • Step 3: Identify your population and sampling method
  • Step 4: Choose your data collection methods
  • Step 5: Plan your data collection procedures
  • Step 6: Decide on your data analysis strategies
  • Frequently asked questions

Before you can start designing your research, you should already have a clear idea of the research question you want to investigate.

There are many different ways you could go about answering this question. Your research design choices should be driven by your aims and priorities – start by thinking carefully about what you want to achieve.

The first choice you need to make is whether you’ll take a qualitative or quantitative approach.

Qualitative research designs tend to be more flexible and inductive, allowing you to adjust your approach based on what you find throughout the research process.

Quantitative research designs tend to be more fixed and deductive, with variables and hypotheses clearly defined in advance of data collection.

It’s also possible to use a mixed methods design that integrates aspects of both approaches. By combining qualitative and quantitative insights, you can gain a more complete picture of the problem you’re studying and strengthen the credibility of your conclusions.

Practical and ethical considerations when designing research

As well as scientific considerations, you need to think practically when designing your research. If your research involves people or animals, you also need to consider research ethics .

  • How much time do you have to collect data and write up the research?
  • Will you be able to gain access to the data you need (e.g., by travelling to a specific location or contacting specific people)?
  • Do you have the necessary research skills (e.g., statistical analysis or interview techniques)?
  • Will you need ethical approval ?

At each stage of the research design process, make sure that your choices are practically feasible.


Within both qualitative and quantitative approaches, there are several types of research design to choose from. Each type provides a framework for the overall shape of your research.

Types of quantitative research designs

Quantitative designs can be split into four main types. Experimental and quasi-experimental designs allow you to test cause-and-effect relationships, while descriptive and correlational designs allow you to measure variables and describe relationships between them.

With descriptive and correlational designs, you can get a clear picture of characteristics, trends, and relationships as they exist in the real world. However, you can’t draw conclusions about cause and effect (because correlation doesn’t imply causation).

Experiments are the strongest way to test cause-and-effect relationships without the risk of other variables influencing the results. However, their controlled conditions may not always reflect how things work in the real world. They’re often also more difficult and expensive to implement.

Types of qualitative research designs

Qualitative designs are less strictly defined. This approach is about gaining a rich, detailed understanding of a specific context or phenomenon, and you can often be more creative and flexible in designing your research.

Common types of qualitative design often take similar approaches to data collection, but focus on different aspects when analysing the data.

Your research design should clearly define who or what your research will focus on, and how you’ll go about choosing your participants or subjects.

In research, a population is the entire group that you want to draw conclusions about, while a sample is the smaller group of individuals you’ll actually collect data from.

Defining the population

A population can be made up of anything you want to study – plants, animals, organisations, texts, countries, etc. In the social sciences, it most often refers to a group of people.

For example, will you focus on people from a specific demographic, region, or background? Are you interested in people with a certain job or medical condition, or users of a particular product?

The more precisely you define your population, the easier it will be to gather a representative sample.

Sampling methods

Even with a narrowly defined population, it’s rarely possible to collect data from every individual. Instead, you’ll collect data from a sample.

To select a sample, there are two main approaches: probability sampling and non-probability sampling. The sampling method you use affects how confidently you can generalise your results to the population as a whole.

Probability sampling is the most statistically valid option, but it’s often difficult to achieve unless you’re dealing with a very small and accessible population.

For practical reasons, many studies use non-probability sampling, but it’s important to be aware of the limitations and carefully consider potential biases. You should always make an effort to gather a sample that’s as representative as possible of the population.
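As a small illustration of probability sampling, here’s a simple random sample drawn from a hypothetical sampling frame (the population and sample sizes are arbitrary):

```python
import random

random.seed(7)  # fixed seed purely so the illustration is reproducible

# Hypothetical sampling frame: 500 student ID numbers
population = list(range(1, 501))

# Simple random sampling: every individual has an equal chance of selection,
# and random.sample draws without replacement (no ID appears twice)
sample = random.sample(population, k=50)

print(f"Sampled {len(sample)} of {len(population)} students")
```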

Case selection in qualitative research

In some types of qualitative designs, sampling may not be relevant.

For example, in an ethnography or a case study, your aim is to deeply understand a specific context, not to generalise to a population. Instead of sampling, you may simply aim to collect as much data as possible about the context you are studying.

In these types of design, you still have to carefully consider your choice of case or community. You should have a clear rationale for why this particular case is suitable for answering your research question.

For example, you might choose a case study that reveals an unusual or neglected aspect of your research problem, or you might choose several very similar or very different cases in order to compare them.

Data collection methods are ways of directly measuring variables and gathering information. They allow you to gain first-hand knowledge and original insights into your research problem.

You can choose just one data collection method, or use several methods in the same study.

Survey methods

Surveys allow you to collect data about opinions, behaviours, experiences, and characteristics by asking people directly. There are two main survey methods to choose from: questionnaires and interviews.

Observation methods

Observations allow you to collect data unobtrusively, observing characteristics, behaviours, or social interactions without relying on self-reporting.

Observations may be conducted in real time, taking notes as you observe, or you might make audiovisual recordings for later analysis. They can be qualitative or quantitative.

Other methods of data collection

There are many other ways you might collect data depending on your field and topic.

If you’re not sure which methods will work best for your research design, try reading some papers in your field to see what data collection methods they used.

Secondary data

If you don’t have the time or resources to collect data from the population you’re interested in, you can also choose to use secondary data that other researchers already collected – for example, datasets from government surveys or previous studies on your topic.

With this raw data, you can do your own analysis to answer new research questions that weren’t addressed by the original study.

Using secondary data can expand the scope of your research, as you may be able to access much larger and more varied samples than you could collect yourself.

However, it also means you don’t have any control over which variables to measure or how to measure them, so the conclusions you can draw may be limited.

As well as deciding on your methods, you need to plan exactly how you’ll use these methods to collect data that’s consistent, accurate, and unbiased.

Planning systematic procedures is especially important in quantitative research, where you need to precisely define your variables and ensure your measurements are reliable and valid.

Operationalisation

Some variables, like height or age, are easily measured. But often you’ll be dealing with more abstract concepts, like satisfaction, anxiety, or competence. Operationalisation means turning these fuzzy ideas into measurable indicators.

If you’re using observations, which events or actions will you count?

If you’re using surveys, which questions will you ask and what range of responses will be offered?

You may also choose to use or adapt existing materials designed to measure the concept you’re interested in – for example, questionnaires or inventories whose reliability and validity has already been established.
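Operationalisation often comes down to concrete scoring rules. A minimal sketch, assuming a hypothetical four-item 1–5 Likert scale in which one reverse-worded item must be flipped before the items are summed into a total score:

```python
# Hypothetical 4-item scale (1-5 Likert) measuring an abstract concept;
# item3 is reverse-worded, so its score must be flipped before summing
responses = {"item1": 4, "item2": 5, "item3": 2, "item4": 4}
reverse_coded = {"item3"}

score = sum(
    (6 - value) if item in reverse_coded else value  # flips 1..5 -> 5..1
    for item, value in responses.items()
)
print(f"Total score: {score} (possible range 4-20)")
```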

Reliability and validity

Reliability means your results can be consistently reproduced, while validity means that you’re actually measuring the concept you’re interested in.

For valid and reliable results, your measurement materials should be thoroughly researched and carefully designed. Plan your procedures to make sure you carry out the same steps in the same way for each participant.

If you’re developing a new questionnaire or other instrument to measure a specific concept, running a pilot study allows you to check its validity and reliability in advance.

Sampling procedures

As well as choosing an appropriate sampling method, you need a concrete plan for how you’ll actually contact and recruit your selected sample.

That means making decisions about things like:

  • How many participants do you need for an adequate sample size?
  • What inclusion and exclusion criteria will you use to identify eligible participants?
  • How will you contact your sample – by mail, online, by phone, or in person?

If you’re using a probability sampling method, it’s important that everyone who is randomly selected actually participates in the study. How will you ensure a high response rate?

If you’re using a non-probability method, how will you avoid bias and ensure a representative sample?

Data management

It’s also important to create a data management plan for organising and storing your data.

Will you need to transcribe interviews or perform data entry for observations? You should anonymise and safeguard any sensitive data, and make sure it’s backed up regularly.

Keeping your data well organised will save time when it comes to analysing them. It can also help other researchers validate and add to your findings.

On their own, raw data can’t answer your research question. The last step of designing your research is planning how you’ll analyse the data.

Quantitative data analysis

In quantitative research, you’ll most likely use some form of statistical analysis . With statistics, you can summarise your sample data, make estimates, and test hypotheses.

Using descriptive statistics , you can summarise your sample data in terms of:

  • The distribution of the data (e.g., the frequency of each score on a test)
  • The central tendency of the data (e.g., the mean to describe the average score)
  • The variability of the data (e.g., the standard deviation to describe how spread out the scores are)
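The three summaries above can be computed with nothing beyond the standard library. A minimal sketch in Python, using invented test scores purely for illustration:

```python
from collections import Counter
from statistics import mean, stdev

# Invented test scores from a small sample (for illustration only)
scores = [72, 85, 85, 90, 64, 72, 85, 78]

distribution = Counter(scores)   # frequency of each score
average = mean(scores)           # central tendency (the mean)
spread = stdev(scores)           # variability (sample standard deviation)

print(distribution[85])  # 3 (the score 85 occurs three times)
print(average)           # 78.875
print(round(spread, 2))  # 8.89
```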

The specific calculations you can do depend on the level of measurement of your variables.

Using inferential statistics , you can:

  • Make estimates about the population based on your sample data.
  • Test hypotheses about a relationship between variables.

Regression and correlation tests look for associations between two or more variables, while comparison tests (such as t tests and ANOVAs ) look for differences in the outcomes of different groups.

Your choice of statistical test depends on various aspects of your research design, including the types of variables you’re dealing with and the distribution of your data.
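As a sketch of how these two families of tests work under the hood, here are the Pearson correlation coefficient (an association test) and the independent-samples t statistic (a comparison test) written out in plain Python. The data are invented, and in practice you would use a statistics package rather than hand-rolled formulas:

```python
from math import sqrt
from statistics import mean, stdev

def pearson_r(x, y):
    """Association: strength of the linear relationship between two variables."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den

def t_statistic(a, b):
    """Comparison: Student's t for two independent groups (equal variances)."""
    na, nb = len(a), len(b)
    pooled = ((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2) / (na + nb - 2)
    return (mean(a) - mean(b)) / sqrt(pooled * (1 / na + 1 / nb))

hours = [1, 2, 3, 4, 5]
score = [52, 57, 61, 68, 72]   # rises steadily with hours, so r is close to +1
print(round(pearson_r(hours, score), 3))

group_a = [78, 85, 90, 72, 88]
group_b = [70, 74, 79, 68, 77]
print(round(t_statistic(group_a, group_b), 2))
```

The resulting t value would then be compared against a critical value (or converted to a p-value) for the relevant degrees of freedom.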

Qualitative data analysis

In qualitative research, your data will usually be very dense with information and ideas. Instead of summing it up in numbers, you’ll need to comb through the data in detail, interpret its meanings, identify patterns, and extract the parts that are most relevant to your research question.

Two of the most common approaches to doing this are thematic analysis and discourse analysis .

There are many other ways of analysing qualitative data depending on the aims of your research. To get a sense of potential approaches, try reading some qualitative research papers in your field.

A sample is a subset of individuals from a larger population. Sampling means selecting the group that you will actually collect data from in your research.

For example, if you are researching the opinions of students in your university, you could survey a sample of 100 students.

Statistical sampling allows you to test a hypothesis about the characteristics of a population. There are various sampling methods you can use to ensure that your sample is representative of the population as a whole.
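The student survey example above can be sketched in a few lines of Python. `random.sample` draws a simple random sample in which every member of the sampling frame has an equal chance of selection; the population size and IDs here are invented:

```python
import random

# Hypothetical sampling frame: one ID per enrolled student
population = list(range(1, 20001))  # 20,000 students

random.seed(42)  # fix the seed so the draw is reproducible
sample = random.sample(population, 100)

print(len(sample))       # 100 students selected
print(len(set(sample)))  # 100 -- sampling is without replacement
```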

Operationalisation means turning abstract conceptual ideas into measurable observations.

For example, the concept of social anxiety isn’t directly observable, but it can be operationally defined in terms of self-rating scores, behavioural avoidance of crowded places, or physical anxiety symptoms in social situations.

Before collecting data , it’s important to consider how you will operationalise the variables that you want to measure.

The research methods you use depend on the type of data you need to answer your research question .

  • If you want to measure something or test a hypothesis , use quantitative methods . If you want to explore ideas, thoughts, and meanings, use qualitative methods .
  • If you want to analyse a large amount of readily available data, use secondary data. If you want data specific to your purposes with control over how they are generated, collect primary data.
  • If you want to establish cause-and-effect relationships between variables , use experimental methods. If you want to understand the characteristics of a research subject, use descriptive methods.

Cite this Scribbr article


McCombes, S. (2023, March 20). Research Design | Step-by-Step Guide with Examples. Scribbr. Retrieved 22 April 2024, from https://www.scribbr.co.uk/research-methods/research-design/




Research Design and Methodology

Submitted: 23 January 2019 Reviewed: 08 March 2019 Published: 07 August 2019

DOI: 10.5772/intechopen.85731

Cite this chapter


From the edited volume Cyberspace, edited by Evon Abu-Taieh, Abdelkrim El Mouatasim and Issam H. Al Hadid



A number of approaches are used in this research design. The purpose of this chapter is to set out the methodology of the research, which combines several types of research techniques, and to show how the researcher arrives at the findings. The general design of the research and the methods used for data collection are explained in detail. The chapter has three main parts: the first highlights the dissertation design, the second discusses qualitative and quantitative data collection methods, and the last illustrates the general research framework. The purpose of this section is to indicate how the research was conducted throughout the study period.

Keywords:

  • research design
  • methodology
  • data sources

Author Information

Kassu Jilcha Sileyew*

  • School of Mechanical and Industrial Engineering, Addis Ababa Institute of Technology, Addis Ababa University, Addis Ababa, Ethiopia

*Address all correspondence to: [email protected]

1. Introduction

Research methodology is the path through which researchers conduct their research. It shows how they formulate their problem and objective and present their results from the data obtained during the study period. This research design and methodology chapter also shows how the research outcome will be obtained in line with the objective of the study, and it discusses the research methods used during the research process, from the research strategy through to the dissemination of results. For emphasis, the author outlines the research strategy; the research design and methodology; the study area; the data sources (primary and secondary); the population and sample size determination (for both the questionnaires and the workplace site exposure measurements); the data collection methods (workplace site observation, desk review, questionnaires, experts’ opinions, workplace site exposure measurement, and the pretest of the data collection tools); the methods of data analysis (quantitative and qualitative) and the analysis software; the reliability and validity analysis of the quantitative data; data quality management; inclusion criteria; ethical considerations; and the dissemination of results and their utilization. To satisfy the objectives of the study, qualitative and quantitative research methods were combined, because data were obtained from all aspects of the data source during the study period. The purpose of this methodology is therefore to satisfy the research plan and targets devised by the researcher.

2. Research design

The research design is intended to provide an appropriate framework for a study. A very significant decision in the research design process is the choice of research approach, since it determines how relevant information for the study will be obtained; the research design process, however, involves many interrelated decisions [ 1 ].

This study employed a mixed type of methods. The first part of the study consisted of a series of well-structured questionnaires (for management, employees’ representatives, and industry technicians) and semi-structured interviews with key stakeholders (government bodies, ministries, and industries) in the participating organizations. The study also used interviews with employees, to learn how they feel about the safety and health of their workplace, and field observation at the selected industrial sites.

Hence, this study employs a descriptive research design to examine the effects of an occupational safety and health management system on employee health, safety, and property damage in selected manufacturing industries. Saunders et al. [ 2 ] and Miller [ 3 ] note that descriptive research portrays an accurate profile of persons, events, or situations. This design offers the researchers a profile of the relevant aspects of the phenomena of interest from an individual, organizational, and industry-oriented perspective. It therefore enabled the researchers to gather data from a wide range of respondents on the impact of safety and health on manufacturing industries in Ethiopia, and helped in analyzing how the responses bear on workplace safety and health in those industries. The overall research design and flow process are depicted in Figure 1 .

Figure 1. Research methods and processes (author design).

3. Research methodology

To address the key research objectives, this research used both qualitative and quantitative methods and a combination of primary and secondary sources. The qualitative data support the quantitative data analysis and results, and the results are triangulated since both data types were used in the analysis. The study area, data sources, and sampling techniques are discussed in this section.

3.1 The study area

According to Fraenkel and Warren [ 4 ], a population refers to the complete set of individuals (subjects or events) having common characteristics in which the researcher is interested. The population of the study was determined using random sampling. Data collection was conducted from March 07, 2015 to December 10, 2016, in selected manufacturing industries in and around Addis Ababa city. The manufacturing companies were selected based on their number of employees, year of establishment, the potential accidents prevailing, and the type of manufacturing industry, even though all criteria were difficult to satisfy.

3.2 Data sources

3.2.1 Primary data sources

Primary data were obtained from the original source of information. They are more reliable and support more confident decision-making, since the analysis is in direct contact with the occurrence of the events. The primary data sources were the industries’ working environment (observation, pictures, and photographs) and industry employees, both management and shop-floor workers (interviews, questionnaires, and discussions).

3.2.2 Secondary data

A desk review was conducted to collect data from various secondary sources. This included reports and project documents from each manufacturing sector (focusing on the medium and large levels). Secondary data were obtained from the literature on OSH, with the remaining data drawn from the companies’ manuals, reports, and management documents included in the desk review. Reputable journals, books, articles, periodicals, proceedings, magazines, newsletters, newspapers, websites, and other sources on the manufacturing industrial sectors were considered, along with existing working documents, manuals, procedures, reports, statistical data, policies, regulations, and standards.

In general, for this research study, the desk review was completed to this end and was refined using the manuals and documents obtained from the selected companies.

4. Population and sample size

4.1 Population

The study population consisted of manufacturing industry employees in and around Addis Ababa city, where the most representative manufacturing industrial clusters are found. To select a representative population, the industry types considered most prone to accidents were chosen using random and purposive sampling. The population was drawn from the textile, leather, metal, chemical, and food manufacturing industries. A total of 189 industries from the government’s priority areas responded to the questionnaire survey. Random and disproportionate sampling methods were used: 80 responses came from wood, metal, and iron works; 30 from food, beverage, and tobacco products; 50 from leather, textile, and garments; 20 from chemical and chemical products; and 9 from the remaining clusters of manufacturing industries.

4.2 Questionnaire sample size determination

Simple random sampling and purposive sampling were used to select the representative manufacturing industries and respondents for the study. Simple random sampling ensures that each member of the population has an equal chance of selection. A sample size determination procedure was used to obtain optimum and reasonable information. Both probability (simple random) and non-probability (convenience, quota, purposive, and judgmental) sampling methods were used, since the nature of the industries varied and the characteristics of the data sources permitted the researchers to follow multiple methods. This helps triangulate the data obtained and increases the reliability of the research outcome and the decisions based on it. The selection criteria included the companies’ establishment date and time in operation, the number of employees, the ownership type (government or private), the type of manufacturing industry/production, the types of resources used at work, and the location in and around the city.

The sample size determination was adopted from Daniel [ 5 ] and Cochran [ 6 ]. The formula used, for an unknown population size, is given in Eq. (1) as

n = Z² P (1 − P) / d²    (1)

where n  = sample size, Z  = statistic for a level of confidence, P  = expected prevalence or proportion (as a proportion of one; if 50%, P  = 0.5), and d  = precision (as a proportion of one; if 6%, d  = 0.06). For the conventional 95% level of confidence, the Z value is 1.96. In this study, the investigators present their results with 95% confidence intervals (CI).

The expected sample size was 267 manufacturing industries at a margin of error of 6% and a 95% confidence interval. However, only 189 responses could be used for the analysis after rejecting those with too many missing values, giving a response rate of 71%. This sample was considered satisfactory and representative for the data analysis.
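The arithmetic above can be checked directly. A minimal sketch of Eq. (1) and the reported response rate in Python; rounding the required sample up to a whole respondent is an assumption of this sketch:

```python
import math

def cochran_n(z, p, d):
    """Cochran's required sample size for an unknown (large) population."""
    return math.ceil(z ** 2 * p * (1 - p) / d ** 2)

n = cochran_n(z=1.96, p=0.5, d=0.06)
print(n)  # 267 expected respondents

responded = 189
print(round(responded / n * 100))  # 71 (% response rate)
```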

4.3 Workplace site exposure measurement sample determination

The sample for the experimental exposure measurements of the physical work environment was based on the physical data prepared for the questionnaires and respondents. Workplaces with positive responses were considered for measurement of physical factors that can affect health or cause disease (noise intensity, light intensity, pressure/stress, vibration, temperature or coldness/hotness, and dust particles) at 20 workplace sites. Sites were selected using random sampling combined with a purposive method. The exposure factors were measured in collaboration with the Addis Ababa City Administration and Oromia Bureau of Labour and Social Affairs (AACBOLSA), which also provided some of the measuring instruments.

5. Data collection methods

Data collection focused on the following basic techniques: secondary and primary data collection, covering both qualitative and quantitative data as defined in the previous section. The data collection mechanisms were devised and prepared with proper procedures.

5.1 Primary data collection methods

Primary data sources are both qualitative and quantitative. The qualitative sources were field observation, interviews, and informal discussions, while the quantitative sources were survey questionnaires and interview questions. The next sections elaborate on how the data were obtained from the primary sources.

5.1.1 Workplace site observation data collection

Observation is an important aspect of science and is tightly connected to data collection, for which there are different sources: documentation, archival records, interviews, direct observations, and participant observations. Observational research findings are considered strong in validity because the researcher is able to collect in-depth information about a particular behavior. In this dissertation, the researchers used observation as one tool for collecting information and data, both before the questionnaire design and after the research began. The researcher made more than 20 specific observations of manufacturing industries in the study areas and gained a deeper understanding of the working environment, the different sections of the production system, and OSH practices.

5.1.2 Data collection through interview

An interview is a loosely structured, qualitative, in-depth conversation with people considered particularly knowledgeable about the topic of interest. The semi-structured interview is usually conducted face to face, which permits the researcher to seek new insights, ask questions, and assess phenomena from different perspectives. It allowed the researcher to understand in depth the influential factors in the present working environment and their consequences. It provided opportunities for refining the data collection efforts and examining specialized systems or processes, and it was used where written records or published documents were limited, or where the researcher wanted to triangulate the data obtained from other primary and secondary sources.

This dissertation therefore also takes a qualitative approach through interviews. The advantage of interviews as a method is that they allow respondents to raise issues that the interviewer may not have expected. All interviews with employees, management, and technicians were conducted by the corresponding researcher on a face-to-face basis at the workplace, and all were recorded and transcribed.

5.1.3 Data collection through questionnaires

The main tool for gaining primary information in practical research is the questionnaire, because the researcher can decide on the sample and the types of questions to be asked [ 2 ].

In this dissertation, each respondent was asked to reply to an identical list of questions, presented in mixed order so that bias was prevented. The questionnaire design was initially coded and the items mixed across specific topics based on uniform structures. Consequently, the questionnaire produced the valuable data required to achieve the dissertation objectives.

The questionnaires were based on a five-point Likert scale. Responses were given to each statement on a scale from 1 = “strongly disagree” to 5 = “strongly agree,” and the responses were summed to produce a score for each measure.
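The summing step can be sketched as follows. The responses are invented, and the reverse-scored item is purely an assumption for illustration (the chapter does not say whether any items were negatively worded):

```python
# One participant's answers to a hypothetical five-statement measure,
# coded 1 = "strongly disagree" ... 5 = "strongly agree"
responses = [4, 5, 3, 4, 2]

# Negatively worded items, if any, are reverse-scored (r -> 6 - r)
# before summing; here the last item is assumed to be one.
REVERSED = {4}  # zero-based positions of reverse-scored items

score = sum(6 - r if i in REVERSED else r for i, r in enumerate(responses))
print(score)  # 4 + 5 + 3 + 4 + (6 - 2) = 20
```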

5.1.4 Data obtained from experts’ opinion

Data were also obtained from experts’ opinions on the comparison of knowledge, management, collaboration, and technology utilization, including their sub-factors. The data obtained in this way were used for prioritization and decision-making on improving OSH factor priority. The factors were prioritized using Saaty scales (1–9) and then converted to fuzzy set values, obtained from previous research using a triangular fuzzy set [ 7 ].

5.1.5 Workplace site exposure measurement

The researcher measured the workplace environment for dust, vibration, heat, pressure, light, and noise to establish the level of each variable. The planned and actual coverage of the primary data sources are compared in Table 1 .

Table 1. Planned versus actual coverage of the survey.

The response rate for the proposed data sources was good, and the pilot test also confirmed the reliability of the questionnaires. Interviews/discussions achieved an 87% response rate, the survey questionnaire achieved 71%, and field observation achieved 90% for the whole data analysis process. Hence, the quality of the organized data was not compromised.

This response rate is considered representative for studies of organizations. A response rate of 30% is regarded as acceptable [ 8 ], and Saunders et al. [ 2 ] argue that a 20% response rate is acceptable for a scaled questionnaire. A low response rate should not discourage researchers, because a great deal of published research also achieves low response rates. Hence, the response rate of this study is acceptable and very good for meeting the study objectives.

5.1.6 Data collection tool pretest

Pretests of the questionnaires, interviews, and tools were conducted to check that their content was valid in the sense of the respondents’ understanding. Content validity (whether the questions address the target without excluding important points), internal validity (whether the questions answer the researchers’ target outcomes), and external validity (whether the results generalize from the survey sample to the whole population) were all considered and confirmed in this pilot test prior to the start of the main data collection. Following the feedback process, a few minor changes were made to the originally designed data collection tools. The pilot test of the questionnaire used a sample of 10, selected randomly from the target sectors and experts.

5.2 Secondary data collection methods

Secondary data are data collected by someone other than the user. These sources give insight into the current state of the art in the research area and help identify the research gap that the researcher needs to fill. Secondary data can come from internal and external sources of information covering a wide range of areas.

Literature/desk review and industry documents and reports: To achieve the dissertation’s objectives, the researcher conducted an extensive review of documents and company reports in both online and offline modes. From a methodological point of view, literature reviews can be understood as content analysis, in which quantitative and qualitative aspects are mixed to assess structural (descriptive) as well as content criteria.

A literature search was conducted using database sources such as MEDLINE; Emerald; Taylor and Francis publications; EMBASE (medical literature); PsycINFO (psychological literature); Sociological Abstracts (sociological literature); accident prevention journals; US Statistics of Labor; the European Safety and Health database; ABI Inform; Business Source Premier (business/management literature); EconLit (economic literature); Social Service Abstracts (social work and social service literature); and other related materials. The search strategy focused on articles or reports that measure one or more of the dimensions within the research OSH model framework, and was based on a framework and measurement filter strategy developed by the Consensus-Based Standards for the Selection of Health Measurement Instruments (COSMIN) group. Articles unrelated to the research model and objectives were excluded on screening. Prior to screening, the researcher (principal investigator) reviewed a sample of more than 2000 articles, websites, reports, and guidelines to determine whether they should be included for further review or rejected. Discrepancies were thoroughly identified and resolved before the review of the main group of more than 300 articles commenced. After excluding articles based on title, keywords, and abstract, the remaining articles were reviewed in detail, and information was extracted on the instrument used to assess the dimension of research interest. A complete list of items was then collated within each research target or objective and reviewed to identify any missing elements.

6. Methods of data analysis

The data analysis followed the procedures listed in the sections below and answered the basic questions raised in the problem statement. The experiences of developed and developing countries with OSH in manufacturing industries were analyzed, discussed, compared and contrasted, and synthesized in detail.

6.1 Quantitative data analysis

Quantitative data were obtained from the primary and secondary sources discussed above in this chapter. The analysis was carried out according to data type using Excel, SPSS 20.0, Word, and other tools, and focused on numerical/quantitative data.

Before analysis, the responses were coded. To make the data obtained from the questionnaires easy to analyze, they were coded into SPSS 20.0. This task involved identifying, classifying, and assigning a numeric or character symbol to the data, done in only one way since the responses were pre-coded [ 9 , 10 ]. In this study, all responses were pre-coded: from the list of responses, a number corresponding to each particular selection was assigned. This process was applied to every question that needed such treatment. Upon completion, the data were entered into the statistical analysis software package SPSS version 20.0 on Windows 10 for the next steps.

In the data analysis, the data were explored with descriptive statistics and graphical analysis. The analysis included exploring the relationships between variables and comparing groups to see how they affect each other, using cross tabulation/chi-square, correlation, factor analysis, and nonparametric statistics.
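As an illustration of the cross tabulation/chi-square step, Pearson’s chi-square statistic can be computed by hand from a contingency table. The 2×2 counts below are invented for illustration (the study’s actual analysis used SPSS):

```python
# Invented cross tabulation: safety training received (rows)
# versus accident reported (columns)
table = [[30, 70],
         [55, 45]]

row_tot = [sum(row) for row in table]
col_tot = [sum(col) for col in zip(*table)]
grand = sum(row_tot)

# Pearson's chi-square: sum over cells of (observed - expected)^2 / expected
chi2 = 0.0
for i, row in enumerate(table):
    for j, obs in enumerate(row):
        expected = row_tot[i] * col_tot[j] / grand
        chi2 += (obs - expected) ** 2 / expected

print(round(chi2, 2))  # 12.79, compared against the chi-square critical value
```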

6.2 Qualitative data analysis

Qualitative data analysis was used for triangulation with the quantitative data analysis. The interview, observation, and report records were used to support the findings, and the analysis was incorporated with the quantitative results in the data analysis sections.

6.3 Data analysis software

The data were entered and analyzed using SPSS 20.0 on Windows 10. The SPSS-supported analysis contributed much to the findings, as well as to data validation and the correctness of the results. The software analyzed and compared the results for the different variables used in the research questionnaires. Excel was also used to draw figures and calculate some analytical solutions.

7. The reliability and validity analysis of the quantitative data

7.1 Reliability of data

The reliability of a measurement specifies the extent to which it is free of bias (error) and hence ensures consistent measurement across time and across the various items in the instrument [ 8 ]. In the reliability analysis, the stability and consistency of the data were checked, along with the accuracy and precision of the measurement procedure. Reliability has numerous definitions and approaches, but in most settings the concept comes down to consistency [ 8 ]: a measurement fulfills the requirements of reliability when it produces consistent results during the data analysis procedure. Reliability was determined through Cronbach’s alpha, as shown in Table 2 .

Table 2. Internal consistency and reliability test of questionnaire items.

K stands for knowledge; M, management; T, technology; C, collaboration; P, policy, standards, and regulation; H, hazards and accident conditions; PPE, personal protective equipment.

7.2 Reliability analysis

Cronbach’s alpha is a measure of internal consistency, i.e., how closely related a set of items are as a group [ 11 ]. It is considered a measure of scale reliability, and internal consistency is most often assessed with the Cronbach’s alpha value; a reliability coefficient of 0.70 and above is considered “acceptable” in most research situations [ 12 ]. In this study, after 13 items were deleted, the reliability coefficient for the remaining 76 Likert-scale items was 0.964, with the coefficients for the individual groupings shown in Table 2 . The instruments were thus internally consistent by the Cronbach’s alpha test: Table 2 shows that the reliability of the seven major instruments falls within the acceptable range for this research.
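Cronbach’s alpha itself is straightforward to compute: it compares the sum of the individual item variances with the variance of the respondents’ total scores. A minimal sketch with invented scores (the study’s own coefficients came from SPSS):

```python
from statistics import pvariance

def cronbach_alpha(items):
    """items: one list per questionnaire item, each holding that item's
    scores across all respondents (population variances are used)."""
    k = len(items)
    totals = [sum(t) for t in zip(*items)]  # each respondent's total score
    item_var = sum(pvariance(col) for col in items)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Invented data: 3 items answered by 5 respondents
items = [
    [4, 3, 5, 4, 2],
    [4, 2, 5, 4, 3],
    [5, 3, 4, 4, 2],
]
print(round(cronbach_alpha(items), 3))  # 0.897, above the 0.70 threshold
```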

7.3 Validity

Face validity, as defined by Babbie [ 13 ], is an indicator that makes an instrument seem a reasonable measure of some variable; it is the subjective judgment that the instrument measures what it intends to measure in terms of relevance [ 14 ]. Thus, when developing the instruments for this study, the researcher eliminated uncertainties by using appropriate words and concepts in order to enhance clarity and general suitability [ 14 ]. Furthermore, the researcher submitted the instruments to the research supervisor and the joint supervisor, both occupational health experts, to ensure the validity of the measuring instruments and to determine whether they could be considered valid on face value.

In this study, the researcher was guided by the reviewed literature on compliance with occupational health and safety conditions and on data collection methods before developing the measuring instruments. In addition, the pretest conducted prior to the main study helped the researcher remove ambiguities from the data collection instruments. A thorough inspection of the instruments by the statistician, the researcher’s supervisor and the joint experts, to ensure that all concepts pertaining to the study were included, further enriched the instruments.

8. Data quality management

The data collectors were briefed on how to approach companies; many of the questionnaires were distributed through MSc students at the Addis Ababa Institute of Technology (AAiT) and through experienced experts in the manufacturing industries, and data quality was continually discussed with them, which improved its reliability. The questionnaire was pretested on 10 workers to assure data quality and to improve the data collection tools. Data collection was supervised to see how the collectors handled the questionnaire, and each completed questionnaire was checked daily for completeness, accuracy, clarity and consistency, either face-to-face or by phone/email. Questionnaires of poor quality were rejected during screening. Of the 267 questionnaires distributed, 189 were returned. Finally, the data were analyzed by the principal investigator.

9. Inclusion criteria

The data were collected from company representatives with knowledge of OSH. Articles written in English and Amharic were included in this study. Database records were included if they related to OSH topics such as intervention methods, methods of accident identification, the impact of occupational accidents, types of occupational injuries/diseases, and the effect of occupational accidents and diseases on productivity and company costs, and if they used at least one form of feedback mechanism. No specific time period was imposed, in order to access all available published papers. Duplicate statements within the questionnaire were excluded from the data analysis.

10. Ethical consideration

Ethical clearance was obtained from the School of Mechanical and Industrial Engineering, Institute of Technology, Addis Ababa University. Official letters were written by the School of Mechanical and Industrial Engineering to the respective manufacturing industries. The purpose of the study was explained to the study subjects, who were told that the information they provided would be kept confidential and that their identities would not be revealed in association with it. Informed consent was secured from each participant. Where the assessment identified a poor working environment, feedback will be given to all manufacturing industries involved in the study, and there is a plan to give a copy of the results to the respective industries’ and ministries’ offices. The respondents’ privacy was protected; their responses were not individually analyzed or included in the report.

11. Dissemination and utilization of the result

The results of this study will be presented to Addis Ababa University, AAiT, School of Mechanical and Industrial Engineering. They will also be communicated to the Ethiopian manufacturing industries, the Ministry of Labor and Social Affairs, the Ministry of Industry and the Ministry of Health, from which the data were collected. The results will also be made available through publication and online via Google Scholar. To this end, about five articles have been published and disseminated worldwide.

12. Conclusion

The research methodology and design described the overall flow of the research for the given study, including the data sources and data collection methods used. The overall research strategy and framework are indicated in this research process, from problem formulation to validation of the findings, including all the parameters. The chapter thus lays a foundation for how a research methodology can be devised and framed, so that researchers may treat it as one sample and model for the research process, from the statement of the problem to the research findings. In particular, this research flow should help researchers who are new to the research environment and its methodology.

Conflict of interest

There is no conflict of interest.

  • 1. Aaker A, Kumar VD, George S. Marketing Research. New York: John Wiley & Sons Inc; 2000
  • 2. Saunders M, Lewis P, Thornhill A. Research Methods for Business Student. 5th ed. Edinburgh Gate: Pearson Education Limited; 2009
  • 3. Miller P. Motivation in the Workplace. Work and Organizational Psychology. Oxford: Blackwell Publishers; 1991
  • 4. Fraenkel JR, Wallen NE. How to Design and Evaluate Research in Education. 4th ed. New York: McGraw-Hill; 2002
  • 5. Daniel WW. Biostatistics: A Foundation for Analysis in the Health Sciences. 7th ed. New York: John Wiley & Sons; 1999
  • 6. Cochran WG. Sampling Techniques. 3rd ed. New York: John Wiley & Sons; 1977
  • 7. Saaty TL. The Analytic Hierarchy Process. Pittsburgh: RWS Publications; 1990
  • 8. Sekaran U, Bougie R. Research Methods for Business: A Skill Building Approach. 5th ed. New Delhi: John Wiley & Sons, Ltd; 2010. pp. 1-468
  • 9. Luck DJ, Rubin RS. Marketing Research. 7th ed. New Jersey: Prentice-Hall International; 1987
  • 10. Wong TC. Marketing Research. Oxford, UK: Butterworth-Heinemann; 1999
  • 11. Cronbach LJ. Coefficient alpha and the internal structure of tests. Psychometrika. 1951; 16 :297-334
  • 12. Tavakol M, Dennick R. Making sense of Cronbach’s alpha. International Journal of Medical Education. 2011; 2 :53-55. DOI: 10.5116/ijme.4dfb.8dfd
  • 13. Babbie E. The Practice of Social Research. 12th ed. Belmont, CA: Wadsworth; 2010
  • 14. Polit DF, Beck CT. Nursing Research: Generating and Assessing Evidence for Nursing Practice. 8th ed. Philadelphia: Lippincott Williams & Wilkins; 2008

© 2019 The Author(s). Licensee IntechOpen. This chapter is distributed under the terms of the Creative Commons Attribution 3.0 License , which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Edited by Evon Abu-Taieh

Published: 17 June 2020

What is Research Methodology? Definition, Types, and Examples


Research methodology 1,2 is a structured and scientific approach used to collect, analyze, and interpret quantitative or qualitative data to answer research questions or test hypotheses. A research methodology is like a plan for carrying out research and helps keep researchers on track by limiting the scope of the research. Several aspects must be considered before selecting an appropriate research methodology, such as research limitations and ethical concerns that may affect your research.

The research methodology section in a scientific paper describes the different methodological choices made, such as the data collection and analysis methods, and why these choices were selected. The reasons should explain why the methods chosen are the most appropriate to answer the research question. A good research methodology also helps ensure the reliability and validity of the research findings. There are three types of research methodology—quantitative, qualitative, and mixed-method, which can be chosen based on the research objectives.

What is research methodology?

A research methodology describes the techniques and procedures used to identify and analyze information regarding a specific research topic. It is a process by which researchers design their study so that they can achieve their objectives using the selected research instruments. It includes all the important aspects of research, including research design, data collection methods, data analysis methods, and the overall framework within which the research is conducted. While these points can help you understand what is research methodology, you also need to know why it is important to pick the right methodology.

Why is research methodology important?

Having a good research methodology in place has the following advantages: 3

  • Helps other researchers who may want to replicate your research; the explanations will be of benefit to them.
  • You can easily answer any questions about your research if they arise at a later stage.
  • A research methodology provides a framework and guidelines for researchers to clearly define research questions, hypotheses, and objectives.
  • It helps researchers identify the most appropriate research design, sampling technique, and data collection and analysis methods.
  • A sound research methodology helps researchers ensure that their findings are valid and reliable and free from biases and errors.
  • It also helps ensure that ethical guidelines are followed while conducting research.
  • A good research methodology helps researchers in planning their research efficiently, by ensuring optimum usage of their time and resources.


Types of research methodology

There are three types of research methodology based on the type of research and the data required. 1

  • Quantitative research methodology focuses on measuring and testing numerical data. This approach is good for reaching a large number of people in a short amount of time. This type of research helps in testing the causal relationships between variables, making predictions, and generalizing results to wider populations.
  • Qualitative research methodology examines the opinions, behaviors, and experiences of people. It collects and analyzes words and textual data. This research methodology requires fewer participants but is still more time consuming because the time spent per participant is quite large. This method is used in exploratory research where the research problem being investigated is not clearly defined.
  • Mixed-method research methodology uses the characteristics of both quantitative and qualitative research methodologies in the same study. This method allows researchers to validate their findings, verify if the results observed using both methods are complementary, and explain any unexpected results obtained from one method by using the other method.

What are the types of sampling designs in research methodology?

Sampling 4 is an important part of a research methodology and involves selecting a representative sample of the population to conduct the study, making statistical inferences about them, and estimating the characteristics of the whole population based on these inferences. There are two types of sampling designs in research methodology—probability and nonprobability.

  • Probability sampling

In this type of sampling design, a sample is chosen from a larger population using some form of random selection, that is, every member of the population has an equal chance of being selected. The different types of probability sampling are:

  • Systematic —sample members are chosen at regular intervals. It requires selecting a starting point for the sample and sample size determination that can be repeated at regular intervals. This type of sampling method has a predefined range; hence, it is the least time consuming.
  • Stratified —researchers divide the population into smaller groups that don’t overlap but represent the entire population. While sampling, these groups can be organized, and then a sample can be drawn from each group separately.
  • Cluster —the population is divided into clusters based on demographic parameters like age, sex, location, etc.
  • Convenience —selects participants who are most easily accessible to researchers due to geographical proximity, availability at a particular time, etc.
  • Purposive —participants are selected at the researcher’s discretion. Researchers consider the purpose of the study and the understanding of the target audience.
  • Snowball —already selected participants use their social networks to refer the researcher to other potential participants.
  • Quota —while designing the study, the researchers decide how many people with which characteristics to include as participants. The characteristics help in choosing people most likely to provide insights into the subject.
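
The mechanics of two of these designs can be sketched in a few lines of Python (standard library only). The worker records below are hypothetical, purely to show the selection logic:

```python
import random

def systematic_sample(population, n):
    """Every k-th member after a random start, with k = N // n."""
    k = len(population) // n
    start = random.randrange(k)
    return population[start::k][:n]

def stratified_sample(population, strata_key, per_stratum):
    """Draw `per_stratum` members at random from each stratum."""
    strata = {}
    for member in population:
        strata.setdefault(strata_key(member), []).append(member)
    sample = []
    for group in strata.values():
        sample.extend(random.sample(group, per_stratum))
    return sample

# Hypothetical population: 100 workers in two departments.
workers = [{"id": i, "dept": "textile" if i % 2 else "metal"}
           for i in range(100)]
print(len(systematic_sample(workers, 10)))                      # 10
print(len(stratified_sample(workers, lambda w: w["dept"], 5)))  # 10
```

Note the design difference: systematic sampling fixes the interval and randomizes only the starting point, while stratified sampling guarantees that every subgroup is represented in the sample.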

What are data collection methods?

During research, data are collected using various methods depending on the research methodology being followed and the research methods being undertaken. Both qualitative and quantitative research have different data collection methods, as listed below.

Qualitative research 5

  • One-on-one interviews: Helps the interviewers understand a respondent’s subjective opinion and experience pertaining to a specific topic or event
  • Document study/literature review/record keeping: Researchers’ review of already existing written materials such as archives, annual reports, research articles, guidelines, policy documents, etc.
  • Focus groups: Constructive discussions that usually include a small sample of about 6-10 people and a moderator, to understand the participants’ opinion on a given topic.
  • Qualitative observation : Researchers collect data using their five senses (sight, smell, touch, taste, and hearing).

Quantitative research 6

  • Sampling: The most common type is probability sampling.
  • Interviews: Commonly telephonic or done in-person.
  • Observations: Structured observations are most commonly used in quantitative research. In this method, researchers make observations about specific behaviors of individuals in a structured setting.
  • Document review: Reviewing existing research or documents to collect evidence for supporting the research.
  • Surveys and questionnaires: Surveys can be administered both online and offline depending on the requirement and sample size.


What are data analysis methods?

The data collected using the various methods for qualitative and quantitative research need to be analyzed to generate meaningful conclusions. These data analysis methods 7 also differ between quantitative and qualitative research.

Quantitative research involves a deductive method for data analysis where hypotheses are developed at the beginning of the research and precise measurement is required. The methods include statistical analysis applications to analyze numerical data and are grouped into two categories—descriptive and inferential.

Descriptive analysis is used to describe the basic features of different types of data to present it in a way that ensures the patterns become meaningful. The different types of descriptive analysis methods are:

  • Measures of frequency (count, percent, frequency)
  • Measures of central tendency (mean, median, mode)
  • Measures of dispersion or variation (range, variance, standard deviation)
  • Measure of position (percentile ranks, quartile ranks)
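
All four families of descriptive measures listed above map directly onto Python’s built-in `statistics` module; the scores below are invented example data:

```python
import statistics as st

scores = [2, 3, 3, 4, 4, 4, 5, 5, 1, 4]  # e.g., ten survey responses

# Measures of frequency: count of each distinct value
freq = {v: scores.count(v) for v in sorted(set(scores))}
# Measures of central tendency
mean, median, mode = st.mean(scores), st.median(scores), st.mode(scores)
# Measures of dispersion
data_range = max(scores) - min(scores)
variance, std_dev = st.variance(scores), st.stdev(scores)
# Measures of position: the three quartile cut points
q1, q2, q3 = st.quantiles(scores, n=4)

print(mean, median, mode, data_range)  # 3.5 4.0 4 4
```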

Inferential analysis is used to make predictions about a larger population based on the analysis of the data collected from a smaller population. This analysis is used to study the relationships between different variables. Some commonly used inferential data analysis methods are:

  • Correlation: To understand the relationship between two or more variables.
  • Cross-tabulation: Analyze the relationship between multiple variables.
  • Regression analysis: Study the impact of independent variables on the dependent variable.
  • Frequency tables: To understand the frequency of data.
  • Analysis of variance: To test whether the means of two or more groups differ significantly in an experiment.
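
As a small illustration of the first and third items, the following standard-library Python computes Pearson’s correlation and a least-squares regression line; the paired data are invented (say, hours of study vs. test score):

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def least_squares(x, y):
    """Slope and intercept of the least-squares line y = a*x + b."""
    mx, my = mean(x), mean(y)
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

hours = [1, 2, 3, 4, 5]        # independent variable
score = [52, 55, 61, 64, 68]   # dependent variable (invented)

r = pearson_r(hours, score)
slope, intercept = least_squares(hours, score)
print(round(r, 3), round(slope, 2), round(intercept, 2))  # 0.994 4.1 47.7
```

Here r close to 1 indicates a strong positive linear relationship, and the slope estimates the change in the dependent variable per unit of the independent variable.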

Qualitative research involves an inductive method for data analysis where hypotheses are developed after data collection. The methods include:

  • Content analysis: For analyzing documented information from text and images by determining the presence of certain words or concepts in texts.
  • Narrative analysis: For analyzing content obtained from sources such as interviews, field observations, and surveys. The stories and opinions shared by people are used to answer research questions.
  • Discourse analysis: For analyzing interactions with people considering the social context, that is, the lifestyle and environment, under which the interaction occurs.
  • Grounded theory: Involves hypothesis creation by data collection and analysis to explain why a phenomenon occurred.
  • Thematic analysis: To identify important themes or patterns in data and use these to address an issue.

How to choose a research methodology?

Here are some important factors to consider when choosing a research methodology: 8

  • Research objectives, aims, and questions —these would help structure the research design.
  • Review existing literature to identify any gaps in knowledge.
  • Check the statistical requirements —if data-driven or statistical results are needed then quantitative research is the best. If the research questions can be answered based on people’s opinions and perceptions, then qualitative research is most suitable.
  • Sample size —sample size can often determine the feasibility of a research methodology. For a large sample, less effort- and time-intensive methods are appropriate.
  • Constraints —constraints of time, geography, and resources can help define the appropriate methodology.
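
When sample size determines feasibility, one quick way to estimate the required n is Cochran’s formula for a proportion, sketched below in standard-library Python; the z = 1.96, p = 0.5, e = 0.05 values are the conventional 95%-confidence defaults, not figures from this article:

```python
import math

def cochran_sample_size(z, p, e, N=None):
    """Cochran's sample size for estimating a proportion.

    z: z-score for the desired confidence level (1.96 for 95%)
    p: estimated proportion (0.5 maximizes the required sample)
    e: desired margin of error
    N: population size; if given, apply the finite-population correction
    """
    n0 = (z ** 2) * p * (1 - p) / (e ** 2)
    if N is None:
        return math.ceil(n0)
    # Finite-population correction: n = n0 / (1 + (n0 - 1) / N)
    return math.ceil(n0 / (1 + (n0 - 1) / N))

print(cochran_sample_size(1.96, 0.5, 0.05))         # large population → 385
print(cochran_sample_size(1.96, 0.5, 0.05, N=800))  # population of 800 → 260
```

The correction matters in practice: surveying a population of 800 requires far fewer questionnaires than the unconstrained 385.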


How to write a research methodology

A research methodology should include the following components: 3,9

  • Research design —should be selected based on the research question and the data required. Common research designs include experimental, quasi-experimental, correlational, descriptive, and exploratory.
  • Research method —this can be quantitative, qualitative, or mixed-method.
  • Reason for selecting a specific methodology —explain why this methodology is the most suitable to answer your research problem.
  • Research instruments —explain the research instruments you plan to use, mainly referring to the data collection methods such as interviews, surveys, etc. Here as well, a reason should be mentioned for selecting the particular instrument.
  • Sampling —this involves selecting a representative subset of the population being studied.
  • Data collection —involves gathering data using several data collection methods, such as surveys, interviews, etc.
  • Data analysis —describe the data analysis methods you will use once you’ve collected the data.
  • Research limitations —mention any limitations you foresee while conducting your research.
  • Validity and reliability —validity helps identify the accuracy and truthfulness of the findings; reliability refers to the consistency and stability of the results over time and across different conditions.
  • Ethical considerations —research should be conducted ethically. The considerations include obtaining consent from participants, maintaining confidentiality, and addressing conflicts of interest.


Frequently Asked Questions

Q1. What are the key components of research methodology?

A1. A good research methodology has the following key components:

  • Research design
  • Data collection procedures
  • Data analysis methods
  • Ethical considerations

Q2. Why is ethical consideration important in research methodology?

A2. Ethical consideration is important in research methodology to assure readers of the reliability and validity of the study. Researchers must clearly state the ethical norms and standards followed during the conduct of the research and mention whether the study was cleared by an institutional review board. The following 10 points are the important principles related to ethical considerations: 10

  • Participants should not be subjected to harm.
  • Respect for the dignity of participants should be prioritized.
  • Full consent should be obtained from participants before the study.
  • Participants’ privacy should be ensured.
  • Confidentiality of the research data should be ensured.
  • Anonymity of individuals and organizations participating in the research should be maintained.
  • The aims and objectives of the research should not be exaggerated.
  • Affiliations, sources of funding, and any possible conflicts of interest should be declared.
  • Communication in relation to the research should be honest and transparent.
  • Misleading information and biased representation of primary data findings should be avoided.

Q3. What is the difference between methodology and method?

A3. Research methodology is different from a research method, although both terms are often confused. Research methods are the tools used to gather data, while the research methodology provides a framework for how research is planned, conducted, and analyzed. The latter guides researchers in making decisions about the most appropriate methods for their research. Research methods refer to the specific techniques, procedures, and tools used by researchers to collect, analyze, and interpret data, for instance surveys, questionnaires, interviews, etc.

Research methodology is, thus, an integral part of a research study. It helps ensure that you stay on track to meet your research objectives and answer your research questions using the most appropriate data collection and analysis tools based on your research design.


  • Research methodologies. Pfeiffer Library website. Accessed August 15, 2023. https://library.tiffin.edu/researchmethodologies/whatareresearchmethodologies
  • Types of research methodology. Eduvoice website. Accessed August 16, 2023. https://eduvoice.in/types-research-methodology/
  • The basics of research methodology: A key to quality research. Voxco. Accessed August 16, 2023. https://www.voxco.com/blog/what-is-research-methodology/
  • Sampling methods: Types with examples. QuestionPro website. Accessed August 16, 2023. https://www.questionpro.com/blog/types-of-sampling-for-social-research/
  • What is qualitative research? Methods, types, approaches, examples. Researcher.Life blog. Accessed August 15, 2023. https://researcher.life/blog/article/what-is-qualitative-research-methods-types-examples/
  • What is quantitative research? Definition, methods, types, and examples. Researcher.Life blog. Accessed August 15, 2023. https://researcher.life/blog/article/what-is-quantitative-research-types-and-examples/
  • Data analysis in research: Types & methods. QuestionPro website. Accessed August 16, 2023. https://www.questionpro.com/blog/data-analysis-in-research/#Data_analysis_in_qualitative_research
  • Factors to consider while choosing the right research methodology. PhD Monster website. Accessed August 17, 2023. https://www.phdmonster.com/factors-to-consider-while-choosing-the-right-research-methodology/
  • What is research methodology? Research and writing guides. Accessed August 14, 2023. https://paperpile.com/g/what-is-research-methodology/
  • Ethical considerations. Business research methodology website. Accessed August 17, 2023. https://research-methodology.net/research-methodology/ethical-considerations/


Research Design and Methodology

  • First Online: 22 July 2020

  • Marc Sniukas

Part of the book series: Contributions to Management Science

According to Bryman and Bell (2007), the continuum of ontological positions ranges from objectivism on one end to constructionism on the other, while epistemological positions can range from positivism to interpretivism, whereas Easterby-Smith et al. (2012) distinguish between realist, internal realist, relativist and nominalist ontologies and positivist and social constructionist epistemologies. If the term “constructionism” is used to denote an epistemology opposite of positivism, it expresses both the relation to the social world and the knowledge of this world (Bryman and Bell 2007).


http://www.qsrinternational.com/products_nvivo.aspx

http://www.evernote.com

Ambrosini, V., & Bowman, C. (2009). What are dynamic capabilities and are they a useful construct in strategic management? International Journal of Management Reviews, 11 (1), 29–49.


Barreto, I. (2010). Dynamic capabilities: A review of past research and an agenda for the future. Journal of Management, 36 (1), 256–280.

Birks, D., Fernandez, W., Levina, N., & Nasirin, S. (2013). Grounded theory method in information systems research: Its nature, diversity and opportunities. European Journal of Information Systems, 22, 1–8.

Bryman, A., & Bell, E. (2007). Business research methods . Oxford: Oxford University Press.

Charmaz, K. (2006). Constructing grounded theory: A practical guide through qualitative analysis [Online]. Sage. Amazon Kindle eBook. Accessed January 2013, from Amazon.de

Corbin, J., & Strauss, A. (2008). Basics of qualitative research (3rd ed.). London: Sage.

Corley, K. G., & Gioia, D. A. (2004). Identity ambiguity and change in the wake of a corporate spin-off. Administrative Science Quarterly, 49 (2), 173–208.

Danneels, E. (2002). The dynamics of product innovation and firm competences. Strategic Management Journal, 23 (12), 1095–1121.

Dey, I. (2005). Qualitative data analysis: A user-friendly guide for social scientists . London: Routledge.

Easterby-Smith, M., Thorpe, R., & Jackson, P. (2012). Management research [Online]. Sage. Amazon Kindle eBook. Accessed October 2013, from Amazon.de

Egan, T. M. (2002). Grounded theory research and theory building. Advances in Developing Human Resources, 4(3), 277–295.

Eisenhardt, K. M. (1989). Building theories from case study research. The Academy of Management Review, 14(4), 532–550.

Eisenhardt, K. M., & Martin, J. A. (2000). Dynamic capabilities: What are they? Strategic Management Journal, 21 (10/11), 1105–1121.

Gasson, S., & Waters, J. (2013). Using a grounded theory approach to study online collaboration behaviors. European Journal of Information Systems, 22 , 95–118.

Gersick, C. J. G. (1994). Pacing strategic change: The case of a new venture. The Academy of Management Journal, 37 (1), 9–45.

Gioia, D., & Chittipeddi, K. (1991). Sensemaking and sensegiving in strategic change initiation. Strategic Management Journal, 12 (6), 433–448.

Gioia, D. A., Corley, K. G., & Hamilton, A. L. (2013). Seeking qualitative rigor in inductive research: Notes on the Gioia methodology. Organizational Research Methods, 16 (1), 15–31.

Girod-Séville, M., & Perret, V. (2001). Epistemological foundations. In R.-A. Thiétart (Ed.), Doing management research: A comprehensive guide. London: Sage.

Glaser, B. (1992). Emergence vs. forcing: Basics of grounded theory analysis . Mill Valley, CA: Sociology Press.

Glaser, B., & Strauss, A. (1967). The discovery of grounded theory . Chicago: Aldine.

Goulding, C. (2009). Grounded theory perspectives in organizational research. In D. A. Buchanan & A. Bryman (Eds.), The SAGE handbook of organizational research methods . London: Sage.

Graebner, M. E., Martin, J. A., & Roundy, P. T. (2012). Qualitative data: Cooking without a recipe. Strategic Organization, 10 (3), 276–284.

Helfat, C. E., Finkelstein, S., Mitchell, W., Peteraf, M. A., Singh, H., Teece, D. J., & Winter, S. G. (2007). Dynamic capabilities: Understanding strategic change in organizations [Online]. Malden, MA: Blackwell. Amazon Kindle eBook. Accessed January 2011, from Amazon.com

Langley, A. (1999). Strategies for theorizing from process data. Academy of Management Review, 24(4), 691–710.

Langley, A. (2007). Process thinking in strategic organization. Strategic Organization, 5 (3), 271–282.

Langley, A. (2009). Studying processes in and around organizations. In D. A. Buchanan & A. Bryman (Eds.), The SAGE handbook of organizational research methods . London: Sage.

Langley, A., & Truax, J. (1994). A process study of new technology adoption in smaller manufacturing firms. Journal of Management Studies, 31 (5), 619–652.

Lawson, B., & Samson, D. (2001). Developing innovation capability in organisations: A dynamic capabilities approach. International Journal of Innovation Management, 5 (3), 377–400.

Matavire, R., & Brown, I. (2011). Profiling grounded theory approaches in information systems research. European Journal of Information Systems, 22 (1), 119–129.

Miles, M. B., & Huberman, M. A. (1994). Qualitative data analysis (2nd ed.). London: Sage.

Mills, J., Bonner, A., & Francis, K. (2006). Adopting a constructivist approach to grounded theory: Implications for research design. International Journal of Nursing Practice, 12 (1), 8–13.

Orlikowski, W. J. (1993). CASE tools as organizational change: Investigating incremental and radical changes in systems development. MIS Quarterly, 17 (3), 309–340.

Partington, D. (2000). Building grounded theories of management action. British Journal of Management, 11 (2), 91–102.

Pettigrew, A. M. (1992) The character and significance of strategy process research. Strategic Management Journal, 13 (Special Issue: Fundamental Themes in Strategy Process Research), 5–16.

Pratt, M. G., Rockmann, K. W., & Kaufmann, J. B. (2006). Constructing professional identity: The role of work and identity learning cycles. The Academy of Management Journal, 49 (2), 235–262.

Royer, I., & Zarlowski, P. (2007). Research design. In R.-A. Thietart (Ed.), Doing management research: a comprehensive guide . London: Sage.

Salvato, C. (2003). The role of micro-strategies in the engineering of firm evolution. Journal of Management Studies, 40 (1), 83–108.

Shanley, M., & Peteraf, M. (2006). The centrality of process. International Journal of Strategic Change Management, 1 (1/2), 4–19.

Urquhart, C., Lehmann, H., & Myers, M. (2009). Putting the ‘theory’ back into grounded theory: Guidelines for grounded theory studies in information systems. Information Systems Journal, 20 , 357–381.

Van de Ven, A. H. (1992). Suggestions for studying strategy process: A research note. Strategic Management Journal, 13 (5), 169–188.

Wang, C. L., & Ahmed, P. K. (2007). Dynamic capabilities: A review and research agenda. International Journal of Management Reviews, 9 (1), 31–51.

Yin, R. K. (2009). Case study research: Design and methods (Vol. 5, 4th ed.). London: Sage.

Download references



About this chapter

Sniukas, M. (2020). Research Design and Methodology. In: Business Model Innovation as a Dynamic Capability. Contributions to Management Science. Springer, Cham. https://doi.org/10.1007/978-3-030-50100-6_3






Research design: the methodology for interdisciplinary research framework

1 Biometris, Wageningen University and Research, PO Box 16, 6700 AA Wageningen, The Netherlands

Jarl K. Kampen

2 Statua, Dept. of Epidemiology and Medical Statistics, Antwerp University, Venusstraat 35, 2000 Antwerp, Belgium

Abstract

Many of today’s global scientific challenges require the joint involvement of researchers from different disciplinary backgrounds (social sciences, environmental sciences, climatology, medicine, etc.). Such interdisciplinary research teams face many challenges resulting from differences in training and scientific culture. Interdisciplinary education programs are required to train truly interdisciplinary scientists with respect to the critical factor of skills and competences. For that purpose this paper presents the Methodology for Interdisciplinary Research (MIR) framework. The MIR framework was developed to help cross disciplinary borders, especially those between the natural sciences and the social sciences. The framework has been specifically constructed to facilitate the design of interdisciplinary scientific research, and can be applied in an educational program, as a reference for monitoring the phases of interdisciplinary research, and as a tool to design such research in a process approach. It is suitable for research projects of different sizes and levels of complexity, and it allows for a range of combinations of methods (case study, mixed methods, etc.). The different phases of designing interdisciplinary research in the MIR framework are described and illustrated by real-life applications in teaching and research. We further discuss the framework’s utility for research design in landscape architecture and mixed methods research, and provide an outlook on the framework’s potential in inclusive interdisciplinary research and, last but not least, research integrity.

Introduction

Current challenges, e.g., energy, water, food security, one world health and urbanization, involve the interaction between humans and their environment. A (mono)disciplinary approach, be it a psychological, economic or technical one, is too limited to capture any one of these challenges. The study of the interaction between humans and their environment requires knowledge, ideas and research methodology from different disciplines (e.g., ecology or chemistry in the natural sciences, psychology or economics in the social sciences). So collaboration between the natural and social sciences is called for (Walsh et al. 1975 ).

Over the past decades, different forms of collaboration have been distinguished, although the terminology used is diverse and ambiguous. In the present paper, the term interdisciplinary research is used as defined by Aboelela et al. (2007 , p. 341):

any study or group of studies undertaken by scholars from two or more distinct scientific disciplines. The research is based upon a conceptual model that links or integrates theoretical frameworks from those disciplines, uses study design and methodology that is not limited to any one field, and requires the use of perspectives and skills of the involved disciplines throughout multiple phases of the research process.

Scientific disciplines (e.g., ecology, chemistry, biology, psychology, sociology, economics, philosophy, linguistics, etc.) are categorized into distinct scientific cultures: the natural sciences, the social sciences and the humanities (Kagan 2009 ). Interdisciplinary research may involve different disciplines within a single scientific culture, and it can also cross cultural boundaries, as in the study of humans and their environment.

A systematic review of the literature on natural-social science collaboration (Fischer et al. 2011 ) confirmed the general impression that this collaboration is a challenge. The nearly 100 papers in their analytic set mentioned more instances of barriers than of opportunities (72 and 46, respectively). Four critical factors for success or failure in natural-social science collaboration were identified: the paradigms or epistemologies of the current (mono-disciplinary) sciences, the skills and competences of the scientists involved, the institutional context of the research, and the organization of collaborations (Fischer et al. 2011 ). The so-called “paradigm war” between neopositivists and constructivists within the social and behavioral sciences (Onwuegbuzie and Leech 2005 ) may complicate pragmatic collaboration further.

It has been argued that interdisciplinary education programs are required to train truly interdisciplinary scientists with respect to the critical factor of skills and competences (Frischknecht 2000 ), and accordingly some interdisciplinary programs have since been developed (Baker and Little 2006 ; Spelt et al. 2009 ). The overall effect of interdisciplinary programs can be expected to be small, as most programs are mono-disciplinary and based on a single paradigm (positivist-constructivist, qualitative-quantitative; see e.g., Onwuegbuzie and Leech 2005 ). In our methodology teaching, consultancy and research practice with heterogeneous groups of students and staff, we saw that most had received mono-disciplinary training; the minority who had received multidisciplinary training had, with few exceptions, been trained within a single paradigm. During our teaching and consultancy for heterogeneous groups of students and staff aimed at designing interdisciplinary research, we built the Methodology for Interdisciplinary Research (MIR) framework. With the MIR framework, we aspire to contribute to the critical factor of skills and competences (Fischer et al. 2011 ) for collaboration between the social and natural sciences. Note that the scale of the interdisciplinary research projects we have in mind may vary from comparably modest ones (e.g., finding a link between noise-reducing asphalt and quality of life; Vuye et al. 2016 ) to very large projects (finding a link between anthropogenic greenhouse gas emissions, climate change, and food security; IPCC 2015 ).

In the following section of this paper we describe the MIR framework and elaborate on its components. The third section gives two examples of the application of the MIR framework. The paper concludes with a discussion of the MIR framework in the broader contexts of mixed methods research, inclusive research, and other promising strains of research.

The methodology in interdisciplinary research framework

Research as a process in the Methodology in Interdisciplinary Research framework

The Methodology for Interdisciplinary Research (MIR) framework was built on the process approach (Kumar 1999 ), because in the process approach the research question or hypothesis guides all decisions in the various stages of research. This helps the MIR framework put the common goal of the researchers at the center, instead of the diversity of their respective backgrounds. The MIR framework also introduces an agenda: the research team needs to carefully think through the different parts of the design of their study before starting its execution (Fig.  1 ). First, the team discusses the conceptual design of their study, which contains the ‘why’ and ‘what’ of the research. Second, the team discusses the technical design of the study, which contains the ‘how’ of the research. Only after the team agrees that the complete research design is sufficiently crystallized does the execution of the work (including fieldwork) start.

[Fig. 1: The Methodology of Interdisciplinary Research framework]

Whereas the conceptual and technical designs are by definition interdisciplinary team work, the respective team members may do their (mono)disciplinary parts of the fieldwork and data analysis on a modular basis (see Bruns et al. 2017 : p. 21). Finally, when all evidence is collected, an interdisciplinary synthesis of the analyses follows, whose conclusions are input for the final report. This implies that the MIR framework allows for a range of scales of research projects, e.g., a mixed methods project and its smaller qualitative and quantitative modules, or a multi-national sustainability project and its national sociological, economic and ecological modules.

The conceptual design

Interdisciplinary research design starts with the “conceptual design”, which addresses the ‘why’ and ‘what’ of a research project at a conceptual level to ascertain the common goals pivotal to interdisciplinary collaboration (Fischer et al. 2011 ). The conceptual design mostly includes activities such as thinking, exchanging interdisciplinary knowledge, reading and discussing. The product of the conceptual design is called the “conceptual framework”, which comprises the research objective (what is to be achieved by the research), the theory or theories that are central in the research project, the research questions (what knowledge is to be produced), and the (partial) operationalization of the constructs and concepts that will be measured or recorded during execution. While the members of the interdisciplinary team and the commissioner of the research must reach a consensus about the research objective, the ‘why’, the focus in research design must be the production of the knowledge required to achieve that objective, the ‘what’.

With respect to the ‘why’ of a research project, an interdisciplinary team typically starts with a general aim as requested by the commissioner or funding agency, and a set of theories to formulate a research objective. This role of theory is not always obvious to students from the natural sciences, who tend to think in terms of ‘models’ with directly observable variables. On the other hand, students from the social sciences tend to think in theories with little attention to observable variables. In the MIR framework, models as simplified descriptions or explanations of what is studied in the natural sciences play the same role in informing research design, raising research questions, and informing how a concept is understood, as do theories in social science.

Research questions concern concepts, i.e. general notions or ideas based on theory or common sense that are multifaceted and not directly visible or measurable. For example, neither food security (with its many different facets) nor a person’s attitude towards food storage may be directly observed. The operationalization of concepts, the transformation of concepts into observable indicators, in interdisciplinary research requires multiple steps, each informed by theory. For instance, in line with particular theoretical frameworks, sustainability and food security may be seen as the composite of a social, an economic and an ecological dimension (e.g., Godfray et al. 2010 ).

As the concept of interest is multi-disciplinary and multi-dimensional, the interdisciplinary team will need to read, discuss and decide how these dimensions and their indicators are weighted to measure the composite interdisciplinary concept and obtain the required interdisciplinary measurements. The resulting measure or measures for the interdisciplinary concept may be of the nominal, ordinal, interval or ratio level, or a combination thereof. This operationalization procedure is known as the portfolio approach to widely defined measurements (Tobi 2014 ). Only after the research team has finalized the operationalization of the concepts under study can the research questions and hypotheses be made operational. For example, a module with descriptive research questions may now be turned into an operational one like: what are the means and variances of X1, X2, and X3 in a given population? A causal research question may take the form: is X (a composite of X1, X2 and X3) a plausible cause for the presence or absence of Y? A typical qualitative module could study: how do people talk about X1, X2 and X3 in their everyday lives?
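The weighting step described above can be sketched as a simple weighted composite. The following minimal Python illustration uses hypothetical dimension names, scores and weights (none of these numbers come from the paper); a real team would derive them from theory and discussion.

```python
# Sketch: operationalising a multi-dimensional concept (e.g., food
# security) as a weighted composite of a social, an economic and an
# ecological dimension. All names and values are hypothetical.

def composite_score(indicators, weights):
    """Weighted sum of dimension scores; weights should sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(indicators[dim] * w for dim, w in weights.items())

# Hypothetical dimension scores, each already scaled to the 0-1 range
indicators = {"social": 0.62, "economic": 0.55, "ecological": 0.71}
# Weights the interdisciplinary team agreed on after discussion
weights = {"social": 0.4, "economic": 0.3, "ecological": 0.3}

print(round(composite_score(indicators, weights), 3))  # 0.626
```

The same portfolio idea carries over to ordinal or nominal indicators, where the "composite" may be a profile of several measures rather than a single number.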

The technical design

Members of an interdisciplinary team have usually had different training with respect to research methods, which makes discussing and deciding on the technical design more challenging, but also potentially more creative, than in a mono-disciplinary team. The technical design addresses the issues of ‘how, where and when will research units be studied’ (study design), ‘how will measurement proceed’ (instrument selection or design), ‘how and how many research units will be recruited’ (sampling plan), and ‘how will the collected data be analyzed and synthesized’ (analysis plan). The MIR framework provides the team with a set of topics and their relationships to one another and to generally accepted quality criteria (see Fig.  1 ), which helps in designing this part of the project.

Interdisciplinary teams need to be pragmatic: the research questions agreed on, rather than traditional ‘pet’ approaches, guide decisions on the data collection set-up (e.g., a cross-sectional study of inhabitants of a region, a laboratory experiment, a cohort study, a case control study, etc.), the so-called “study design” (e.g., Kumar 2014 ; De Vaus 2001 ; Adler and Clark 2011 ; Tobi and van den Brink 2017 ). The typical study design for descriptive research questions and questions about associations is the cross-sectional design. Longitudinal study designs are required to investigate development over time, and cause-effect relationships are ideally studied in experiments (e.g., Kumar 2014 ; Shipley 2016 ). Phenomenological questions concern a phenomenon about which little is known and which has to be studied in the environment where it takes place, which calls for a case study design (e.g., Adler and Clark 2011 : p. 178). For each module, the study design is to be further specified by the number of data collection waves, the level of control by the researcher and its reference period (e.g., Kumar 2014 ) to ensure the team’s common understanding.
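The question-type heuristics above can be caricatured as a lookup table. This is a deliberate simplification for illustration (the categories and default message are our own shorthand, not an exhaustive decision rule from the paper):

```python
# Sketch of the mapping from research-question type to a typical
# study design, as discussed in the text. Simplified for illustration.
STUDY_DESIGNS = {
    "descriptive": "cross-sectional",
    "association": "cross-sectional",
    "development over time": "longitudinal",
    "cause-effect": "experiment",
    "phenomenological": "case study",
}

def suggest_design(question_type):
    """Return a candidate study design, or defer to team discussion."""
    return STUDY_DESIGNS.get(question_type, "discuss within the team")

print(suggest_design("cause-effect"))  # experiment
```

In practice the mapping is a starting point for team discussion, not a substitute for it: feasibility and ethics may force the team to rephrase the question instead.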

Then, decisions must be made about the way data are to be collected, e.g., by means of certified instruments, observation, interviews, questionnaires, queries on existing data bases, or a combination of these. It is especially important to discuss the role of the observer (researcher), as this is often a source of misunderstanding in interdisciplinary teams. In the natural sciences, the observer is usually considered a neutral outsider when reading a standardized measurement instrument (e.g., a pyranometer to measure incoming solar radiation). In contrast, in the social sciences, the observer may be (part of) the measurement instrument, for example in participant observation or when doing in-depth interviews. After all, in participant observation the researcher observes from a member’s perspective and influences what is observed owing to the researcher’s participation (Flick 2006 : p. 220). Similarly in interviews, by which we mean “a conversation that has a structure and a purpose determined by the one party—the interviewer” (Kvale 2007 : p. 7), the interviewer and the interviewee are part of the measurement instrument (Kvale and Brinkmann 2009 : p. 2). In on-line and mail questionnaires the interviewer is eliminated as part of the instrument by standardizing the questions and answer options. Queries on existing data bases refer to the use of secondary data or secondary analysis. Different disciplines tend to use different bibliographic data bases (e.g., CAB Abstracts, ABI/INFORM or ERIC) and different data repositories (e.g., the European Social Survey at europeansocialsurvey.org or the International Council for Science data repository hosted by www.pangaea.de ).

Depending on whether or not the available, existing, measurement instruments tally with the interdisciplinary operationalisations from the conceptual design, the research team may or may not need to design instruments. Note that in some cases the social scientists’ instinct may be to rely on a questionnaire whereas the collaboration with another discipline may result in more objective possibilities (e.g., compare asking people about what they do with surplus medication, versus measuring chemical components from their input into the sewer system). Instrument design may take on different forms, such as the design of a device (e.g., pyranometer), a questionnaire (Dillman 2007 ) or a part thereof (e.g., a scale see DeVellis 2012 ; Danner et al. 2016 ), an interview guide with topics or questions for the interviewees, or a data extraction form in the context of secondary analysis and literature review (e.g., the Cochrane Collaboration aiming at health and medical sciences or the Campbell Collaboration aiming at evidence based policies).

Researchers from different disciplines are inclined to think of different research objects (e.g., animals, humans or plots), which is where the (specific) research questions come in as these identify the (possibly different) research objects unambiguously. In general, research questions that aim at making an inventory, whether it is an inventory of biodiversity or of lodging, call for a random sampling design. Both in the biodiversity and lodging example, one may opt for random sampling of geographic areas by means of a list of coordinates. Studies that aim to explain a particular phenomenon in a particular context would call for a purposive sampling design (non-random selection). Because studies of biodiversity and housing obey the same laws in terms of appropriate sampling design for similar research questions, individual students and researchers are sensitized to commonalities of their respective (mono)disciplines. For example, a research team interested in the effects of landslides on a socio-ecological system may select for their study one village that suffered from landslides and one village that did not suffer from landslides that have other characteristics in common (e.g., kind of soil, land use, land property legislation, family structure, income distribution, et cetera).
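Random sampling of geographic areas by means of a list of coordinates, as mentioned above, can be sketched in a few lines. The bounding box and sample size below are hypothetical, and a real survey would additionally handle map projection and possibly stratification:

```python
# Sketch: simple random sample of coordinate points within a
# rectangular study area. Seeded for reproducibility.
import random

def sample_coordinates(n, lat_range, lon_range, seed=42):
    """Draw n (lat, lon) points uniformly within the given ranges."""
    rng = random.Random(seed)
    return [(rng.uniform(*lat_range), rng.uniform(*lon_range))
            for _ in range(n)]

# Hypothetical bounding box for a study region
points = sample_coordinates(5, (51.0, 52.0), (5.0, 6.0))
for lat, lon in points:
    print(f"{lat:.4f}, {lon:.4f}")
```

The same routine serves a biodiversity inventory and a lodging inventory alike, which is exactly the commonality of sampling design across disciplines that the text points out.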

The data analysis plan describes how data will be analysed, for each of the separate modules and for the project at large. In the context of a multi-disciplinary quantitative research project, the data analysis plan will list the intended uni-, bi- and multivariate analyses such as measures for distributions (e.g., means and variances), measures for association (e.g., Pearson Chi square or Kendall Tau) and data reduction and modelling techniques (e.g., factor analysis and multiple linear regression or structural equation modelling) for each of the research modules using the data collected. When applicable, it will describe interim analyses and follow-up rules. In addition to the plans at modular level, the data analysis plan must describe how the input from the separate modules, i.e. different analyses, will be synthesized to answer the overall research question. In case of mixed methods research, the particular type of mixed methods design chosen describes how, when, and to what extent the team will synthesize the results from the different modules.
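To make the idea of pre-specifying analyses concrete, here is a toy stand-in for two items such a plan might list: univariate summaries (means, variances) and one bivariate association (Kendall's tau, implemented here in its simplest tau-a form without tie correction). The data are invented; a real plan would name the exact tests and software.

```python
# Sketch of pre-specified analyses from a quantitative analysis plan:
# univariate summaries and a simple rank-based association measure.
from statistics import mean, variance
from itertools import combinations

def kendall_tau(x, y):
    """Kendall's tau-a for paired observations (no tie correction)."""
    concordant = discordant = 0
    for (x1, y1), (x2, y2) in combinations(zip(x, y), 2):
        s = (x1 - x2) * (y1 - y2)
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    n = len(x)
    return (concordant - discordant) / (n * (n - 1) / 2)

x1 = [3, 5, 2, 6, 4]   # hypothetical indicator scores
x2 = [2, 4, 1, 5, 3]   # hypothetical second indicator, same rank order

print(mean(x1), variance(x1))  # univariate: 4 2.5
print(kendall_tau(x1, x2))     # bivariate association: 1.0
```

Writing the plan down at this level of detail, module by module, is what lets the team later synthesize results across disciplines instead of producing disconnected chapters.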

Unfortunately, in our experience, when some of the research modules rely on a qualitative approach, teams tend to refrain from designing a data analysis plan before starting the field work. While the absence of a data analysis plan may be regarded as acceptable in fields that rely exclusively on qualitative research (e.g., ethnography), failure to communicate how data will be analysed and what potential evidence will be produced deals a deathblow to interdisciplinarity. For many researchers not familiar with qualitative research, the black box presented as “qualitative data analysis” is a big hurdle, and a transparent and systematic plan is a sine qua non for any scientific collaboration. The absence of a data analysis plan for all modules results in an absence of synthesis of the perspectives and skills of the disciplines involved, and in separate (disciplinary) research papers or separate chapters in the research report without an answer to the overall research question. So, although researchers may find it hard to write the data analysis plan for qualitative data, it is pivotal in interdisciplinary research teams.

Similar to the quantitative data analysis plan, the qualitative data analysis plan describes how the researcher will get acquainted with the data collected (e.g., by constructing a narrative summary per interviewee or a paired comparison of essays). Additionally, the rules for deciding on data saturation need to be presented. Finally, the types of qualitative analyses are to be described in the data analysis plan. Because there is little or no standardized terminology in qualitative data analysis, it is important to include a precise description as well as references to the works that describe the intended method (e.g., domain analysis as described by Spradley 1979 ; or grounded theory by means of constant comparison as described by Boeije 2009 ).

Integration

To benefit optimally from the research being interdisciplinary, the modules need to be brought together in the integration stage. The modules may be mono- or interdisciplinary and may rely on quantitative, qualitative or mixed methods approaches. The MIR framework thus fits the view that distinguishes three multimethod approaches (quali–quali, quanti–quanti, and quali–quanti).

Although the MIR framework was not designed with the intention of promoting mixed methods research, it is suitable for designing mixed methods research as the kind of research that calls for both quantitative and qualitative components (Creswell and Plano Clark 2011 ). Indeed, just like the pioneers in mixed methods research (Creswell and Plano Clark 2011 : p. 2), the MIR framework deconstructs the package deals of paradigm and data to be collected. The synthesis of the different mono- or interdisciplinary modules may benefit from research done on “the unique challenges and possibilities of integration of qualitative and quantitative approaches” (Fetters and Molina-Azorin 2017 : p. 5). We distinguish (sub)sets of modules designed as convergent, sequential or embedded (adapted from mixed methods design, e.g., Creswell and Plano Clark 2011 : pp. 69–70). Convergent modules, whether mono- or interdisciplinary, may be done in parallel and are integrated after completion. Sequential modules are done after one another, and the first modules inform the later ones (this includes transformative and multiphase mixed methods designs). Embedded modules are intertwined: modules depend on one another for data collection and analysis, and synthesis may be planned both during and after completion of the embedded modules.
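One lightweight way to make the module structure explicit before fieldwork is to record each module and its dependencies. The sketch below is our own illustration (module names are hypothetical, and it deliberately collapses the convergent/sequential distinction to a simple dependency check, leaving embedded designs aside):

```python
# Sketch: recording modules and inferring a (simplified) integration
# pattern. Modules with no dependencies can run in parallel
# (convergent); dependencies imply a sequential design.
from dataclasses import dataclass, field

@dataclass
class Module:
    name: str
    approach: str                      # "quantitative" or "qualitative"
    depends_on: list = field(default_factory=list)

def integration_pattern(modules):
    """Classify a set of modules as convergent or sequential."""
    if any(m.depends_on for m in modules):
        return "sequential"
    return "convergent"

plan = [Module("household survey", "quantitative"),
        Module("soil sampling", "quantitative"),
        Module("follow-up interviews", "qualitative",
               depends_on=["household survey"])]
print(integration_pattern(plan))  # sequential
```

Even this crude bookkeeping forces the team to state, per module, what feeds into what, which is the planning discipline the integration stage depends on.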

Scientific quality and ethical considerations in the design of interdisciplinary research

A minimum set of jargon related to the assessment of scientific quality of research (e.g., triangulation, validity, reliability, saturation, etc.) can be found scattered in Fig.  1 . Some terms are reserved by particular paradigms, others may be seen in several paradigms with more or less subtle differences in meaning. In the latter case, it is important that team members are prepared to explain and share ownership of the term and respect the different meanings. By paying explicit attention to the quality concepts, researchers from different disciplines learn to appreciate each other’s concerns for good quality research and recognize commonalities. For example, the team may discuss measurement validity of both a standardized quantitative instrument and that of an interview and discover that the calibration of the machine serves a similar purpose as the confirmation of the guarantee of anonymity at the start of an interview.

Throughout the process of research design, ethics require explicit discussion among all stakeholders in the project. Ethical issues run through all components in the MIR framework in Fig.  1 . Where social and medical scientists may be more sensitive to ethical issues related to humans (e.g., the 1979 Belmont Report criteria of beneficence, justice, and respect), others may be more sensitive to issues related to animal welfare, ecology, legislation, the funding agency (e.g., implications for policy), data and information sharing (e.g., open access publishing), sloppy research practices, or long term consequences of the research. This is why ethics are an issue for the entire interdisciplinary team and cannot be discussed on project module level only.

The MIR framework in practice: two examples

Teaching research methodology to heterogeneous groups of students: institutional context and background of the MIR framework

Wageningen University and Research (WUR) advocates in its teaching and research an interdisciplinary approach to the study of global issues, in line with the motto “To explore the potential of nature to improve the quality of life.” Wageningen University’s student population is multidisciplinary and international (e.g., Tobi and Kampen 2013 ). Traditionally, this challenge of diversity in one classroom is met by covering a breadth of methodological topics and examples from different disciplines. However, when students of various programmes received methodological education in mixed classes, students of some disciplines would regard the methods and techniques of the other disciplines with disinterest or even disdain. Different disciplines, especially those from the qualitative and quantitative traditions in the social sciences (Onwuegbuzie and Leech 2005 : p. 273), claim certain study designs and methods of data collection and analysis as their territory, a claim reflected in many textbooks. We found that students from a qualitative tradition would not be interested in, and would not even study, content like the design of experiments and quantitative data collection, while students from a quantitative tradition would ignore case study design and qualitative data collection. These students assumed they did not need any knowledge about ‘the other tradition’ for their future careers, despite the call for interdisciplinarity.

To enhance interdisciplinarity, WUR provides an MSc course, mandatory for most students, in which multi-disciplinary teams do research for a commissioner. Students reported difficulties similar to the ones found in the literature: miscommunication due to talking different scientific languages, and feelings of distrust and disrespect due to prejudice. This suggested that research methodology courses ought to help prepare for interdisciplinary collaboration by introducing a single methodological framework that (1) creates sensitivity to the benefits and challenges of interdisciplinary research by means of a common vocabulary and fosters respect for other disciplines, (2) starts from the research questions as pivotal in decision making on research methods, instead of tradition or ontology, and (3) allows available methodologies and methods to be potentially applicable to any scientific research problem.

Teaching with MIR—the conceptual framework

As a first step, we replaced our textbooks with ones that reject the idea that any scientific tradition has exclusive ownership of any methodological approach or method. The MIR framework further guides our methodology teaching in two ways. First, it presents a logical sequence of topics (first the conceptual design, then the technical design; first the research question(s) or hypotheses, then the study design; etc.). Second, it allows for a conceptual separation of topics (e.g., study design from instrument design). Educational programmes at Wageningen University and Research consistently stress the vital importance of good research design. In fact, 50% of the mark in most BSc and MSc courses in research methodology is based on the assessment of a research proposal that students design in small (2–4 students) and heterogeneous (discipline, gender and nationality) groups. The research proposal must describe a project that can be executed in practice and whose limitations (measurement, internal, and external validity) are carefully discussed.

Groups start by selecting a general research topic. Together they discuss courses they previously attended across a range of programmes to identify personal and group interests, with the aim of reaching an initial research objective and a general research question as input for the conceptual design. Often, their initial research objective and research question are too broad to be researchable (e.g., Kumar 2014: p. 64; Adler and Clark 2011: p. 71). In plenary sessions, the basics of critically assessing empirical research papers are taught, with special attention to the ‘what’ and ‘why’ sections of research papers. During tutorials, students generate research questions until the group agrees on a research objective, with one general research question that breaks down into a small set of specific research questions. Each of the specific research questions may stem from a different discipline, whereas answering the general research question requires integrating the answers to all specific research questions.

The group then identifies the key concepts in their research questions, while exchanging thoughts on possible attributes based on what they have learnt from previous courses (theories) and literature. When doing so they may judge the research question as too broad, in which case they will turn to the question strategies toolbox again. Once they agree on the formulation of the research questions and the choice of concepts, tasks are divided. In general, each student turns to the literature he/she is most familiar with or interested in, for the operationalization of the concept into measurable attributes and writes a paragraph or two about it. In the next meeting, the groups read and discuss the input and decide on the set-up and division of tasks with respect to the technical design.

Teaching with MIR—the technical framework

The technical part of research design distinguishes between study design, instrument design, sampling design, and the data analysis plan. In class, we first present students with a range of study designs (cross sectional, experimental, etc.). Student groups select an appropriate study design by comparing the demands made by the research questions with criteria for internal validity. When a (specific) research question calls for a study design that is not seen as practically feasible or ethically possible, they will rephrase the research question until the demands of the research question tally with the characteristics of at least one ethical, feasible and internally valid study design.

While following plenary sessions in which different random and non-random sampling or selection strategies are taught, groups start working on their sampling design. The groups make two decisions informed by their research question: the population(s) of research units, and the requirements of the sampling strategy for each population. Like many other aspects of research design, this can be an iterative process. For example, suppose the research question mentioned “local policy makers,” which is too vague for a sampling design. Then the decision may be to limit the study to “policy makers at the municipality level in the Netherlands” and adapt the general and the specific research questions accordingly. Next, the group identifies whether the sample design needs to focus on diversity (e.g., when the objective is to make an inventory of possible local policies), representativeness (e.g., when the objective is to estimate the prevalence of types of local policies), or people with particular information (e.g., when the objective is to study people having experience with a given local policy). When a sample has to be representative, the students must produce an assessment of external validity, whereas when the aim is to map diversity the students must discuss possible ways of source triangulation. Finally, in conjunction with the data analysis plan, students decide on the sample size and/or the saturation criteria.
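One of the sampling strategies taught in these sessions, stratified random sampling, can be illustrated with a minimal sketch. The population, the `region` stratum, and the per-stratum sample size below are all hypothetical, chosen only to mirror the “policy makers at the municipality level” example.

```python
import random

def stratified_sample(units, strata_key, n_per_stratum, seed=0):
    """Draw a stratified random sample: group units by stratum,
    then sample without replacement within each stratum."""
    rng = random.Random(seed)
    strata = {}
    for unit in units:
        strata.setdefault(strata_key(unit), []).append(unit)
    sample = []
    for stratum, members in sorted(strata.items()):
        k = min(n_per_stratum, len(members))
        sample.extend(rng.sample(members, k))
    return sample

# Hypothetical population of municipal policy makers, stratified by region
population = [{"id": i, "region": r}
              for i, r in enumerate(["north", "south", "east", "west"] * 25)]
sample = stratified_sample(population, lambda u: u["region"], n_per_stratum=5)
print(len(sample))  # 20: five units from each of the four regions
```

A design aimed at representativeness would instead make the per-stratum sizes proportional to the stratum populations; the mechanism stays the same.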

When the group has agreed on their population(s) and the strategy for recruiting research units, the next step is to finalize the technical aspects of operationalisation, i.e., addressing the issue of exactly how information will be extracted from the research units. Depending on what is practically feasible in terms of measurement, the data collection instrument chosen may be a standardised one (e.g., a spectrograph, a questionnaire) or a less standardised one (e.g., semi-structured interviews, visual inspection). The students have to discuss the possibilities of method triangulation, and explain the possible weaknesses of their data collection plan in terms of measurement validity and reliability.

Recent developments

At present, little attention is paid to the data analysis plan and to procedures for synthesis and reporting, because the programmes differ in the data analysis courses they offer, and because execution of the research is not part of the BSc and MSc methodology courses. Recently, we designed a course for an interdisciplinary BSc programme in which the research question is put central in learning and deciding on statistics and qualitative data analysis. Nonetheless, during the past years the number of methodology courses for graduate students that support the MIR framework has expanded, e.g., a course “From Topic to Proposal”; separate training modules on questionnaire construction, interviewing, and observation; and optional courses on quantitative and qualitative data analysis. These courses are open to (and attended by) PhD students regardless of their programme. In Flanders (Belgium), the Flemish Training Network for Statistics and Methodology (FLAMES) has for the last four years successfully applied the approach outlined in Fig. 1 in its courses on research design and data collection methods. The division of the research process into a conceptual design, technical design, operationalisation, analysis plan, and sampling plan has proved appealing to students of disciplines ranging from linguistics to bioengineering.

Researching with MIR: noise reducing asphalt layers and quality of life

Research objective and research question.

This example of the application of the MIR framework comes from a study of the effects of “noise reducing asphalt layers” on quality of life (Vuye et al. 2016), a project commissioned by the City of Antwerp in 2015 and executed by a multidisciplinary research team of Antwerp University (Belgium). The principal researcher was an engineer from the Faculty of Applied Engineering (dept. Construction), supported by two researchers from the Faculty of Medicine and Health Sciences (dept. of Epidemiology and Social Statistics), one with a background in qualitative and one with a background in quantitative research methods. A number of meetings were held in which the research team and the commissioners discussed the research objective (the ‘what’ and ‘why’). The research objective was in part dictated by the European Noise Directive 2002/49/EC, which requires all EU member states to draft noise action plans. The challenge in this study was to produce evidence of a link between the acoustic and mechanical properties of different types of asphalt and the quality of life of people living in the vicinity of the treated roads. While literature was available about the effects of road surface on sound, and other studies had examined the link between noise and health, no study was found that produced evidence simultaneously about noise levels of roads and quality of life. The team therefore made the hypothesis that traffic noise reduction has a beneficial effect on residents’ quality of life the central research hypothesis. The general research question was: “to what extent does the placing of noise reducing asphalt layers increase the quality of life of the residents?”

Study design

To test the effect of different types of asphalt, a pretest–posttest experiment was initially designed, and later expanded with several additional experimental (change of road surface) and control (no change of road surface) groups. The research team gradually became aware that quality of life may not be instantly affected by lower noise levels, and that a time lag is involved. A second posttest was added to follow up on this effect, although it could only be implemented at a selection of the experimental sites.
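The logic of a pretest–posttest design with control groups can be sketched as a difference-in-differences comparison: the change in the treated sites minus the change in the control sites, netting out trends common to both. The site-level scores below are invented for illustration and are not data from the Antwerp study.

```python
def diff_in_diff(treat_pre, treat_post, control_pre, control_post):
    """Difference-in-differences estimate: the pre-to-post change in the
    treated group minus the pre-to-post change in the control group."""
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(treat_post) - mean(treat_pre)) - (mean(control_post) - mean(control_pre))

# Hypothetical mean nuisance scores (0-10) per site, before and after resurfacing
treat_pre, treat_post = [6.0, 7.0, 6.5], [4.5, 5.0, 5.5]
control_pre, control_post = [6.2, 6.8], [6.0, 6.9]
effect = diff_in_diff(treat_pre, treat_post, control_pre, control_post)
print(round(effect, 2))  # -1.45: nuisance dropped by 1.45 points beyond the control trend
```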

Instrument selection and design

Sound pressure levels were measured with an ISO-standardized procedure called the Statistical Pass-By (SPB) method; a detailed description of the method is given in Vuye et al. (2016). No such objective procedure is available for measuring quality of life, which can only be assessed through self-reports of the residents. Some time was needed for the research team to accept that measuring a multidimensional concept like quality of life is more complicated than simply having people rate their “quality of life” on a 10-point scale. For instance, questions had to be phrased in a way that did not give away the purpose of the research (to avoid a Hawthorne effect), leading to the inclusion of questions about more nuisances than traffic noise alone. This led to the design of a self-administered questionnaire that combined questions from the Flanders Survey on Living Environment (Departement Leefmilieu, Natuur & Energie 2013) with newly designed questions. Among other things, the questionnaire probed for experienced nuisance from sound, quality of sleep, effort to concentrate, effort to have a conversation inside or outside the home, physical complaints such as headaches, etc.
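Scoring such a multidimensional questionnaire typically involves combining Likert items into a composite, reverse-coding items where a high answer indicates better rather than worse quality of life. The sketch below is a hypothetical illustration; the item names and the 1–5 scale are assumptions, not the instrument actually used in the study.

```python
def composite_score(responses, reverse_items=(), scale_max=5):
    """Average a respondent's Likert items into one composite score,
    reverse-coding items where a high answer means *better* quality of life."""
    scored = []
    for item, value in responses.items():
        if item in reverse_items:
            value = scale_max + 1 - value   # e.g. 5 -> 1 on a 1-5 scale
        scored.append(value)
    return sum(scored) / len(scored)

# Hypothetical items: higher = more nuisance, except sleep quality (higher = better)
answers = {"noise_nuisance": 4, "headaches": 2, "conversation_effort": 3, "sleep_quality": 4}
print(composite_score(answers, reverse_items={"sleep_quality"}))  # 2.75
```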

Sampling design

The selected sites needed to accommodate both types of measurements: traffic noise and residents’ quality of life. This was a complicating factor that required several rounds of deliberation. While countrywide only certain roads were available for changing the road surface, these roads had to be mutually comparable in terms of the composition of the population, type of residential area (e.g., reports from the top floor of a tall apartment building cannot be compared to those at ground level), average volume of traffic, vicinity of hospitals, railroads and airports, etc. At the level of roads, therefore, targeted sampling was applied, whereas at the level of residents the aim was to realize a census of all households within a given perimeter around the treated road surfaces. Considerations about the reliability of the applied instruments guided decisions with respect to sampling. While the measurements of the SPB method were sufficiently reliable to allow for relatively few measurements, the questionnaire suffered from considerable nonresponse, which hampered statistical power. It was therefore decided to increase the power of the study by adding control groups in areas where the road surface was not replaced. This way, detecting an effect of the intervention did not depend solely on the turnout of the pre- and the posttest.
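The trade-off between nonresponse and statistical power can be made concrete with a standard sample-size calculation for comparing two group means, using the normal approximation n = 2((z₁₋α/₂ + z_power)/d)². The effect size and response rate below are hypothetical planning figures, not values from the study.

```python
import math
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate n per group to detect a standardized mean difference
    (Cohen's d) with a two-sided test: n = 2 * ((z_{1-a/2} + z_{power}) / d)**2."""
    z = NormalDist().inv_cdf
    return math.ceil(2 * ((z(1 - alpha / 2) + z(power)) / effect_size) ** 2)

# Hypothetical planning figures: a medium effect (d = 0.5) at 5% alpha, 80% power
print(n_per_group(0.5))                   # 63 respondents per group before nonresponse
print(math.ceil(n_per_group(0.5) / 0.4))  # 158 invitations, assuming a 40% response rate
```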

Data analysis plan

The statistical analysis had to account for the fact that data were collected at two different levels: the level of the residents filling out the questionnaires, and the level of the roads whose surface was changed. Because survey participation was confidential, results of the pre- and posttest could only be compared at the aggregate (street) level. The analysis had to control for confounding variables (e.g., sample composition, variety in traffic volume, etc.), experimental factors (varieties in experimental conditions, and controls), and non-normal dependent variables. The statistical model appropriate for the analysis of such data is a Generalised Linear Mixed Model.
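Fitting a Generalised Linear Mixed Model requires specialized statistical software, but the aggregation step that confidentiality imposed can be sketched simply: individual questionnaire scores are averaged per street before the pre- and posttest are compared. All street names and scores below are invented for illustration.

```python
from collections import defaultdict

def street_means(responses):
    """Aggregate individual questionnaire scores to street level, since
    confidential surveys allow pre/post comparison only in the aggregate."""
    by_street = defaultdict(list)
    for street, score in responses:
        by_street[street].append(score)
    return {street: sum(s) / len(s) for street, s in by_street.items()}

# Hypothetical individual nuisance scores per street, pre- and post-intervention
pre = [("Main St", 6), ("Main St", 7), ("Oak Ave", 5), ("Oak Ave", 6)]
post = [("Main St", 5), ("Main St", 4), ("Oak Ave", 6), ("Oak Ave", 5)]
change = {s: street_means(post)[s] - street_means(pre)[s] for s in street_means(pre)}
print(change)  # {'Main St': -2.0, 'Oak Ave': 0.0}
```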

Data were collected during the course of 2015, 2016 and 2017 and are awaiting final analysis in Spring 2017. Intermediate analyses resulted in several MSc theses, conference presentations, and working papers that reported on parts of the research.

In this paper we presented the Methodology in Interdisciplinary Research (MIR) framework that we developed over the past decade, building on our experience as lecturers, consultants and researchers. The MIR framework recognizes research methodology and methods as important content within the critical factor of skills and competences. It approaches research and collaboration as a process that needs to be designed with the sole purpose of answering the general research question. For the conceptual design, the team members have to discuss and agree on the objective of their communal efforts without squeezing it into one single discipline and, thus, ignoring complexity. The specific research questions, once formulated, contribute to (self-)respect in collaboration, as they represent and stand witness to the need for interdisciplinarity. In the technical design, different parts were distinguished to stimulate researchers to think and design research outside their respective disciplinary boxes and consider, for example, an experimental design with qualitative data collection, or a case study design based on quantitative information.

In our teaching and consultancy, we first developed the MIR framework for interdisciplinarity across the social sciences, economics, and the health and environmental sciences. The framework was then challenged to include research in the design discipline of landscape architecture. What characterizes research in landscape architecture and other design disciplines is that the design product as well as the design process may be the object of study. Lenzholder et al. (2017) therefore distinguish three kinds of research in landscape architecture. The first kind, “research into design”, studies the design product post hoc, and the MIR framework suits the interdisciplinary study of such a product. In contrast, “research for design” generates knowledge that feeds into the noun and the verb ‘design’, which means it precedes the design(ing). The third kind, “research through design(ing)”, employs designing as a research method. At first, just like Deming and Swaffield (2011), we were a bit skeptical about “designing” as a research method. Lenzholder et al. (2017) posit that the meaning of research through design has evolved through a (neo)positivist, constructivist and transformative paradigm to include a pragmatic stance that resembles the one assumed in the MIR framework. We learned that, because landscape architecture is such an interdisciplinary field, the process approach and the distinction between a conceptual and a technical research design were considered very helpful and embraced by researchers in landscape architecture (Tobi and van den Brink 2017).

Mixed methods research (MMR) has been used to study topics as diverse as education (e.g., Powell et al. 2008), environmental management (e.g., Molina-Azorin and Lopez-Gamero 2016), health psychology (e.g., Bishop 2015) and information systems (e.g., Venkatesh et al. 2013). Nonetheless, the MIR framework is the first to put MMR in the context of integrating disciplines beyond social inquiry (Greene 2008). Splitting the research into modules stimulates the identification and recognition of the contribution of both distinct and collaborating disciplines, irrespective of whether they contribute qualitative and/or quantitative research to the interdisciplinary research design. As mentioned in Sect. 2.4, the integration of the different research modules in one interdisciplinary project design may follow one of the mixed methods designs. For example, we witnessed on several occasions the integration of the social and health sciences in interdisciplinary teams opting for sequential modules in a sequential exploratory mixed methods fashion (e.g., Adamson 2005: 234). In sustainability science research, we have seen the design of concurrent modules for a concurrent nested mixed methods strategy (ibid.) in research integrating the social and natural sciences and economics.

The limitations of the MIR framework are those of any kind of collaboration: it cannot work wonders in the absence of awareness of its necessity, and it requires the willingness to work, learn, and research together. We developed the MIR framework in and alongside our own teaching, consultancy and research; it has not been formally evaluated or compared in an experiment with teaching, consultancy and research based on, for example, the regulative cycle for problem solving (van Strien 1986) or the wheel of science from Babbie (2013). In fact, although we wrote “developed” in the previous sentence, we are fully aware of the need to further develop and refine the framework.

The importance of the MIR framework lies in the complex, multifaceted nature of issues like sustainability, food security and one world health. For progress in the study of these pressing issues, the understanding, construction and quality of interdisciplinary portfolio measurements (Tobi 2014) are pivotal and require further study, as do procedures facilitating integration across different disciplines.

Another important strand of further research relates to the continuum of Responsible Conduct of Research (RCR), Questionable Research Practices (QRP), and deliberate misconduct (Steneck 2006). QRP includes failing to report all of a study’s conditions, stopping data collection earlier than planned because one found the result one had been looking for, etc. (e.g., John et al. 2012; Simmons et al. 2011; Kampen and Tamás 2014). A meta-analysis of self-reports obtained through surveys revealed that about 2% of researchers had admitted to research misconduct at least once, whereas up to 33% admitted to QRPs (Fanelli 2009). While the frequency of QRPs may easily eclipse that of deliberate fraud (John et al. 2012), these practices have received less attention than deliberate misconduct. Claimed research findings may often be accurate measures of the prevailing biases and methodological rigor in a research field (Fanelli and Ioannidis 2013; Fanelli 2010). If research misconduct and QRP are to be understood, then the disciplinary context must be grasped as a locus of both legitimate and illegitimate activity (Fox 1990). It would be valuable to investigate how working in interdisciplinary teams and, consequently, exposure to other standards of QRP and RCR influence research integrity as the appropriate research behavior from the perspective of different professional standards (Steneck 2006: p. 56). These differences in scientific cultures concern criteria for quality in the design and execution of research, reporting (e.g., criteria for authorship of a paper, preferred publication outlets, citation practices, etc.), archiving and sharing of data, and so on.

Other strands of research include interdisciplinary collaboration and negotiation, where we expect contributions from the “science of team science” (Falk-Krzesinski et al. 2010), and the compatibility of the MIR framework with new research paradigms such as “inclusive research” (a mode of research involving people with intellectual disabilities as more than just objects of research; e.g., Walmsley and Johnson 2003). Because of the complexity and novelty of inclusive health research, a consensus statement was developed on how to conduct health research inclusively (Frankena et al., under review). The eight attributes of inclusive health research identified there may also be taken as guiding attributes in the design of inclusive research according to the MIR framework. For starters, there is the possibility of inclusiveness in the conceptual design, particularly in determining research objectives and in discussing possible theoretical frameworks with team members with an intellectual disability, which Frankena et al. labelled the “designing the study” attribute. There are also opportunities for inclusiveness in the technical design, and in execution. For example, the inclusiveness attribute “generating data” overlaps with the operationalization and measurement instrument design/selection, and the attribute “analyzing data” aligns with the data analysis plan in the technical design.

On a final note, we hope to have aroused the reader’s interest in, and to have demonstrated the need for, a methodology for interdisciplinary research design. We further hope that the MIR framework proposed and explained in this article helps those involved in designing an interdisciplinary research project to get a clearer view of the various processes that must be secured during the project’s design and execution. And we look forward to further collaboration with scientists from all cultures to contribute to improving the MIR framework and make interdisciplinary collaborations successful.

Acknowledgements

The MIR framework is the result of many discussions with students, researchers and colleagues, with special thanks to Peter Tamás, Jennifer Barrett, Loes Maas, Giel Dik, Ruud Zaalberg, Jurian Meijering, Vanessa Torres van Grinsven, Matthijs Brink, Gerda Casimir, and, last but not least, Jenneken Naaldenberg.

  • Aboelela SW, Larson E, Bakken S, Carrasquillo O, Formicola A, Glied SA, Gebbie KM. Defining interdisciplinary research: conclusions from a critical review of the literature. Health Serv. Res. 2007;42(1):329–346. doi: 10.1111/j.1475-6773.2006.00621.x.
  • Adamson J. Combined qualitative and quantitative designs. In: Bowling A, Ebrahim S, editors. Handbook of Health Research Methods: Investigation, Measurement and Analysis. Maidenhead: Open University Press; 2005. pp. 230–245.
  • Adler ES, Clark R. An Invitation to Social Research: How it’s Done. 4th ed. London: Sage; 2011.
  • Babbie ER. The Practice of Social Research. 13th ed. Belmont, CA: Wadsworth Cengage Learning; 2013.
  • Baker GH, Little RG. Enhancing homeland security: development of a course on critical infrastructure systems. J. Homel. Secur. Emerg. Manag. 2006.
  • Bishop FL. Using mixed methods research designs in health psychology: an illustrated discussion from a pragmatist perspective. Br. J. Health. Psychol. 2015;20(1):5–20. doi: 10.1111/bjhp.12122.
  • Boeije HR. Analysis in Qualitative Research. London: Sage; 2009.
  • Bruns D, van den Brink A, Tobi H, Bell S. Advancing landscape architecture research. In: van den Brink A, Bruns D, Tobi H, Bell S, editors. Research in Landscape Architecture: Methods and Methodology. New York: Routledge; 2017. pp. 11–23.
  • Creswell JW, Plano Clark VL. Designing and Conducting Mixed Methods Research. 2nd ed. Los Angeles: Sage; 2011.
  • Danner D, Blasius J, Breyer B, Eifler S, Menold N, Paulhus DL, Ziegler M. Current challenges, new developments, and future directions in scale construction. Eur. J. Psychol. Assess. 2016;32(3):175–180. doi: 10.1027/1015-5759/a000375.
  • Deming ME, Swaffield S. Landscape Architecture Research. Hoboken: Wiley; 2011.
  • Departement Leefmilieu, Natuur en Energie. Uitvoeren van een uitgebreide schriftelijke enquête en een beperkte CAWI-enquête ter bepaling van het percentage gehinderden door geur, geluid en licht in Vlaanderen–SLO-3. Leuven: Market Analysis & Synthesis. www.lne.be/sites/default/files/atoms/files/lne-slo-3-eindrapport.pdf (2013). Accessed 8 March 2017.
  • De Vaus D. Research Design in Social Research. London: Sage; 2001.
  • DeVellis RF. Scale Development: Theory and Applications. 3rd ed. Los Angeles: Sage; 2012.
  • Dillman DA. Mail and Internet Surveys. 2nd ed. Hoboken: Wiley; 2007.
  • Falk-Krzesinski HJ, Borner K, Contractor N, Fiore SM, Hall KL, Keyton J, Uzzi B, et al. Advancing the science of team science. CTS Clin. Transl. Sci. 2010;3(5):263–266. doi: 10.1111/j.1752-8062.2010.00223.x.
  • Fanelli D. How many scientists fabricate and falsify research? A systematic review and meta-analysis of survey data. PLoS ONE. 2009.
  • Fanelli D. Positive results increase down the hierarchy of the sciences. PLoS ONE. 2010.
  • Fanelli D, Ioannidis JPA. US studies may overestimate effect sizes in softer research. Proc. Natl. Acad. Sci. USA. 2013;110(37):15031–15036. doi: 10.1073/pnas.1302997110.
  • Fetters MD, Molina-Azorin JF. The journal of mixed methods research starts a new decade: principles for bringing in the new and divesting of the old language of the field. J. Mixed Methods Res. 2017;11(1):3–10. doi: 10.1177/1558689816682092.
  • Fischer ARH, Tobi H, Ronteltap A. When natural met social: a review of collaboration between the natural and social sciences. Interdiscip. Sci. Rev. 2011;36(4):341–358. doi: 10.1179/030801811X13160755918688.
  • Flick U. An Introduction to Qualitative Research. 3rd ed. London: Sage; 2006.
  • Fox MF. Fraud, ethics, and the disciplinary contexts of science and scholarship. Am. Sociol. 1990;21(1):67–71. doi: 10.1007/BF02691783.
  • Frischknecht PM. Environmental science education at the Swiss Federal Institute of Technology (ETH). Water Sci. Technol. 2000;41(2):31–36.
  • Godfray HCJ, Beddington JR, Crute IR, Haddad L, Lawrence D, Muir JF, Pretty J, Robinson S, Thomas SM, Toulmin C. Food security: the challenge of feeding 9 billion people. Science. 2010;327(5967):812–818. doi: 10.1126/science.1185383.
  • Greene JC. Is mixed methods social inquiry a distinctive methodology? J. Mixed Methods Res. 2008;2(1):7–22. doi: 10.1177/1558689807309969.
  • IPCC. Climate Change 2014 Synthesis Report. Geneva: Intergovernmental Panel on Climate Change. www.ipcc.ch/pdf/assessment-report/ar5/syr/SYR_AR5_FINAL_full_wcover.pdf (2015). Accessed 8 March 2017.
  • John LK, Loewenstein G, Prelec D. Measuring the prevalence of questionable research practices with incentives for truth telling. Psychol. Sci. 2012;23(5):524–532. doi: 10.1177/0956797611430953.
  • Kagan J. The Three Cultures: Natural Sciences, Social Sciences and the Humanities in the 21st Century. Cambridge: Cambridge University Press; 2009.
  • Kampen JK, Tamás P. Should I take this seriously? A simple checklist for calling bullshit on policy supporting research. Qual. Quant. 2014;48:1213–1223. doi: 10.1007/s11135-013-9830-8.
  • Kumar R. Research Methodology: A Step-by-Step Guide for Beginners. 1st ed. Los Angeles: Sage; 1999.
  • Kumar R. Research Methodology: A Step-by-Step Guide for Beginners. 4th ed. Los Angeles: Sage; 2014.
  • Kvale S. Doing Interviews. London: Sage; 2007.
  • Kvale S, Brinkmann S. Interviews: Learning the Craft of Qualitative Interviews. 2nd ed. London: Sage; 2009.
  • Lenzholder S, Duchhart I, van den Brink A. The relationship between research and design. In: van den Brink A, Bruns D, Tobi H, Bell S, editors. Research in Landscape Architecture: Methods and Methodology. New York: Routledge; 2017. pp. 54–64.
  • Molina-Azorin JF, Lopez-Gamero MD. Mixed methods studies in environmental management research: prevalence, purposes and designs. Bus. Strateg. Environ. 2016;25(2):134–148. doi: 10.1002/bse.1862.
  • Onwuegbuzie AJ, Leech NL. Taking the “Q” out of research: teaching research methodology courses without the divide between quantitative and qualitative paradigms. Qual. Quant. 2005;39(3):267–296. doi: 10.1007/s11135-004-1670-0.
  • Powell H, Mihalas S, Onwuegbuzie AJ, Suldo S, Daley CE. Mixed methods research in school psychology: a mixed methods investigation of trends in the literature. Psychol. Sch. 2008;45(4):291–309. doi: 10.1002/pits.20296.
  • Shipley B. Cause and Correlation in Biology. 2nd ed. Cambridge: Cambridge University Press; 2016.
  • Simmons JP, Nelson LD, Simonsohn U. False positive psychology: undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychol. Sci. 2011;22:1359–1366. doi: 10.1177/0956797611417632.
  • Spelt EJH, Biemans HJA, Tobi H, Luning PA, Mulder M. Teaching and learning in interdisciplinary higher education: a systematic review. Educ. Psychol. Rev. 2009;21(4):365–378. doi: 10.1007/s10648-009-9113-z.
  • Spradley JP. The Ethnographic Interview. New York: Holt, Rinehart and Winston; 1979.
  • Steneck NH. Fostering integrity in research: definitions, current knowledge, and future directions. Sci. Eng. Eth. 2006;12(1):53–74. doi: 10.1007/s11948-006-0006-y.
  • Tobi H. Measurement in interdisciplinary research: the contributions of widely-defined measurement and portfolio representations. Measurement. 2014;48:228–231. doi: 10.1016/j.measurement.2013.11.013.
  • Tobi H, Kampen JK. Survey error in an international context: an empirical assessment of cross-cultural differences regarding scale effects. Qual. Quant. 2013;47(1):553–559. doi: 10.1007/s11135-011-9476-3.
  • Tobi H, van den Brink A. A process approach to research in landscape architecture. In: van den Brink A, Bruns D, Tobi H, Bell S, editors. Research in Landscape Architecture: Methods and Methodology. New York: Routledge; 2017. pp. 24–34.
  • van Strien PJ. Praktijk als wetenschap: Methodologie van het sociaal-wetenschappelijk handelen [Practice as science: methodology of social scientific acting]. Assen: Van Gorcum; 1986.
  • Venkatesh V, Brown SA, Bala H. Bridging the qualitative-quantitative divide: guidelines for conducting mixed methods research in information systems. MIS Q. 2013;37(1):21–54. doi: 10.25300/MISQ/2013/37.1.02.
  • Vuye C, Bergiers A, Vanhooreweder B. The acoustical durability of thin noise reducing asphalt layers. Coatings. 2016.
  • Walmsley J, Johnson K. Inclusive Research with People with Learning Disabilities: Past, Present and Futures. London: Jessica Kingsley; 2003.
  • Walsh WB, Smith GL, London M. Developing an interface between engineering and social sciences: interdisciplinary team approach to solving societal problems. Am. Psychol. 1975;30(11):1067–1071. doi: 10.1037/0003-066X.30.11.1067.

Research Design: What it is, Elements & Types


Can you imagine doing research without a plan? Probably not. When we discuss a strategy to collect, study, and evaluate data, we talk about research design. This design addresses problems and creates a consistent and logical model for data analysis. Let’s learn more about it.

What is Research Design?

Research design is the framework of research methods and techniques chosen by a researcher to conduct a study. The design allows researchers to sharpen the research methods suitable for the subject matter and set up their studies for success.

A research design specifies the type of research (experimental, survey research, correlational, semi-experimental, review) and its sub-type (for example, an experimental design or a descriptive case study).

A research design covers three main areas:

  • Data collection
  • Measurement
  • Data Analysis

The research problem an organization faces will determine the design, not vice-versa. The design phase of a study determines which tools to use and how they are used.

The Process of Research Design

The research design process is a systematic and structured approach to conducting research. The process is essential to ensure that the study is valid, reliable, and produces meaningful results.

  • Consider your aims and approaches: Determine the research questions and objectives, and identify the theoretical framework and methodology for the study.
  • Choose a type of Research Design: Select the appropriate research design, such as experimental, correlational, survey, case study, or ethnographic, based on the research questions and objectives.
  • Identify your population and sampling method: Determine the target population and sample size, and choose the sampling method, such as simple random sampling, stratified random sampling, or convenience sampling.
  • Choose your data collection methods: Decide on the data collection methods, such as surveys, interviews, observations, or experiments, and select the appropriate instruments or tools for collecting data.
  • Plan your data collection procedures: Develop a plan for data collection, including the timeframe, location, and personnel involved, and ensure ethical considerations are addressed.
  • Decide on your data analysis strategies: Select the appropriate data analysis techniques, such as statistical analysis, content analysis, or discourse analysis, and plan how to interpret the results.
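
To make the sampling choice concrete, here is a minimal Python sketch, using an entirely hypothetical population of 100 respondents, that contrasts simple random sampling with stratified random sampling:

```python
import random

def simple_random_sample(population, n, seed=0):
    """Draw n units from the population, each with equal probability."""
    rng = random.Random(seed)  # fixed seed keeps the draw reproducible
    return rng.sample(population, n)

def stratified_sample(population, strata_key, n_per_stratum, seed=0):
    """Draw a fixed number of units from each stratum (subgroup)."""
    rng = random.Random(seed)
    strata = {}
    for unit in population:
        strata.setdefault(strata_key(unit), []).append(unit)
    sample = []
    for group in strata.values():
        sample.extend(rng.sample(group, min(n_per_stratum, len(group))))
    return sample

# Hypothetical population: 100 respondents tagged with an age group.
people = [{"id": i, "age_group": "18-34" if i % 2 else "35-65"}
          for i in range(100)]

srs = simple_random_sample(people, 10)                          # 10 drawn at random
strat = stratified_sample(people, lambda p: p["age_group"], 5)  # 5 per age group
```

Stratified sampling guarantees each subgroup appears in the sample at a rate you choose, which simple random sampling cannot promise for small samples; convenience sampling, by contrast, takes whichever units are easiest to reach, which is cheaper but risks selection bias.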

The process of research design is a critical step in conducting research. By following the steps of research design, researchers can ensure that their study is well-planned, ethical, and rigorous.

Research Design Elements

Impactful research minimizes bias in the data and increases trust in the accuracy of what is collected. A design that produces the smallest margin of error in experimental research is generally considered the desired outcome. The essential elements are:

  • An accurate purpose statement
  • Techniques to be implemented for collecting and analyzing data
  • The method applied for analyzing the collected data
  • The type of research methodology
  • Probable objections to the research
  • The settings for the research study
  • The measurement of analysis

Characteristics of Research Design

A proper design sets your study up for success. Successful research studies provide insights that are accurate and unbiased. You’ll need to create a survey that meets all of the main characteristics of a design. There are four key characteristics:

  • Neutrality: When you set up your study, you may have to make assumptions about the data you expect to collect. The projected results should be free from research bias and neutral. To check neutrality, gather opinions on the final evaluated scores and conclusions from multiple individuals and check whether they agree with the results.
  • Reliability: With regularly conducted research, a reliable design produces similar results every time. Your plan should indicate how the research questions are formed so that the standard of results is maintained.
  • Validity: There are multiple measuring tools available, but the only valid ones are those that help a researcher gauge results according to the objective of the research. A questionnaire developed from such a design will then be valid.
  • Generalization: The outcome of your design should apply to the whole population, not just the restricted sample studied. A generalizable design means your survey can be conducted on any part of the population with similar accuracy.

These factors affect how respondents answer the research questions, so a good design should balance all four characteristics.

Research Design Types

A researcher must clearly understand the various types to select which model to implement for a study. Like the research itself, the design of your analysis can be broadly classified into quantitative and qualitative.

Qualitative research

Qualitative research explores relationships through non-numerical data such as interview transcripts and observations, rather than through mathematical calculation. Researchers rely on qualitative observation methods to conclude “why” a particular phenomenon exists and “what” respondents have to say about it.

Quantitative research

Quantitative research is for cases where statistical conclusions are essential to collecting actionable insights. Numbers provide a clearer perspective for making critical business decisions: insights drawn from complex numerical data and analysis prove highly effective when making decisions about an organization’s future.

Qualitative Research vs Quantitative Research

In summary, qualitative research is more exploratory and focuses on understanding the subjective experiences of individuals, while quantitative research focuses on objective data and statistical analysis.

You can further break down the types of research design into five categories:

1. Descriptive: In a descriptive design, a researcher is solely interested in describing the situation or case under study. It is a theory-based method built by gathering, analyzing, and presenting collected data, which allows the researcher to provide insights into the why and how of the research. A descriptive design helps others better understand the need for the research; if the problem statement is not yet clear, you can conduct exploratory research first.

2. Experimental: Experimental research establishes a relationship between the cause and effect of a situation. It is a causal design in which one observes the impact of an independent variable on a dependent variable, for example, the influence of price on customer satisfaction or brand loyalty. It is an efficient method because it contributes directly to solving a problem.

The independent variables are manipulated to monitor the change they produce in the dependent variable. The social sciences often use this design to observe human behavior by analyzing two groups: researchers can have participants change their actions and study how the people around them react, to better understand social psychology.

3. Correlational research: Correlational research is a non-experimental technique that helps researchers establish a relationship between two closely connected variables. No assumption of causation is made while evaluating the relationship; statistical analysis techniques calculate the strength of association between the two variables, measured on the same group of subjects.

A correlation coefficient quantifies the relationship between two variables; its value ranges between -1 and +1. A coefficient toward +1 indicates a positive relationship between the variables, while one toward -1 indicates a negative relationship.
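
To make this concrete, here is a minimal Python sketch that computes Pearson’s correlation coefficient from its definition, using entirely hypothetical data (hours studied versus test scores):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient for two equal-length samples."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    # Sum of cross-products of deviations (covariance numerator)
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    # Square roots of the sums of squared deviations
    sd_x = math.sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = math.sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

# Hypothetical data: hours studied vs. test score for five students.
hours = [1, 2, 3, 4, 5]
scores = [52, 57, 61, 68, 72]
r = pearson_r(hours, scores)  # close to +1: a strong positive relationship
```

In practice researchers compute this with statistical software, but the underlying arithmetic is exactly this: how strongly deviations in one variable co-vary with deviations in the other.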

4. Diagnostic research: In diagnostic design, the researcher is looking to evaluate the underlying cause of a specific topic or phenomenon. This method helps one learn more about the factors that create troublesome situations. 

This design has three parts of the research:

  • Inception of the issue
  • Diagnosis of the issue
  • Solution for the issue

5. Explanatory research : Explanatory design uses a researcher’s ideas and thoughts on a subject to further explore their theories. The study explains unexplored aspects of a subject and details the research questions’ what, how, and why.

Benefits of Research Design

There are several benefits to having a well-designed research plan, including:

  • Clarity of research objectives: Research design provides a clear understanding of the research objectives and the desired outcomes.
  • Increased validity and reliability: Research design helps minimize the risk of bias and control extraneous variables, ensuring the validity and reliability of the results.
  • Improved data collection: Research design helps ensure that the proper data is collected, systematically and consistently.
  • Better data analysis: Research design helps ensure that the collected data can be analyzed effectively, providing meaningful insights and conclusions.
  • Improved communication: A well-designed study helps ensure the results are communicated clearly and persuasively, both within the research team and to external stakeholders.
  • Efficient use of resources: Research design helps ensure that resources are used efficiently, reducing the risk of waste and maximizing the impact of the research.

A well-designed research plan is essential for successful research, providing clear and meaningful insights and ensuring that resources are used effectively.

QuestionPro offers a comprehensive solution for researchers looking to conduct research. With its user-friendly interface, robust data collection and analysis tools, and the ability to integrate results from multiple sources, QuestionPro provides a versatile platform for designing and executing research projects.

Our robust suite of research tools provides you with all you need to derive research results. Our online survey platform includes custom point-and-click logic and advanced question types. Uncover the insights that matter the most.

Research Methodology – Types, Examples and Writing Guide

Research Methodology

Definition:

Research Methodology refers to the systematic and scientific approach used to conduct research, investigate problems, and gather data and information for a specific purpose. It involves the techniques and procedures used to identify, collect, analyze, and interpret data to answer research questions or solve research problems. It also encompasses the philosophical and theoretical frameworks that guide the research process.

Structure of Research Methodology

Research methodology formats can vary depending on the specific requirements of the research project, but the following is a basic example of a structure for a research methodology section:

I. Introduction

  • Provide an overview of the research problem and the need for a research methodology section
  • Outline the main research questions and objectives

II. Research Design

  • Explain the research design chosen and why it is appropriate for the research question(s) and objectives
  • Discuss any alternative research designs considered and why they were not chosen
  • Describe the research setting and participants (if applicable)

III. Data Collection Methods

  • Describe the methods used to collect data (e.g., surveys, interviews, observations)
  • Explain how the data collection methods were chosen and why they are appropriate for the research question(s) and objectives
  • Detail any procedures or instruments used for data collection

IV. Data Analysis Methods

  • Describe the methods used to analyze the data (e.g., statistical analysis, content analysis)
  • Explain how the data analysis methods were chosen and why they are appropriate for the research question(s) and objectives
  • Detail any procedures or software used for data analysis

V. Ethical Considerations

  • Discuss any ethical issues that may arise from the research and how they were addressed
  • Explain how informed consent was obtained (if applicable)
  • Detail any measures taken to ensure confidentiality and anonymity

VI. Limitations

  • Identify any potential limitations of the research methodology and how they may impact the results and conclusions

VII. Conclusion

  • Summarize the key aspects of the research methodology section
  • Explain how the research methodology addresses the research question(s) and objectives

Research Methodology Types

Types of Research Methodology are as follows:

Quantitative Research Methodology

This is a research methodology that involves the collection and analysis of numerical data using statistical methods. This type of research is often used to study cause-and-effect relationships and to make predictions.

Qualitative Research Methodology

This is a research methodology that involves the collection and analysis of non-numerical data such as words, images, and observations. This type of research is often used to explore complex phenomena, to gain an in-depth understanding of a particular topic, and to generate hypotheses.

Mixed-Methods Research Methodology

This is a research methodology that combines elements of both quantitative and qualitative research. This approach can be particularly useful for studies that aim to explore complex phenomena and to provide a more comprehensive understanding of a particular topic.

Case Study Research Methodology

This is a research methodology that involves in-depth examination of a single case or a small number of cases. Case studies are often used in psychology, sociology, and anthropology to gain a detailed understanding of a particular individual or group.

Action Research Methodology

This is a research methodology that involves a collaborative process between researchers and practitioners to identify and solve real-world problems. Action research is often used in education, healthcare, and social work.

Experimental Research Methodology

This is a research methodology that involves the manipulation of one or more independent variables to observe their effects on a dependent variable. Experimental research is often used to study cause-and-effect relationships and to make predictions.

Survey Research Methodology

This is a research methodology that involves the collection of data from a sample of individuals using questionnaires or interviews. Survey research is often used to study attitudes, opinions, and behaviors.

Grounded Theory Research Methodology

This is a research methodology that involves the development of theories based on the data collected during the research process. Grounded theory is often used in sociology and anthropology to generate theories about social phenomena.

Research Methodology Example

An Example of Research Methodology could be the following:

Research Methodology for Investigating the Effectiveness of Cognitive Behavioral Therapy in Reducing Symptoms of Depression in Adults

Introduction:

The aim of this research is to investigate the effectiveness of cognitive-behavioral therapy (CBT) in reducing symptoms of depression in adults. To achieve this objective, a randomized controlled trial (RCT) will be conducted using a mixed-methods approach.

Research Design:

The study will follow a pre-test and post-test design with two groups: an experimental group receiving CBT and a control group receiving no intervention. The study will also include a qualitative component, in which semi-structured interviews will be conducted with a subset of participants to explore their experiences of receiving CBT.

Participants:

Participants will be recruited from community mental health clinics in the local area. The sample will consist of 100 adults aged 18-65 years old who meet the diagnostic criteria for major depressive disorder. Participants will be randomly assigned to either the experimental group or the control group.

Intervention :

The experimental group will receive 12 weekly sessions of CBT, each lasting 60 minutes. The intervention will be delivered by licensed mental health professionals who have been trained in CBT. The control group will receive no intervention during the study period.

Data Collection:

Quantitative data will be collected through the use of standardized measures such as the Beck Depression Inventory-II (BDI-II) and the Generalized Anxiety Disorder-7 (GAD-7). Data will be collected at baseline, immediately after the intervention, and at a 3-month follow-up. Qualitative data will be collected through semi-structured interviews with a subset of participants from the experimental group. The interviews will be conducted at the end of the intervention period, and will explore participants’ experiences of receiving CBT.

Data Analysis:

Quantitative data will be analyzed using descriptive statistics, t-tests, and mixed-model analyses of variance (ANOVA) to assess the effectiveness of the intervention. Qualitative data will be analyzed using thematic analysis to identify common themes and patterns in participants’ experiences of receiving CBT.
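
As an illustration of the quantitative step, here is a minimal Python sketch of an independent-samples t-test on entirely hypothetical post-intervention depression scores; in practice the analysis would be run in statistical software, and the t statistic compared against the t distribution to obtain a p-value:

```python
import math
from statistics import mean, variance

def independent_t(sample_a, sample_b):
    """Student's t statistic for two independent samples (pooled variance)."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample variances (n - 1)
    pooled = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    se = math.sqrt(pooled * (1 / na + 1 / nb))  # standard error of the difference
    return (mean(sample_a) - mean(sample_b)) / se

# Hypothetical post-intervention scores (lower = fewer depressive symptoms).
cbt_group = [12, 15, 10, 14, 11, 13]
control_group = [22, 25, 20, 23, 26, 21]
t = independent_t(cbt_group, control_group)  # negative t: CBT group scored lower
```

A t statistic far from zero (relative to the t distribution with na + nb - 2 degrees of freedom) would indicate a statistically significant difference between the groups.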

Ethical Considerations:

This study will comply with ethical guidelines for research involving human subjects. Participants will provide informed consent before participating in the study, and their privacy and confidentiality will be protected throughout the study. Any adverse events or reactions will be reported and managed appropriately.

Data Management:

All data collected will be kept confidential and stored securely using password-protected databases. Identifying information will be removed from qualitative data transcripts to ensure participants’ anonymity.

Limitations:

One potential limitation of this study is that it only focuses on one type of psychotherapy, CBT, and may not generalize to other types of therapy or interventions. Another limitation is that the study will only include participants from community mental health clinics, which may not be representative of the general population.

Conclusion:

This research aims to investigate the effectiveness of CBT in reducing symptoms of depression in adults. By using a randomized controlled trial and a mixed-methods approach, the study will provide valuable insights into the mechanisms underlying the relationship between CBT and depression. The results of this study will have important implications for the development of effective treatments for depression in clinical settings.

How to Write Research Methodology

Writing a research methodology involves explaining the methods and techniques you used to conduct research, collect data, and analyze results. It’s an essential section of any research paper or thesis, as it helps readers understand the validity and reliability of your findings. Here are the steps to write a research methodology:

  • Start by explaining your research question: Begin the methodology section by restating your research question and explaining why it’s important. This helps readers understand the purpose of your research and the rationale behind your methods.
  • Describe your research design: Explain the overall approach you used to conduct research. This could be a qualitative or quantitative research design, experimental or non-experimental, case study or survey, etc. Discuss the advantages and limitations of the chosen design.
  • Discuss your sample: Describe the participants or subjects you included in your study. Include details such as their demographics, sampling method, sample size, and any exclusion criteria used.
  • Describe your data collection methods: Explain how you collected data from your participants. This could include surveys, interviews, observations, questionnaires, or experiments. Include details on how you obtained informed consent, how you administered the tools, and how you minimized the risk of bias.
  • Explain your data analysis techniques: Describe the methods you used to analyze the data you collected. This could include statistical analysis, content analysis, thematic analysis, or discourse analysis. Explain how you dealt with missing data, outliers, and any other issues that arose during the analysis.
  • Discuss the validity and reliability of your research: Explain how you ensured the validity and reliability of your study. This could include measures such as triangulation, member checking, peer review, or inter-coder reliability.
  • Acknowledge any limitations of your research: Discuss any limitations of your study, including any potential threats to validity or generalizability. This helps readers understand the scope of your findings and how they might apply to other contexts.
  • Provide a summary: End the methodology section by summarizing the methods and techniques you used to conduct your research. This provides a clear overview of your research methodology and helps readers understand the process you followed to arrive at your findings.

When to Write Research Methodology

Research methodology is typically written after the research proposal has been approved and before the actual research is conducted. It should be written prior to data collection and analysis, as it provides a clear roadmap for the research project.

The research methodology is an important section of any research paper or thesis, as it describes the methods and procedures that will be used to conduct the research. It should include details about the research design, data collection methods, data analysis techniques, and any ethical considerations.

The methodology should be written in a clear and concise manner, and it should be based on established research practices and standards. It is important to provide enough detail so that the reader can understand how the research was conducted and evaluate the validity of the results.

Applications of Research Methodology

Here are some of the applications of research methodology:

  • To identify the research problem: Research methodology is used to identify the research problem, which is the first step in conducting any research.
  • To design the research: Research methodology helps in designing the research by selecting the appropriate research method, research design, and sampling technique.
  • To collect data: Research methodology provides a systematic approach to collect data from primary and secondary sources.
  • To analyze data: Research methodology helps in analyzing the collected data using various statistical and non-statistical techniques.
  • To test hypotheses: Research methodology provides a framework for testing hypotheses and drawing conclusions based on the analysis of data.
  • To generalize findings: Research methodology helps in generalizing the findings of the research to the target population.
  • To develop theories : Research methodology is used to develop new theories and modify existing theories based on the findings of the research.
  • To evaluate programs and policies : Research methodology is used to evaluate the effectiveness of programs and policies by collecting data and analyzing it.
  • To improve decision-making: Research methodology helps in making informed decisions by providing reliable and valid data.

Purpose of Research Methodology

Research methodology serves several important purposes, including:

  • To guide the research process: Research methodology provides a systematic framework for conducting research. It helps researchers to plan their research, define their research questions, and select appropriate methods and techniques for collecting and analyzing data.
  • To ensure research quality: Research methodology helps researchers to ensure that their research is rigorous, reliable, and valid. It provides guidelines for minimizing bias and error in data collection and analysis, and for ensuring that research findings are accurate and trustworthy.
  • To replicate research: Research methodology provides a clear and detailed account of the research process, making it possible for other researchers to replicate the study and verify its findings.
  • To advance knowledge: Research methodology enables researchers to generate new knowledge and to contribute to the body of knowledge in their field. It provides a means for testing hypotheses, exploring new ideas, and discovering new insights.
  • To inform decision-making: Research methodology provides evidence-based information that can inform policy and decision-making in a variety of fields, including medicine, public health, education, and business.

Advantages of Research Methodology

Research methodology has several advantages that make it a valuable tool for conducting research in various fields. Here are some of the key advantages of research methodology:

  • Systematic and structured approach : Research methodology provides a systematic and structured approach to conducting research, which ensures that the research is conducted in a rigorous and comprehensive manner.
  • Objectivity : Research methodology aims to ensure objectivity in the research process, which means that the research findings are based on evidence and not influenced by personal bias or subjective opinions.
  • Replicability : Research methodology ensures that research can be replicated by other researchers, which is essential for validating research findings and ensuring their accuracy.
  • Reliability : Research methodology aims to ensure that the research findings are reliable, which means that they are consistent and can be depended upon.
  • Validity : Research methodology ensures that the research findings are valid, which means that they accurately reflect the research question or hypothesis being tested.
  • Efficiency : Research methodology provides a structured and efficient way of conducting research, which helps to save time and resources.
  • Flexibility : Research methodology allows researchers to choose the most appropriate research methods and techniques based on the research question, data availability, and other relevant factors.
  • Scope for innovation: Research methodology provides scope for innovation and creativity in designing research studies and developing new research techniques.

Research Methodology Vs Research Methods

In short, research methodology is the overall strategy and rationale behind a study, while research methods are the specific tools and procedures (e.g., surveys, interviews, statistical tests) used to collect and analyze data.

About the author.


Muhammad Hassan

Researcher, Academic Writer, Web developer


    In identifying similarities and differences across the scholars' approaches, the analysis includes: (a) definitions of and steps in research design, and (b) the perspectives on research methods and research methodology. The analysis showed that research design approaches are convergent and divergent and that it is necessary for PA researchers ...

  15. Research Design: What is Research Design, Types, Methods, and Examples

    There are various types of research design, each suited to different research questions and objectives: • Quantitative Research: Focuses on numerical data and statistical analysis to quantify relationships and patterns. Common methods include surveys, experiments, and observational studies. • Qualitative Research: Emphasizes understanding ...

  16. Planning Qualitative Research: Design and Decision Making for New

    While many books and articles guide various qualitative research methods and analyses, there is currently no concise resource that explains and differentiates among the most common qualitative approaches. We believe novice qualitative researchers, students planning the design of a qualitative study or taking an introductory qualitative research course, and faculty teaching such courses can ...

  17. Research Design and Methodology

    To ensure a strong research design, researchers must choose a research paradigm that is congruent with their beliefs about the nature of reality. Consciously subjecting such beliefs to an ontological interrogation in the first instance will illuminate the epistemological and methodological possibilities that are available. (Mills et al. 2006, 2 ...

  18. Research design: the methodology for interdisciplinary research

    The research is based upon a conceptual model that links or integrates theoretical frameworks from those disciplines, uses study design and methodology that is not limited to any one field, and requires the use of perspectives and skills of the involved disciplines throughout multiple phases of the research process.

  19. Research Design: What it is, Elements & Types

    Research design is the framework of research methods and techniques chosen by a researcher to conduct a study. The design allows researchers to sharpen the research methods suitable for the subject matter and set up their studies for success. Creating a research topic explains the type of research (experimental,survey research,correlational ...

  20. Research Methodology

    Definition: Research Methodology refers to the systematic and scientific approach used to conduct research, investigate problems, and gather data and information for a specific purpose. ... To design the research: Research methodology helps in designing the research by selecting the appropriate research method, research design, and sampling ...

  21. (PDF) CHAPTER FIVE RESEARCH DESIGN AND METHODOLOGY 5.1. Introduction

    This chapter discusses in detail the methodological choice and the research design process of the study. It has mainly relied on the philosophical stance and the research problem to guide on the ...

  22. (PDF) Research Design and Methodology

    Research methodology implies the path through which the researcher intends to conduct research and the path to follow while formulating problems and objectives and presenting the result from data ...