Mixed Methods Research | Definition, Guide & Examples

Published on August 13, 2021 by Tegan George. Revised on June 22, 2023.

Mixed methods research combines elements of quantitative research and qualitative research in order to answer your research question. Mixed methods can help you gain a more complete picture than a standalone quantitative or qualitative study, as it integrates the benefits of both methods.

Mixed methods research is often used in the behavioral, health, and social sciences, especially in multidisciplinary settings and complex situational or societal research.

Mixed methods research questions often pair the two approaches explicitly. For example:

  • To what extent does the frequency of traffic accidents (quantitative) reflect cyclist perceptions of road safety (qualitative) in Amsterdam?
  • How do student perceptions of their school environment (qualitative) relate to differences in test scores (quantitative)?
  • How do interviews about job satisfaction at Company X (qualitative) help explain year-over-year sales performance and other KPIs (quantitative)?
  • How can voter and non-voter beliefs about democracy (qualitative) help explain election turnout patterns (quantitative) in Town X?
  • How do average hospital salary measurements over time (quantitative) help to explain nurse testimonials about job satisfaction (qualitative)?

Table of contents

  • When to use mixed methods research
  • Mixed methods research designs
  • Advantages of mixed methods research
  • Disadvantages of mixed methods research
  • Other interesting articles
  • Frequently asked questions

When to use mixed methods research

Mixed methods research may be the right choice if your research process suggests that quantitative or qualitative data alone will not sufficiently answer your research question. There are several common reasons for using mixed methods research:

  • Generalizability: Qualitative research usually has a smaller sample size, and thus is not generalizable. In mixed methods research, this comparative weakness is mitigated by the comparative strength of “large N,” externally valid quantitative research.
  • Contextualization: Mixing methods allows you to put findings in context and add richer detail to your conclusions. Using qualitative data to illustrate quantitative findings can help “put meat on the bones” of your analysis.
  • Credibility: Using different methods to collect data on the same subject can make your results more credible. If the qualitative and quantitative data converge, this strengthens the validity of your conclusions. This process is called triangulation.

As you formulate your research question, try to directly address how qualitative and quantitative methods will be combined in your study. If your research question can be sufficiently answered via standalone quantitative or qualitative analysis, a mixed methods approach may not be the right fit.

But mixed methods might be a good choice if you want to meaningfully integrate quantitative and qualitative questions in one research study.

Keep in mind that mixed methods research doesn’t just mean collecting both types of data; you need to carefully consider the relationship between the two and how you’ll integrate them into coherent conclusions.

Mixed methods research can be very challenging to put into practice, and it comes with the same risk of research biases as standalone studies, so it’s a less common choice than standalone quantitative or qualitative research.

Mixed methods research designs

There are different types of mixed methods research designs. The differences between them relate to the aim of the research, the timing of the data collection, and the importance given to each data type.

As you design your mixed methods study, also keep in mind:

  • Your research approach (inductive vs. deductive)
  • Your research questions
  • What kind of data is already available for you to use
  • What kind of data you’re able to collect yourself

Here are a few of the most common mixed methods designs.

Convergent parallel

In a convergent parallel design, you collect quantitative and qualitative data at the same time and analyze them separately. After both analyses are complete, compare your results to draw overall conclusions.

For example, suppose you study cyclist road safety in Amsterdam (a minimal code sketch of this logic follows the list):

  • On the qualitative side, you analyze cyclist complaints via the city’s database and on social media to find out which areas are perceived as dangerous and why.
  • On the quantitative side, you analyze accident reports in the city’s database to find out how frequently accidents occur in different areas of the city.
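
Below is a minimal sketch of that convergent parallel logic in Python. All area names, counts, and theme labels are hypothetical, invented purely for illustration:

```python
import pandas as pd

# Quantitative strand: accident reports per area, analyzed on its own.
accidents = pd.DataFrame({
    "area": ["Centrum", "Noord", "Zuid"],
    "accidents_per_year": [120, 45, 80],  # hypothetical counts
})

# Qualitative strand: complaint themes coded per area, analyzed on its own.
complaints = pd.DataFrame({
    "area": ["Centrum", "Noord", "Zuid"],
    "dominant_theme": ["tram tracks", "poor lighting", "turning trucks"],
    "perceived_danger": ["high", "low", "high"],
})

# Only after both separate analyses are complete are the results placed
# side by side, so convergence or divergence can be inspected per area.
merged = accidents.merge(complaints, on="area")
print(merged)
```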

Embedded

In an embedded design, you collect and analyze both types of data at the same time, but within a larger quantitative or qualitative design. One type of data is secondary to the other.

This is a good approach to take if you have limited time or resources. You can use an embedded design to strengthen or supplement your conclusions from the primary type of research design.

Explanatory sequential

In an explanatory sequential design, your quantitative data collection and analysis occurs first, followed by qualitative data collection and analysis.

You should use this design if you think your qualitative data will explain and contextualize your quantitative findings.

Exploratory sequential

In an exploratory sequential design, qualitative data collection and analysis occurs first, followed by quantitative data collection and analysis.

You can use this design to first explore initial questions and develop hypotheses. Then you can use the quantitative data to test or confirm your qualitative findings.
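
As a minimal sketch of that quantitative follow-up step, suppose the qualitative interviews suggested that one team reports lower job satisfaction; a two-sample t-test could then check the hypothesis in survey data. All scores below are invented for illustration:

```python
from scipy import stats

# Hypothetical satisfaction ratings (1-5) from the quantitative follow-up survey.
team_a = [3.1, 2.8, 3.4, 2.9, 3.0, 3.2]  # team the interviews flagged as less satisfied
team_b = [3.9, 4.1, 3.7, 4.0, 3.8, 4.2]

# A two-sample t-test checks whether the interview-derived hypothesis
# holds up in the larger quantitative sample.
t_stat, p_value = stats.ttest_ind(team_a, team_b)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```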

Advantages of mixed methods research

“Best of both worlds” analysis

Combining the two types of data means you benefit from both the detailed, contextualized insights of qualitative data and the generalizable, externally valid insights of quantitative data. The strengths of one type of data often mitigate the weaknesses of the other.

For example, solely quantitative studies often struggle to incorporate the lived experiences of your participants, so adding qualitative data deepens and enriches your quantitative results.

Solely qualitative studies are often not very generalizable, only reflecting the experiences of your participants, so adding quantitative data can validate your qualitative findings.

Method flexibility

Mixed methods are less tied to disciplines and established research paradigms. They offer more flexibility in designing your research, allowing you to combine aspects of different types of studies to distill the most informative results.

Mixed methods research can also combine theory generation and hypothesis testing within a single study, which is unusual for standalone qualitative or quantitative studies.

Disadvantages of mixed methods research

Workload

Mixed methods research is very labor-intensive. Collecting, analyzing, and synthesizing two types of data into one research product takes a lot of time and effort, and often involves interdisciplinary teams of researchers rather than individuals. For this reason, mixed methods research has the potential to cost much more than standalone studies.

Differing or conflicting results

If your analysis yields conflicting results, it can be very challenging to know how to interpret them in a mixed methods study. If the quantitative and qualitative results do not agree, or you are concerned you may have confounding variables, it can be unclear how to proceed.

Because quantitative and qualitative data take two vastly different forms, it can also be difficult to find ways to systematically compare the results, putting your data at risk of bias in the interpretation stage.

Other interesting articles

If you want to know more about statistics, methodology, or research bias, make sure to check out some of our other articles with explanations and examples.

  • Degrees of freedom
  • Null hypothesis
  • Discourse analysis
  • Control groups
  • Non-probability sampling
  • Quantitative research
  • Inclusion and exclusion criteria

Research bias

  • Rosenthal effect
  • Implicit bias
  • Cognitive bias
  • Selection bias
  • Negativity bias
  • Status quo bias

Frequently asked questions

Quantitative research deals with numbers and statistics, while qualitative research deals with words and meanings.

Quantitative methods allow you to systematically measure variables and test hypotheses. Qualitative methods allow you to explore concepts and experiences in more detail.

In mixed methods research, you use both qualitative and quantitative data collection and analysis methods to answer your research question.

Data collection is the systematic process by which observations or measurements are gathered in research. It is used in many different contexts by academics, governments, businesses, and other organizations.

Triangulation in research means using multiple datasets, methods, theories and/or investigators to address a research question. It’s a research strategy that can help you enhance the validity and credibility of your findings.

Triangulation is mainly used in qualitative research, but it’s also commonly applied in quantitative research. Mixed methods research always uses triangulation.

These are four of the most common mixed methods designs:

  • Convergent parallel: Quantitative and qualitative data are collected at the same time and analyzed separately. After both analyses are complete, the results are compared to draw overall conclusions.
  • Embedded: Quantitative and qualitative data are collected at the same time, but within a larger quantitative or qualitative design. One type of data is secondary to the other.
  • Explanatory sequential: Quantitative data is collected and analyzed first, followed by qualitative data. You can use this design if you think your qualitative data will explain and contextualize your quantitative findings.
  • Exploratory sequential: Qualitative data is collected and analyzed first, followed by quantitative data. You can use this design if you think the quantitative data will confirm or validate your qualitative findings.

Mixed-Methods Designs

  • First Online: 28 November 2020

Martino Maggetti

This chapter focuses on mixed-method designs, an increasingly popular approach to designing research in the social sciences that is used to combine the respective advantages of qualitative and quantitative analytical procedures and to strengthen the empirical analysis. After the introduction, two general principles of mixed designs are discussed, the principle of triangulation and the principle of integration. The former involves the concomitant application of different methods in order to cross-validate their findings. The latter entails the sequential combination of different methods to produce a unified causal inference, whereby one method is used to establish the final inference, and the other one is applied to prepare, test, qualify or refine the analysis generating this inference. Afterwards, the chapter proceeds by presenting three varieties of mixed-method studies: statistics-oriented, case-oriented and QCA-based mixed-methods designs. The last section before concluding discusses several advantages and limitations of mixed-method research.

I would like to thank Ina Kubbe for helpful comments on this chapter.

As this chapter has a methodological focus, this question will not be treated.

Qualitative Comparative Analysis (QCA) deserves a separate treatment as it provides a distinctive approach to mixing methods. Please see the chapter by Wagemann and Siewert.

About this chapter

Maggetti, M. (2020). Mixed-Methods Designs. In: Wagemann, C., Goerres, A., Siewert, M.B. (eds) Handbuch Methoden der Politikwissenschaft. Springer VS, Wiesbaden. https://doi.org/10.1007/978-3-658-16936-7_12

Intersecting Mixed Methods and Case Study Research: Design Possibilities and Challenges

Related Documents

Qualitative Case Study Research Design

Qualitative case study research can be a valuable tool for answering complex, real-world questions. This method is often misunderstood or neglected due to a lack of familiarity on the part of researchers and reviewers. This tutorial defines the characteristics of qualitative case study research and its application to developing a broader understanding of stuttering that cannot be achieved through other methodologies. This article will describe ways that data can be collected and analyzed.

Robert K. Yin. (2014). Case Study Research Design and Methods (5th ed.). Thousand Oaks, CA: Sage. 282 pages.

Dynamics of Informal Learning in Two Local Markets in Ile-Ife, Southwest Nigeria

In this chapter, the authors report on a recently concluded research study of the nature of adult informal learning in two local markets in Ile-Ife, Nigeria. Through a case study research design, the authors explore what adult buyers and sellers learn as they interact in two local markets, who learned from whom, and how they acquired the specific learning experiences identified. They examine the factors that drove learning and provide an explanation, a substantive theory of informal learning in the two local markets, which they name Communication, Value, and Profit (CVP).

In This Issue: Innovations in Mixed Methods—Causality, Case Study Research With a Circular Joint Display, Social Media, Grounded Theory, and Phenomenology

Mixed Methods Single Case Study Research (MMSCR): Challenges in WP Evaluation

Evaluating widening participation (WP) interventions is complex. Early efforts at WP evaluation were criticised for lacking rigour. These criticisms were accompanied by suggested approaches to research, typically favouring randomised control trials. Yet these recommendations have, in turn, become the focus of much discussion and debate within the WP evaluation sector.

This paper presents the use of a ‘mixed methods single case study research’ (MMSCR) study design for WP evaluation. It describes the work of a PhD researcher evaluating the school-university partnership science outreach programme between the Wohl Reach Out Lab at Imperial College London and a local secondary school.

The article highlights potential challenges when using MMSCR, namely in ensuring internal validity and trustworthiness of the study. Solutions to these challenges are presented and the case is made for broadening what is seen as meaningful research in the sector.

Case Study Research: Design and Methods by Robert K. Yin

Designing an Embedded Case Study Research Approach in Educational Research

Although the case study research method has been widely adopted in qualitative research, few scholarly articles have addressed comprehensive guidance on the use of an embedded case study research design. This paper aims to contribute to the literature by demonstrating the use of embedded case study research design in qualitative research. A pseudo case was exemplified by exploring the relationship between a holding company and its subsidiary companies within a corporate group. What constitutes a case, and the rationale for the case being studied, are exemplified. The paper further outlines the research protocol, the procedure of inquiry, and the design of the embedded case analysis. A brief explanation of the context of the case enriches the understanding of the investigated cases.

The Implicature of Tembang Gambuh in Serat Wedhatama and Its Significance for the Society

This study aimed to discover the meanings implied in the texts of Tembang Gambuh written by KGPAA Mangkunagara IV in his Serat Wedhatama, as well as its significance for the society. The texts were originally composed of 25 stanzas, but this study was limited to those associated with sembah (worship). This study employed a qualitative paradigm and was conducted using an embedded-case-study research design. Data was collected from informants, places, and events as well as documents/archives and library sources. The validity of the data was tested using data triangulation. The data was then analyzed using a model of pragmatic analysis. The findings based on the pragmatic analysis suggested that the dominant implicatures in the Tembang Gambuh in relation to sembah were sembah raga (worship by physical conduct), sembah cipta (worship by mental conduct), sembah jiwa (spiritual/soul worship), and sembah rasa (worship beyond rituals). For the society, these four types of worship can serve as a warning and as moral education with significant value for someone seeking to draw closer to God by performing worship according to the guideline.

Redefining Case Study

In this paper the authors propose a more precise and encompassing definition of case study than is usually found. They support their definition by clarifying that case study is neither a method nor a methodology nor a research design as suggested by others. They use a case study prototype of their own design to propose common properties of case study and demonstrate how these properties support their definition. Next, they present several living myths about case study and refute them in relation to their definition. Finally, they discuss the interplay between the terms case study and unit of analysis to further delineate their definition of case study. The target audiences for this paper include case study researchers, research design and methods instructors, and graduate students interested in case study research.

Analysis of the Efficiency of Tilapia Marketing Channels in Kupang Village, Lampihong Subdistrict, Balangan Regency (A Case Study of the Kupang Maju Fisheries Group)

This study aims to determine the marketing channels and institutions involved in the marketing of tilapia in Kupang Village, Lampihong Subdistrict, Balangan Regency; to describe the structure of the tilapia market; and to analyze costs, profits, marketing margins, and the price portion received by producers. The research was conducted purposively using the census method and a case study research design, with data collected by interview. The results showed four marketing channels for tilapia, namely I: cultivators and collectors; II: cultivators, collectors, and retailers; III: cultivators, collectors, and wholesalers; and IV: cultivators, collectors, wholesalers, and retailers. The market structure is monopolistic competition. The total marketing margin for channel I is IDR 11,000 per kg, the portion of the price received by farmers is 67.64%, the marketing cost is IDR 1,335 per kg, and the profit gained is IDR 9,665 per kg. Channel II’s total marketing margin is IDR 17,000 per kg, the portion of the price received by farmers is 57.5%, the marketing cost is IDR 3,785 per kg, and the profit received is IDR 13,215 per kg. Channel III’s total marketing margin is IDR 17,500 per kg, the portion of the price received by farmers is 56.79%, the marketing cost is IDR 3,192.5 per kg, and the profit received is IDR 14,307.5 per kg. Channel IV’s total marketing margin is IDR 15,000 per kg, the portion of the price received by farmers is 60.52%, the marketing cost is IDR 6,385 per kg, and the profit received is IDR 9,855 per kg. The efficiency values are: channel I, 3.92%; channel II, 9.46%; channel III, 7.88%; and channel IV, 16.80%. Channel I is the most efficient because its marketing cost share is smallest, namely 3.92%.
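
The reported figures fit together arithmetically; a minimal sketch using the channel I numbers is below. The efficiency formula (marketing cost as a share of the consumer price) is an assumption on our part, chosen because it reproduces the reported 3.92%:

```python
# Margin arithmetic implied by the abstract, channel I figures.
margin = 11_000        # IDR per kg, total marketing margin for channel I
cost = 1_335           # IDR per kg, marketing cost
profit = 9_665         # IDR per kg, marketing profit
farmer_share = 0.6764  # portion of the consumer price received by farmers

assert cost + profit == margin  # the margin decomposes into cost plus profit

# The consumer price follows from the margin and the farmer's share:
# price - margin = farmer_share * price  =>  price = margin / (1 - share)
consumer_price = margin / (1 - farmer_share)   # ~34,000 IDR per kg
efficiency = 100 * cost / consumer_price       # ~3.92% (assumed formula)
print(f"consumer price = {consumer_price:,.0f} IDR/kg, efficiency = {efficiency:.2f}%")
```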

  • Study protocol
  • Published: 20 August 2013

A mixed methods multiple case study of implementation as usual in children’s social service organizations: study protocol

  • Byron J Powell,
  • Enola K Proctor,
  • Charles A Glisson,
  • Patricia L Kohl,
  • Ramesh Raghavan,
  • Ross C Brownson,
  • Bradley P Stoner,
  • Christopher R Carpenter &
  • Lawrence A Palinkas

Implementation Science, volume 8, article number 92 (2013)

Abstract

Improving quality in children’s mental health and social service settings will require implementation strategies capable of moving effective treatments and other innovations (e.g., assessment tools) into routine care. It is likely that efforts to identify, develop, and refine implementation strategies will be more successful if they are informed by relevant stakeholders and are responsive to the strengths and limitations of the contexts and implementation processes identified in usual care settings. This study will describe: the types of implementation strategies used; how organizational leaders make decisions about what to implement and how to approach the implementation process; organizational stakeholders’ perceptions of different implementation strategies; and the potential influence of organizational culture and climate on implementation strategy selection, implementation decision-making, and stakeholders’ perceptions of implementation strategies.

Methods/design

This study is a mixed methods multiple case study of seven children’s social service organizations in one Midwestern city in the United States that compose the control group of a larger randomized controlled trial. Qualitative data will include semi-structured interviews with organizational leaders (e.g., CEOs/directors, clinical directors, program managers) and a review of documents (e.g., implementation and quality improvement plans, program manuals, etc.) that will shed light on implementation decision-making and specific implementation strategies that are used to implement new programs and practices. Additionally, focus groups with clinicians will explore their perceptions of a range of implementation strategies. This qualitative work will inform the development of a Web-based survey that will assess the perceived effectiveness, relative importance, acceptability, feasibility, and appropriateness of implementation strategies from the perspective of both clinicians and organizational leaders. Finally, the Organizational Social Context measure will be used to assess organizational culture and climate. Qualitative, quantitative, and mixed methods data will be analyzed and interpreted at the case level as well as across cases in order to highlight meaningful similarities, differences, and site-specific experiences.

Discussion

This study is designed to inform efforts to develop more effective implementation strategies by fully describing the implementation experiences of a sample of community-based organizations that provide mental health services to youth in one Midwestern city.

Background

Children in the U.S. continue to receive substandard mental health and child welfare services [1–4], partly because we do not understand how to effectively integrate evidence-based treatments (EBTs) into ‘real world’ service settings. Evidence-based treatments are seldom implemented, and when they are, problems with implementation can severely diminish their impact [5]. To improve the quality of care for children, EBTs will need to be complemented by evidence-based approaches to implementation [6]. Thus, the National Institutes of Health and the Institute of Medicine have prioritized efforts to identify, develop, refine, and test implementation strategies [7, 8], which are defined as ‘systematic intervention processes to adopt and integrate evidence-based health innovations into usual care’ [9].

State of the evidence for implementation strategies

While the health and mental health literatures describe many potentially promising implementation strategies [9], the evidence of their effectiveness remains imperfect [10–13]. Most strategies deliver only modest effect sizes [10], and are effective under some, but not all, conditions [14]. Passive strategies, such as disseminating educational materials and continuing education courses, may be useful in increasing knowledge, but are generally not sufficient to change provider behavior [15–18]. Training approaches that incorporate ongoing supervision and consultation can lead to therapist behavior change [15, 18], but it is increasingly recognized that strategies need to move beyond focusing solely on provider-level factors such as knowledge and expertise [19–21]. Indeed, implementing EBTs with fidelity does not always improve outcomes [22], suggesting that other barriers to quality service provision must also be addressed [23]. Implementation is a complex, multi-level process, and existing theoretical and empirical work suggests that ‘best practices’ in implementation would involve the planned use of multiple strategies to address barriers to change that can emerge at all levels of the implementation context [9, 20, 21, 24–28]. There are a number of strategies that extend beyond the provider level [9]; however, in social services research, there are very few randomized studies that test the effectiveness of multi-level implementation strategies (for one exception, see [23]). More research is needed to develop effective ways of tailoring strategies to target implementation barriers [29] and to develop innovative strategies that are efficient, cost-effective, and robust or readily adaptable [30].

The need for a better understanding of implementation as usual

Implementation scientists cannot develop these strategies ‘in a vacuum’ [31]; they must possess a thorough understanding of the service systems and organizational contexts in which these strategies will (hopefully) be adopted [32]. Hoagwood and Kolko warn that ‘it is difficult and perhaps foolhardy to try to improve what you don’t understand’ [31], and note that program implementers and services researchers are often unable to anticipate implementation challenges largely because the context of service delivery has not been adequately described. In other words, there is a need for a better understanding of usual care settings, and in particular, what constitutes ‘implementation as usual’.

Garland et al. acknowledge that ‘studies that “simply” characterize existing practice may not be perceived as innovative or exciting compared to studies that test new innovations’ [33]. However, these studies are ‘a necessary complement – if not precursor’ – to studies that will strengthen knowledge on the implementation of EBTs [31]. Indeed, an increased understanding of implementation as usual has the potential to identify leverage points for implementation, specify targets for improvement, and generate useful insights into the types of implementation processes that are likely to be successful in the real world.

At present, very little is known about the implementation processes that occur in usual care [31, 33, 34]. This highlights the need for descriptive studies that define the range and context of current implementation processes in relation to what is known about ‘best implementation practice’ [35], which (for the purpose of this study) is characterized as the planned use of multiple strategies to address barriers to change at various levels [20, 26, 28, 36]. The current study addresses this need by leveraging a control group of a larger implementation trial that is not receiving an active implementation intervention. Using control groups to examine implementation as usual may yield critical information that can be used to improve the development of implementation strategies. This approach maximizes the use of research funding, illuminates implementation processes within control conditions that may be helpful in understanding the results of larger trials, and ultimately, avoids treating control conditions as ‘black boxes’ that are assumed to have no ‘action’ related to treatment and implementation decisions and processes. The last point constitutes a considerable advantage over studies that focus solely on outcomes obtained by control groups thought to represent ‘usual care’ without generating rich descriptions of what actually occurs in these settings. This study will describe four elements of these organizations that may play a role in determining implementation, service system, and clinical outcomes [37]: patterns of implementation strategy use, implementation decision-making, perceptions of implementation strategies, and organizational social context.

Implementation strategy patterns

There is a paucity of descriptive data pertaining to basic contextual elements of implementation such as organizational operations, staffing patterns, and electronic technologies for tracking service visits in usual care settings [31]. Even less is known about implementation strategy patterns in children’s social service organizations. One exception is Schoenwald and colleagues’ examination of organizations’ use of training, supervision and evaluation [34]. Encouragingly, they found that training and supervisory practices were more or less ‘in line’ with the typical procedures in an effectiveness trial. However, there has yet to be a study that maps a fuller range of potential implementation strategies that extends beyond commonly used strategies such as training and supervision [9]. Thus, very little is known about the types of strategies employed, the frequency and intensity at which they are used, and the conceptual domains and levels of the implementation context that they target.

Organizational decision-making related to implementation processes

Organizational leaders face tremendous challenges when it comes to determining which treatments will be implemented in their settings and how they will be implemented. As Ferlie notes, ‘implementation process is often emergent, uncertain, and affected by the local context and features of action’ [38]. It would be ideal if organizational leaders would base their decisions upon the latest theoretical and empirical findings; however, little is written about how organizational leaders approach implementation decision-making. In particular, we need to know more about whether and how organizational leaders use research related to management and implementation, and the conditions under which they may be more likely to use research [38]. Furthermore, there is a need for more insight into the types (e.g., summaries of implementation barriers and facilitators, reviews of implementation strategies), formats (e.g., statistical or narrative summaries), and sources (e.g., academics, peers from other organizations) of information that organizational leaders find most valuable when making decisions about how to implement EBTs. This will highlight the ways in which implementation research could be made more accessible to organizational leaders, and could inform the development of decision aids that could facilitate the identification, selection, and tailoring of implementation strategies.

Stakeholders’ perceptions of the characteristics of implementation strategies

The characteristics of interventions may play a large role in determining whether or not they are adopted and sustained in the real world [26, 39, 40]. Rogers’ diffusion of innovations theory suggests that innovative treatment models will not likely be adopted unless they are: superior to treatment as usual; compatible with agency practices; no more complex than existing services; easy to try (and reject if it fails); and likely to produce tangible results recognizable by authorities [40, 41]. Other potentially influential characteristics of interventions specified in theoretical models include the intervention source (i.e., the legitimacy of the source and whether it was internally or externally developed), evidence strength and quality, adaptability, design quality and packaging, and costs [26]. While these characteristics are often considered in relation to clinical interventions, they also readily apply to implementation strategies. In fact, a better understanding of stakeholders’ perceptions of implementation strategies may facilitate the process of identifying, developing, and selecting strategies that will be feasible and effective in the real world.

Influence of organizational culture and climate on implementation processes

The conceptual and empirical literatures have underscored the importance of organizational factors such as culture and climate in facilitating or impeding the uptake of innovations [24, 26, 42–44]. ‘Organizational culture’ is what makes an organization unique from others, including its core values and its organizational history of adapting with successes and failures [42]. It involves not only values and patterns related to products and services, but also how individuals within an organization treat and interact with one another [42]. Glisson and colleagues write, ‘Culture describes how the work is done in the organization and is measured as the behavioral expectations reported by members of the organization. These expectations guide the way work is approached and socialize new employees in the priorities of the organization’ [43]. Thus, culture is passed on to new employees and is conceptualized as a rather stable construct that is difficult to change. ‘Organizational climate’ is formed when employees have shared perceptions of the psychological impact of their work environment on their own well-being and functioning in the organization [43].

More constructive or positive organizational cultures and climates are associated with more positive staff morale [45], reduced staff turnover [46], increased access to mental health care [47], improved service quality and outcomes [45, 48, 49], greater sustainability of new programs [46], and more positive attitudes toward EBTs [50]. Yet, it is less clear how culture and climate relate to implementation processes. Knowing more about this relationship would inform efforts to facilitate organizational change. For example, it may be that organizations with poor cultures and climates require more intensive implementation support in order to develop well-coordinated implementation plans that address relevant determinants of practice [51].

This mixed methods multiple case study addresses these gaps in knowledge related to implementation contexts and processes in children’s social service organizations through the following aims:

Aim 1: To identify and characterize the implementation strategies used in community-based children’s social service settings;

Aim 2: To explore how organizational leaders make decisions about which treatments and programs to implement and how to implement them;

Aim 3: To assess stakeholders’ (organizational leaders’ and clinicians’) perceptions of the effectiveness, relative importance, acceptability, feasibility and appropriateness of implementation strategies; and

Aim 4: To examine the relationship between organizational context (culture and climate) and implementation strategy selection, implementation decision-making, and perceptions of implementation strategies.

Aim 1 will rely upon semi-structured interviews with organizational leaders (management and clinical directors) and document review to yield rich descriptions of the implementation strategies employed by seven agencies. This data will be compared to ‘best practices’ in implementation derived from existing theoretical and empirical work [11, 13, 15, 18, 36] to inform future work developing strategies in areas that are currently poorly addressed. It will also allow researchers and administrators to build upon ‘practice-based evidence’ and the strengths of ‘positive deviants’ (i.e., organizations that are consistently effective in implementing change despite a myriad of implementation barriers) [52, 53].

Aim 2 will also use semi-structured interviews with organizational leaders and document review to generate new knowledge about how agency leaders use evidence and other sources of information to make decisions about implementation. Learning more about the type of information that organizational leaders seek, the sources they look to for that information, and the conditions under which they seek that information, may inform future work to make implementation science findings more accessible and ensure that implementation decision-making is based upon the best available theoretical and empirical knowledge in the field.

Aim 3 will utilize focus groups and an online survey to ensure that future work to develop and test implementation strategies will be informed by stakeholders’ (organizational leaders’ and clinicians’) perceptions about the types of strategies that are likely to be effective in the real world.

Aim 4 will examine how organizational social context (culture and climate) facilitates or hinders implementation by linking the data about strategy selection, implementation decision-making, and stakeholders’ perceptions of implementation strategies to organizations’ scores on a standardized measure of culture and climate [43].
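
As a rough illustration of how such a linkage could be explored, the sketch below pairs a composite climate score with the number of strategies coded from each case’s interviews. All values, and the reduction of a multidimensional culture/climate profile to a single number, are assumptions made purely for illustration:

```python
import pandas as pd

# Hypothetical per-case data: a composite climate score and the number of
# distinct implementation strategies coded from that case's interviews.
cases = pd.DataFrame({
    "case": [f"org_{i}" for i in range(1, 8)],   # the seven organizations
    "climate_score": [52, 61, 47, 70, 58, 44, 66],
    "strategies_coded": [4, 7, 3, 9, 6, 2, 8],
})

# With only seven cases this is exploratory pattern-matching, not formal
# inference: a rank correlation simply orders the cases for comparison.
rho = cases["climate_score"].corr(cases["strategies_coded"], method="spearman")
print(f"Spearman rho = {rho:.2f}")
```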

Guiding conceptual frameworks

The proposed study is informed by two conceptual frameworks: the Consolidated Framework for Implementation Research (CFIR) [26] and Grol and Wensing’s implementation of change model [36]. These models will be integrated in all stages of the research process, including conceptualization (e.g., selecting implementation processes on which to focus), data collection (e.g., using components of the conceptual models as interview questions and probes), analysis (e.g., determining how comprehensively organizations are addressing constructs essential to implementation success, comparing ‘implementation as usual’ to ‘best practices’), and dissemination (e.g., framing findings conceptually so that they will be comparable to other implementation studies).

The CFIR was developed for the purpose of serving as a common reference to the many constructs that have been identified as important to implementation success [26]. It identifies five major domains related to implementation, including: intervention characteristics, the outer setting, the inner setting, the characteristics of the individuals involved, and the process of implementation. Detailed definitions of the 39 constructs included in the CFIR can be found in the supplementary materials associated with that article [26]. It captures the complex, multi-level nature of implementation, and suggests that successful implementation may necessitate the use of an array of strategies that target multiple levels of the implementation context [9]. The CFIR has informed the semi-structured interview guide (see Additional file 1) by specifying specific probes for eliciting descriptions of implementation strategies across various ‘levels’. It will also be used to assess the comprehensiveness of organizations’ approaches to implementation. For example, an organization that focuses only on the ‘characteristics of individuals’ while neglecting other domains such as ‘intervention characteristics’ or the ‘inner setting’ would have a less comprehensive approach to implementation than an organization that addresses all three (or more) of those domains.
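
As a toy illustration of that comprehensiveness check, the sketch below counts how many of the five CFIR domains an organization’s coded strategies touch. The strategy names and their domain tags are invented for illustration, not drawn from the CFIR itself:

```python
# The five CFIR domains named above.
CFIR_DOMAINS = {
    "intervention characteristics",
    "outer setting",
    "inner setting",
    "characteristics of individuals",
    "implementation process",
}

# Hypothetical coding output: each reported strategy tagged with the
# domain(s) it targets.
coded_strategies = {
    "ongoing clinical supervision": {"characteristics of individuals"},
    "staff training workshops": {"characteristics of individuals"},
    "restructured intake workflow": {"inner setting"},
}

# An organization's comprehensiveness is the share of domains it addresses.
covered = set().union(*coded_strategies.values())
print(f"Domains addressed: {len(covered)} of {len(CFIR_DOMAINS)} -> {sorted(covered)}")
```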

Grol and Wensing’s implementation of change model informs this research by specifying a process of implementation that begins with identifying problems or gaps in care, identifying ESTs or other best practices, carefully planning the implementation effort, developing a proposal with targets for improvement or change, analyzing current performance, developing implementation strategies, executing the implementation plan, and continuously evaluating and (if necessary) adapting the plan [36]. The model provides a structure and a process to implementation that the CFIR lacks. It also emphasizes an important aspect of implementation ‘best practice,’ namely, that while implementation processes may be complex, necessitating iterative and flexible approaches [54, 55], they should be planned and deliberate rather than haphazard. The implementation of change model has also informed the development of the interview guide informing Aims 1 and 2.

This study employs a mixed methods multiple case study design, in which each participating organization (n = 7) is conceptualized as a ‘case’ [56, 57]. Case studies are particularly helpful in understanding the internal dynamics of change processes, and including multiple cases capitalizes on organizational variation and permits an examination of how contextual factors influence implementation [58]. Leaders in the field have emphasized the importance of using case study and other mixed methods observational designs to develop a more nuanced, theoretically informed understanding of change processes [59–64]. The study relies upon the ‘sequential collection and analysis of qualitative and quantitative data, beginning with qualitative data, for the primary purpose of exploration and hypothesis generation,’ or a QUAL → quan approach [64]. This serves the primary function of ‘development,’ as collecting qualitative data in Aims 1 to 3 affords the opportunity to examine the impact of organizational context in Aim 4 [64]. It serves the secondary function of ‘convergence’ by using quantitative and qualitative data to answer the same questions in Aim 3 [64].

The study will be conducted in the control arm of a U.S. National Institute of Mental Health funded randomized controlled trial (RCT) [65] of the Availability, Responsiveness, and Continuity (ARC) organizational implementation strategy [23, 49, 66], which affords a unique opportunity to study implementation as usual. The sample includes seven children’s social service organizations in a Midwestern city that reflect the characteristics of children’s mental health service providers nationwide [34] in that they are characterized by nonprofit organizational structures, they employ therapists that have master’s and bachelor’s degrees, and they are composed of a predominantly social work staff.

Not all participating organizations may currently be implementing EBTs; however, they will likely be able to discuss strategies they have used to implement other clinical programs, services, or treatment models [46]. Thus, we will maintain an inclusive stance toward the types of programs and practices that organizations are implementing. This is warranted given that the primary scientific objective is to learn more about the processes and contexts of implementation rather than the particulars of implementing a specific EBT or class of EBTs.

While sampling logic should not be used in multiple case study research [57, 67], seven cases are expected to be enough to ‘replicate’ findings across cases [57]. Yin writes that each ‘case’ (organization) is in essence treated as a separate study that either predicts similar results (literal replication) or predicts contrasting results but for anticipatable reasons (theoretical replication) [57]. In the present study, organizations with the worst cultures and climates may be expected to demonstrate similar implementation processes and perceptions of strategies (i.e., literal replication), whereas organizations with more positive cultures and climates may embrace a much different set of implementation processes and perceptions of strategies (i.e., theoretical replication).

Data collection

The proposed study will rely upon qualitative data from semi-structured interviews (Aims 1, 2, and 4), document review (Aims 1, 2, and 4), and focus groups (Aim 3). Additionally, quantitative data from a project-specific survey being developed (described below) and the Organizational Social Context (OSC) measure [ 43 ] will be used to accomplish Aims 3 and 4, respectively (see Table 1).

Qualitative data collection

Semi-structured interviews

Semi-structured interviews will be conducted with organizational leaders (e.g., management and clinical supervisors) from each participating organization. The interviews will explore the implementation strategies their agencies have employed within the past year (Aim 1) and their approach to implementation decision-making (Aim 2). Interviews will be conducted by the lead author and will be structured by an interview guide (Additional file 1) informed by a review of implementation strategies [ 9 ] and the guiding conceptual models [ 26 , 36 ]. Specifically, the interview guide contains questions and prompts that will encourage participants to consider the implementation strategies that their organization has employed at multiple levels of the implementation context as specified by the CFIR [ 26 ] and the Powell et al. taxonomy [ 9 ] (e.g., asking if their organization used strategies related to the intervention, the policy or inter-organizational level, and the organization’s structure and functioning in addition to more commonly considered individual-level and process-level strategies). Through the process of snowball sampling [ 68 ], each participant will be asked to identify other employees who possess the requisite knowledge and experience to inform the study’s objectives. It is estimated that each organization will identify between three and five key informants, resulting in approximately 21 to 35 total interviews. Many agencies may not have more than this number of individuals who have direct knowledge of the use of implementation strategies [ 69 ] and, more importantly, of the decision-making processes surrounding implementation.

Guest and colleagues emphasize that very small samples can yield complete and accurate information as long as the respondents have the appropriate expertise in the domain of inquiry [ 70 ]. Further, a main benefit of the multiple case study design is obtaining different sources of information that will be used to triangulate the interview data [ 57 , 64 ]. Interviews will last 60 to 90 minutes and will be digitally recorded. Immediately following each interview, the interviewer will complete field notes that will capture the main themes of the interview and any information that is pertinent to the study aims [ 71 , 72 ]. Interviews and field notes will be transcribed and entered into NVivo, version 10, for data analysis.

Document review

The study will also involve a review of publicly available and organization-provided documents. Organizational leaders will be asked to provide access to any documents that describe and formalize implementation processes. For example, these processes may be captured in notes from a board meeting in which the implementation of a new program or practice was discussed, or in an organization’s response to a request for proposals that seeks funding for a particular training or implementation-related resource. Other documents may include (but are certainly not limited to) formal implementation or quality improvement plans, annual reports, and program manuals. These sources will serve to augment or triangulate interview respondents’ descriptions of implementation strategies and decision-making processes. With permission from the organizations, potentially useful documents will be obtained and entered into NVivo, version 10, for analysis.

Focus group interviews

Focus groups involving approximately four to eight clinicians (or direct care staff members) will be conducted in each participating organization to capture the depth and nuances of their perceptions of strategies. The number of participants per focus group is consistent with Barbour’s recommendation of a minimum of three or four participants and a maximum of eight [ 73 ]. The number of focus groups (one per agency) is appropriate because the relatively homogenous population (e.g., clinicians at a given agency) and the structured and somewhat narrow scope of inquiry reduce the number of individuals needed to reach saturation [ 70 ]. Further, the quantitative data will serve to triangulate the focus group data [ 57 , 64 ], reducing the need for a larger sample size. The focus groups will be conducted by the first author and a research assistant. The sessions will be guided by a structured interview guide (Additional file 2) informed by a conceptual taxonomy of implementation outcomes [ 74 ]. Participants will be asked to discuss the implementation strategies that they have used at their organization, and the facilitator(s) will record each strategy mentioned on a whiteboard so that all participants can see the running list. Additional strategies drawn from the literature may be listed if the participants focus on a relatively narrow range of strategies. Participants will then be asked to reflect upon the effectiveness, acceptability, feasibility, and appropriateness of the listed strategies. Although the primary purpose of the focus group interviews is to assess participants’ perceptions of various implementation strategies, it is also possible that these individuals will provide information about implementation strategies used at their organization that were not captured in the semi-structured interviews with organizational leaders.

Each focus group will last approximately 60 to 90 minutes and will be digitally recorded. As with the individual interviews, the interviewer will complete field notes following the focus groups that will document the main themes of the session and any observations pertinent to the study aims. The interviews and the field notes will be transcribed and entered into NVivo, version 10, for analysis.

Quantitative survey data

Survey of stakeholders’ perceptions of implementation strategies

A project-specific self-administered web-based survey will be developed to assess stakeholders’ perceptions and experiences with specific implementation strategies. The implementation strategies included in the survey will be generated from the qualitative work in Aims 1, 2, and 3 and a published ‘menu’ that describes 68 distinct implementation strategies [ 9 ]. In order to ensure a relatively low burden to respondents, it is unlikely that more than 40 strategies will be included. Decisions about the inclusion of strategies will be driven by the qualitative analysis (i.e., using the strategies mentioned by organizational leaders and clinicians), while attempts will be made to include strategies that address a number of different targets as specified in the CFIR [ 26 ].

It should also be noted that Powell and colleagues’ compilation includes a number of strategies that could not be reasonably adopted by the participants of this study (e.g., ‘centralize technical assistance’) [ 9 ], and those strategies will be eliminated. The survey will also be informed by a conceptual taxonomy of implementation outcomes [ 74 ] and other existing surveys drawn from implementation science measures collections [ 75 , 76 ]. In addition to basic demographic questions, stakeholders will be asked whether or not they have experienced each included implementation strategy (yes or no) and will then rate each strategy (using a Likert-style scale) on the following dimensions: ‘effectiveness’ and ‘relative importance’ (i.e., How well did it work and how important was it relative to other strategies?), ‘acceptability’ (i.e., How agreeable, palatable, or satisfactory is the strategy?), ‘feasibility’ (i.e., the perception that the strategy has been or could be successfully used within a given setting), and ‘appropriateness’ (i.e., the perceived fit, relevance, or compatibility of the strategy with the setting). This survey will be administered via an email with a link to the online survey, and will be pilot tested to ensure face validity and ease of use.
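
Because the survey will pair one yes/no exposure item with several Likert ratings per strategy, a long-format table is a natural way to hold the responses. The sketch below is purely illustrative: the field names, strategy labels, and values are hypothetical placeholders rather than the actual survey items.

```python
import pandas as pd

# Hypothetical long format: one row per respondent x strategy.
# Respondents who have not experienced a strategy skip the ratings (NaN).
responses = pd.DataFrame([
    {"respondent": "r01", "case": "org1", "strategy": "ongoing training",
     "experienced": 1, "effectiveness": 4, "importance": 3,
     "acceptability": 5, "feasibility": 4, "appropriateness": 4},
    {"respondent": "r02", "case": "org1", "strategy": "ongoing training",
     "experienced": 0},
    {"respondent": "r03", "case": "org2", "strategy": "audit and feedback",
     "experienced": 1, "effectiveness": 3, "importance": 4,
     "acceptability": 3, "feasibility": 2, "appropriateness": 3},
])

# Share of respondents reporting exposure to each strategy, and mean
# ratings among those who experienced it (NaN rows are skipped).
print(responses.groupby("strategy")["experienced"].mean())
print(responses[responses["experienced"] == 1]
      .groupby("strategy")[["effectiveness", "acceptability"]].mean())
```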

Organizational social context (OSC) survey

The OSC is a standardized measure that assesses organizational culture, climate, and work attitudes (the latter of which is not being used for the current study) using 105 Likert-style items [ 43 ]. Culture is assessed in terms of an organization’s level of ‘rigidity’ (centralization, formalization), ‘proficiency’ (responsiveness, competence), and ‘resistance’ (apathy, suppression). The ‘best’ organizational cultures are highly proficient and not very rigid or resistant, while the ‘worst’ cultures are not very proficient and are highly rigid and resistant to change or new ideas. Climate is assessed with three second-order factors: ‘engagement’ (personalization, personal accomplishment), ‘functionality’ (growth and achievement, role clarity, cooperation), and ‘stress’ [ 43 ]. The ‘best’ organizational climates are described as being highly engaged, highly functional, and low in stress [ 43 ]. Cronbach’s alphas for the OSC subscales (rigidity, proficiency, resistance, stress, engagement, functionality) range from 0.78 to 0.94. The OSC will be administered on site, and a research assistant will assure respondents that their responses will remain confidential.
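
Cronbach’s alpha, the reliability statistic reported above, has a simple closed form: alpha = k/(k - 1) x (1 - sum of item variances / variance of the summed scale) for a subscale of k items. The sketch below is a minimal illustration of that formula with toy data; it is not the scoring procedure used by the measure’s developers, which the protocol delegates to the University of Tennessee.

```python
import numpy as np

def cronbachs_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) matrix of one subscale."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    scale_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / scale_variance)

# Toy data: five respondents answering a four-item subscale on a 1-5 scale.
toy = np.array([[4, 4, 5, 4],
                [2, 3, 2, 2],
                [5, 4, 4, 5],
                [3, 3, 3, 2],
                [4, 5, 4, 4]])
print(round(cronbachs_alpha(toy), 2))
```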

Data analysis

Qualitative data analysis

Qualitative data from semi-structured interviews, document review, and focus groups will be imported and analyzed (separately) in NVivo using qualitative content analysis [ 77 – 80 ], which has been used successfully in similar studies [ 81 – 83 ]. Content analysis enables a theory-driven approach and an examination of both manifest (i.e., the actual words used) and latent (i.e., the underlying meaning of the words) content [ 72 ]. Accordingly, analysis will be informed by the guiding conceptual models, with additional patterns, themes, and categories being allowed to emerge from the data [ 72 , 84 ]. The first author and a doctoral student research assistant will independently co-code a sample of the transcripts to increase reliability and reduce bias [ 72 , 85 ]. Both coders will participate in a frame-of-reference training to ensure a common understanding of the core concepts related to the research aims [ 82 ]. Disagreements will be discussed and resolved through consensus. Initially, the coders will review the transcripts to develop a general understanding of the content. ‘Memos’ will be generated to document initial impressions and define the parameters of specific codes. Next, the data will be condensed into analyzable units (text segments), which will be labeled with codes based on a priori (i.e., derived from the interview guide or guiding theories) or emergent themes that will be continually refined and compared to each other. For instance, the implementation of change model [ 36 ] will be used to develop a priori codes such as ‘identifying programs and practices’ or ‘planning’ related to implementation decision-making. The CFIR [ 26 ] will be used in a similar fashion by contributing a priori codes that will serve to distinguish different types of implementation strategies, such as strategies that focus on the ‘inner setting’ or the ‘outer setting’. Finally, the categories will be aggregated into broader themes related to implementation strategy patterns, implementation decision-making, and stakeholders’ perceptions of strategies.
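
The protocol resolves coding disagreements through consensus discussion rather than a statistic. For readers who want a numerical check on double-coded segments, Cohen’s kappa is one standard agreement measure, sketched below with invented codes and segments; it is offered purely as an illustration, not as part of the study’s analysis plan.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: chance-corrected agreement between two coders."""
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in set(coder_a) | set(coder_b)) / n ** 2
    return (observed - expected) / (1 - expected)

# Invented example: codes assigned by two coders to eight text segments.
coder_1 = ["planning", "inner setting", "planning", "outer setting",
           "planning", "inner setting", "outer setting", "planning"]
coder_2 = ["planning", "inner setting", "inner setting", "outer setting",
           "planning", "inner setting", "outer setting", "planning"]
print(round(cohens_kappa(coder_1, coder_2), 2))  # 0.81
```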

The use of multiple respondents is intentional, as some individuals may be more or less knowledgeable about their organization’s approach to implementation; however, it is possible that participants from a given agency may not endorse the use of the same strategies [ 86 ]. The approach to handling such ‘discrepancies’ will be one of inclusion, in that each unique strategy endorsed will be recorded as ‘in use’ at that agency (for an example of this approach, see Hysong et al. [ 82 ]). If participants’ responses regarding strategies vary widely within a given organization, it may be indicative of a lack of a coherent or consistent strategy [ 86 ]. The use of mixed methods and multiple sources of data will allow us to make sense of reported variation in strategy use by affording the opportunity to determine the extent to which these sources of data converge [ 57 , 64 , 86 , 87 ]. The use of multiple respondents and different sources of data also reduces the threat of bias that is sometimes associated with the collection of retrospective accounts of phenomena such as business strategy [ 69 ].
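
Operationally, the inclusive rule amounts to taking the union of strategies endorsed by any respondent within an agency, while the overlap (or lack of it) across respondents flags possible incoherence in strategy. A schematic sketch with hypothetical respondents and strategy names:

```python
# Hypothetical endorsements: agency -> respondent -> strategies mentioned.
endorsements = {
    "org1": {
        "leader_1": {"ongoing training", "clinical supervision"},
        "leader_2": {"ongoing training", "audit and feedback"},
    },
}

for agency, by_respondent in endorsements.items():
    in_use = set().union(*by_respondent.values())       # inclusive rule: any mention counts
    shared = set.intersection(*by_respondent.values())  # strategies endorsed by everyone
    print(agency, "recorded as in use:", sorted(in_use))
    print(agency, "endorsed by all respondents:", sorted(shared))
```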

Quantitative data analysis

The developed survey capturing stakeholders’ perceptions of implementation strategies will yield descriptive data that will augment the qualitative data from semi-structured interviews, document review, and focus groups. In the cross-case analysis, these data will be compared to determine differences and similarities between cases. Data will also be pooled across all seven cases to reveal an overall picture of strategy use, as well as perceived effectiveness, relative importance, acceptability, feasibility, and appropriateness of implementation strategies.
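
Under the hypothetical long-format layout sketched earlier, both the case-by-case comparison and the pooled picture reduce to simple aggregations; the two toy rows below stand in for the full dataset.

```python
import pandas as pd

# Continuing the hypothetical `responses` layout from the earlier sketch.
responses = pd.DataFrame([
    {"case": "org1", "experienced": 1, "effectiveness": 4, "acceptability": 5},
    {"case": "org2", "experienced": 1, "effectiveness": 3, "acceptability": 3},
])

rated = responses[responses["experienced"] == 1]
print(rated.groupby("case")[["effectiveness", "acceptability"]].mean())  # per-case comparison
print(rated[["effectiveness", "acceptability"]].mean())                  # pooled across cases
```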

Results from the OSC measure will be analyzed and interpreted in consultation with its developer according to procedures described by Glisson et al. [ 43 ]. Scoring will be completed at the University of Tennessee’s Children’s Mental Health Services Research Center, including the generation of internal reliability estimates (alpha), agreement indices for organizational unit profiles, and t-scores for culture and climate. The resulting organizational profiles can be compared to norms from a nationwide sample of 1,154 clinicians in 100 mental health clinics, which affords the opportunity to determine the generalizability of study findings beyond the selected sites. The OSC data will serve to characterize the organizations’ culture and climate in individual case descriptions. Additionally, organizations will be stratified by their OSC profiles in order to differentiate more positive cultures (highly proficient and not very rigid or resistant) and climates (highly engaged, highly functional, low stress) from less positive cultures (low proficiency, highly rigid and resistant) and climates (low engagement and functionality, high stress) [ 43 ]. Qualitative results will then be categorized according to those OSC profiles to determine whether strategy patterns, approaches to decision-making, and perceptions of strategies vary by organizational culture and climate.
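
Because OSC scores are reported as t-scores (normed to a mean of 50 and standard deviation of 10), the stratification can be expressed as a simple rule over the culture and climate dimensions. The cutoff of 50 and the field names below are illustrative assumptions only, not the profiling procedure used by the measure’s developers.

```python
# Hypothetical t-scores per organization.
osc_scores = {
    "org1": {"proficiency": 61, "rigidity": 42, "resistance": 40,
             "engagement": 58, "functionality": 60, "stress": 41},
    "org2": {"proficiency": 44, "rigidity": 59, "resistance": 57,
             "engagement": 43, "functionality": 45, "stress": 62},
}

def osc_profile(s: dict) -> str:
    """Illustrative rule: above the norm where high is good, below where low is good."""
    positive_culture = s["proficiency"] > 50 and s["rigidity"] < 50 and s["resistance"] < 50
    positive_climate = s["engagement"] > 50 and s["functionality"] > 50 and s["stress"] < 50
    return "more positive" if positive_culture and positive_climate else "less positive"

strata = {org: osc_profile(s) for org, s in osc_scores.items()}
print(strata)  # {'org1': 'more positive', 'org2': 'less positive'}
```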

Mixed methods analysis

As previously mentioned, the structure of this study is QUAL → quan, meaning that qualitative methods precede quantitative and that they are predominant [ 64 , 88 ]. This serves the primary function of ‘development,’ as collecting qualitative data in Aims 1 to 3 affords the opportunity to examine the impact of organizational context in Aim 4. It also serves the function of ‘convergence’ by using quantitative and qualitative data to answer the same question in Aim 3 [ 64 ].

The processes of ‘mixing’ the qualitative and quantitative data flow directly from these functions. To serve the function of ‘development,’ the quantitative data on organizational social context [ 43 ] are connected with the qualitative and quantitative results from Aims 1 to 3 regarding implementation strategy use, implementation decision-making, and stakeholder perceptions of implementation strategies [ 64 ]. Assuming there is a meaningful relationship between organizational social context and the data from Aims 1 to 3, this can be shown in a joint display [ 88 ] that categorizes the themes emerging from the qualitative and quantitative data based upon the OSC profiles [ 43 ] as described above. For example, a separate table may be used to show how implementation strategy patterns differ based upon organizational social context. Examples of this approach can be found in Killaspy et al. [ 89 ] and Hysong et al. [ 82 ], and are also detailed in Creswell and Plano Clark’s methods book [ 88 ].
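
Mechanically, a joint display of this kind is just a matrix with OSC strata on one axis and the condensed findings in the cells. The pandas sketch below is hypothetical; every cell is an invented placeholder for a summarized finding.

```python
import pandas as pd

# Hypothetical joint display: rows are study aims, columns are OSC strata.
joint_display = pd.DataFrame(
    {
        "more positive context": ["deliberate, multi-level planning",
                                  "strategies seen as feasible and appropriate"],
        "less positive context": ["ad hoc, reactive strategy selection",
                                  "strategies seen as imposed and burdensome"],
    },
    index=["implementation decision-making (Aim 2)",
           "perceptions of strategies (Aim 3)"],
)
print(joint_display)
```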

To serve the function of ‘convergence,’ the qualitative data and quantitative data will be merged in order to answer the same question, which for Aim 3 is, ‘What are implementation stakeholders’ perceptions of implementation strategies?’ These data are merged for the purpose of triangulation; in this case, to use the quantitative data from the stakeholder perceptions survey to validate and confirm the qualitative findings from the focus-group interviews. Once again, this process can be depicted through a table placing qualitative themes side by side with the quantitative findings to show the extent to which the data converge [ 88 ].
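
The convergence table can take the same matrix form, pairing each focus-group theme with the corresponding survey statistic so that agreement or divergence is visible at a glance. All cell contents below are invented placeholders.

```python
import pandas as pd

# Hypothetical side-by-side display for Aim 3.
convergence = pd.DataFrame([
    {"strategy": "clinical supervision",
     "qualitative theme": "valued but time-consuming",
     "mean acceptability (1-5)": 4.2, "converges": True},
    {"strategy": "external audits",
     "qualitative theme": "perceived as punitive",
     "mean acceptability (1-5)": 3.9, "converges": False},
])
print(convergence.to_string(index=False))
```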

It is worth noting that the approaches to ‘mixing’ qualitative and quantitative data will be used at both the case-level and the cross-case level (as described below).

Cross-case analysis

A primary benefit of a multiple case study is the ability to make comparisons across cases. The proposed study will utilize cross-case synthesis [ 57 ], which treats individual cases as separate studies that are then compared to identify similarities and differences between the cases. This will involve creating word tables or matrices that will display the data according to a uniform framework [ 57 , 84 ]. For example, data from the first three aims (strategy patterns, implementation decision-making, and stakeholder perceptions) will be categorized in Aim 4 based upon the cases’ OSC profiles [ 43 ]. This approach will be used to compare across cases for each of the proposed aims, allowing for meaningful similarities, differences, and site-specific experiences to emerge from the data [ 56 , 57 ].
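
A word table for cross-case synthesis is likewise a case-by-category matrix, one row per case with uniform columns across the aims; the entries below are invented placeholders showing the shape of such a table.

```python
import pandas as pd

# Hypothetical cross-case word table with a uniform column framework.
word_table = pd.DataFrame(
    {
        "strategy pattern (Aim 1)": ["training-centric", "multi-level"],
        "decision-making (Aim 2)": ["leader-driven", "shared governance"],
        "staff perceptions (Aim 3)": ["mixed", "largely favorable"],
        "OSC profile (Aim 4)": ["less positive", "more positive"],
    },
    index=["org1", "org2"],
)
print(word_table)
```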

Limitations

A number of limitations should be considered. There is some concern that the organizations in the sample will not be comparable since they will not all be implementing the same programs and practices. There are several protections against this danger. First, while there is evidence to suggest that specific programs and practices will require unique implementation strategies (e.g., [ 90 ]), implementation strategies can also be viewed as more general components of an organization’s infrastructure [ 34 ]. In fact, this view of implementation strategies may become more salient as we begin to shift the focus away from implementing solitary practices and toward fostering evidence-based systems and “learning organizations” capable of implementing a number of EBTs well [ 91 ]. Obtaining descriptive data about the types of implementation strategies that organizations are currently using is a first step toward determining which strategies may need to be routinized in organizations and systems of care. Second, while they may not all be implementing the same interventions, the organizations in this sample are comparable in terms of client need, service provision, funding requirements, and other external or ‘outer setting’ factors [ 26 ]. Third, programs and practices can be compared in meaningful ways based upon their characteristics [ 39 , 40 , 92 ].

The cross-sectional nature of the data will not reveal how implementation processes change over time. Additionally, recall bias may limit the accuracy of participants’ memories of implementation processes. The use of multiple informants and data sources ( i.e ., triangulation) will increase the validity of findings and minimize the threat of this bias [ 57 , 58 ].

A final challenge is the lack of existing surveys that can assess stakeholder perceptions of strategies; however, the web-based survey will be informed by theories related to the intervention characteristics associated with increased adoption [ 26 , 39 , 40 ], related surveys [ 75 ], a taxonomy of implementation outcomes [ 74 ], and other emerging measurement models (e.g., [ 93 ]).

Trial status

The Institutional Review Board at Washington University in St. Louis has approved all study procedures. Recruitment and data collection for this study began in March of 2013.

Improving the quality of children’s social services will require ‘making the right thing to do, the easy thing to do’ [ 94 ] by providing organizational leaders and clinicians with the tools they need to provide evidence-based care. In order for this to be accomplished, there is much we need to know about the approaches to implementation that routinely occur, the ‘on the ground’ perspectives of organizational stakeholders regarding the types of implementation strategies that are likely to work, and the ways in which organizational context impacts implementation processes. This study represents a novel approach to studying implementation as usual in the control group of an implementation RCT. By shedding light on ‘implementation as usual’ in children’s social service settings, this study will inform efforts to develop and tailor strategies, propelling the field toward the ideal of evidence-based implementation.

Abbreviations

ARC: Availability, Responsiveness, and Continuity

CFIR: Consolidated Framework for Implementation Research

EBTs: Evidence-Based Treatments

NRSA: National Research Service Award

NIMH: National Institute of Mental Health

OSC: Organizational Social Context

RCT: randomized controlled trial

QUAL: qualitative (dominant method)

quan: quantitative (subordinate method)

References

1. Garland AF, Brookman-Frazee L, Hurlburt MS, Accurso EC, Zoffness RJ, Haine-Schlagel R, Ganger W: Mental health care for children with disruptive behavior problems: a view inside therapists’ offices. Psychiatr Serv. 2010, 61: 788-795.

2. Kohl PL, Schurer J, Bellamy JL: The state of parent training: program offerings and empirical support. Fam Soc. 2009, 90: 247-254.

3. Raghavan R, Inoue M, Ettner SL, Hamilton BH: A preliminary analysis of the receipt of mental health services consistent with national standards among children in the child welfare system. Am J Public Health. 2010, 100: 742-749.

4. Zima BT, Hurlburt MS, Knapp P, Ladd H, Tang L, Duan N, Wallace P, Rosenblatt A, Landsverk J, Wells KB: Quality of publicly-funded outpatient specialty mental health care for common childhood psychiatric disorders in California. J Am Acad Child Adolesc Psychiatry. 2005, 44: 130-144.

5. Durlak JA, DuPre EP: Implementation matters: a review of research on the influence of implementation on program outcomes and the factors affecting implementation. Am J Community Psychol. 2008, 41: 327-350.

6. Grol R, Grimshaw JM: Evidence-based implementation of evidence-based medicine. Jt Comm J Qual Improv. 1999, 25: 503-513.

7. Institute of Medicine: Initial National Priorities for Comparative Effectiveness Research. 2009, Washington, DC: The National Academies Press.

8. Dissemination and implementation research in health (R01). http://grants.nih.gov/grants/guide/pa-files/PAR-13-055.html

9. Powell BJ, McMillen JC, Proctor EK, Carpenter CR, Griffey RT, Bunger AC, Glass JE, York JL: A compilation of strategies for implementing clinical innovations in health and mental health. Med Care Res Rev. 2012, 69: 123-157.

10. Grimshaw JM, Eccles M, Thomas R, MacLennan G, Ramsay C, Fraser C, Vale L: Toward evidence-based quality improvement: evidence (and its limitations) of the effectiveness of guideline dissemination and implementation strategies 1966–1998. J Gen Intern Med. 2006, 21 (2): S14-20.

11. Grol R, Wensing M, Eccles M: Improving Patient Care: The Implementation of Change in Clinical Practice. 2005, Edinburgh: Elsevier.

12. Powell BJ, Proctor EK, Glass JE: A systematic review of implementation strategies in mental health service settings. 2011, Seattle, Washington.

13. Straus S, Tetroe J, Graham ID: Knowledge Translation in Health Care: Moving from Evidence to Practice. 2009, Hoboken, NJ: Wiley-Blackwell.

14. The Improved Clinical Effectiveness through Behavioural Research Group (ICEBeRG): Designing theoretically-informed implementation interventions. Implement Sci. 2006, 1: 1-8.

15. Beidas RS, Kendall PC: Training therapists in evidence-based practice: a critical review of studies from a systems-contextual perspective. Clin Psychol Sci Pract. 2010, 17: 1-30.

16. Davis DA, Davis N: Educational interventions. Knowledge Translation in Health Care: Moving from Evidence to Practice. Edited by: Straus S, Tetroe J, Graham ID. 2009, Oxford, UK: Wiley-Blackwell, 113-123.

17. Herschell AD, McNeil CB, Urquiza AJ, McGrath JM, Zebell NM, Timmer SG, Porter A: Evaluation of a treatment manual and workshops for disseminating Parent-Child Interaction Therapy. Adm Policy Ment Health. 2009, 36: 63-81.

18. Herschell AD, Kolko DJ, Baumann BL, Davis AC: The role of therapist training in the implementation of psychosocial treatments: a review and critique with recommendations. Clin Psychol Rev. 2010, 30: 448-466.

19. Flanagan ME, Ramanujam R, Doebbeling BN: The effect of provider- and workflow-focused strategies for guideline implementation on provider acceptance. Implement Sci. 2009, 4: 1-10.

20. Solberg LI, Brekke ML, Fazio CJ, Fowles J, Jacobsen DN, Kottke TE, Mosser G, O’Connor PJ, Ohnsorg KA, Rolnick SJ: Lessons from experienced guideline implementers: attend to many factors and use multiple strategies. Journal on Quality Improvement. 2000, 26: 171-188.

21. Wensing M, Bosch M, Grol R: Selecting, tailoring, and implementing knowledge translation interventions. Knowledge Translation in Health Care: Moving from Evidence to Practice. Edited by: Straus S, Tetroe J, Graham ID. 2009, Oxford, UK: Wiley-Blackwell, 94-113.

22. Weisz JR, Chorpita BF, Palinkas LA, Schoenwald SK, Miranda J, Bearman SK, Daleiden EL, Ugueto AM, Ho A, Martin J, Gray J, Alleyne A, Langer DA, Southam-Gerow MA, Gibbons RD, the Research Network on Youth Mental Health: Testing standard and modular designs for psychotherapy treating depression, anxiety, and conduct problems in youth: a randomized effectiveness trial. Arch Gen Psychiatry. 2012, 69: 274-282.

23. Glisson C, Schoenwald S, Hemmelgarn A, Green P, Dukes D, Armstrong KS, Chapman JE: Randomized trial of MST and ARC in a two-level evidence-based treatment implementation strategy. J Consult Clin Psychol. 2010, 78: 537-550.

24. Aarons GA, Hurlburt M, Horwitz SM: Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011, 38: 4-23.

25. Bero LA, Grilli R, Grimshaw JM, Harvey E, Oxman AD, Thomson MA: Getting research findings into practice: closing the gap between research and practice: an overview of systematic reviews of interventions to promote the implementation of research findings. Br Med J. 1998, 317: 465-468.

26. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC: Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009, 4: 1-15.

27. Shortell SM: Increasing value: a research agenda for addressing the managerial and organizational challenges facing health care delivery in the United States. Med Care Res Rev. 2004, 61: 12S-30S.

28. Solberg LI: Guideline implementation: what the literature doesn’t tell us. Journal on Quality Improvement. 2000, 26: 525-537.

29. Baker R, Cammosso-Stefinovic J, Gillies C, Shaw EJ, Cheater F, Flottorp S, Robertson N: Tailored interventions to overcome identified barriers to change: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2010, CD005470. doi:10.1002/14651858.CD005470.pub2.

30. Mittman BS: Criteria for peer review of D/I funding applications. 2010, St. Louis, Missouri.

31. Hoagwood KE, Kolko DJ: Introduction to the special section on practice contexts: a glimpse into the nether world of public mental health services for children and families. Adm Policy Ment Health. 2009, 36: 35-36.

32. Proctor EK, Rosen A: From knowledge production to implementation: research challenges and imperatives. Res Soc Work Pract. 2008, 18: 285-291.

33. Garland AF, Bickman L, Chorpita BF: Change what? Identifying quality improvement targets by investigating usual mental health care. Adm Policy Ment Health. 2010, 37: 15-26.

34. Schoenwald SK, Chapman JE, Kelleher K, Hoagwood KE, Landsverk J, Stevens J, Glisson C, Rolls-Reutz J: A survey of the infrastructure for children’s mental health services: implications for the implementation of empirically supported treatments (ESTs). Adm Policy Ment Health. 2008, 35: 84-97.

35. Fixsen DL, Blase KA, Naoom SF, Wallace F: A National Plan of Implementation Research. 2005, Chapel Hill, NC: National Implementation Research Network.

36. Grol R, Wensing M: Effective implementation: a model. Improving Patient Care: The Implementation of Change in Clinical Practice. Edited by: Grol R, Wensing M, Eccles M. 2005, Edinburgh: Elsevier, 41-57.

37. Proctor EK, Landsverk J, Aarons GA, Chambers DA, Glisson C, Mittman BS: Implementation research in mental health services: an emerging science with conceptual, methodological, and training challenges. Adm Policy Ment Health. 2009, 36: 24-34.

38. Ferlie E: Organizational interventions. Knowledge Translation in Health Care: Moving from Evidence to Practice. Edited by: Straus S, Tetroe J, Graham ID. 2009, Oxford, UK: Wiley-Blackwell, 144-150.

39. Grol R, Bosch MC, Hulscher MEJ, Eccles MP, Wensing M: Planning and studying improvement in patient care: the use of theoretical perspectives. Milbank Q. 2007, 85: 93-138.

40. Rogers EM: Diffusion of Innovations. 2003, New York: Free Press, 5.

41. Fraser MW, Richman JM, Galinsky MJ, Day SH: Intervention Research: Developing Social Programs. 2009, New York: Oxford University Press.

42. Aarons GA, Horowitz JD, Dlugosz LR, Ehrhart MG: The role of organizational processes in dissemination and implementation research. Dissemination and Implementation Research in Health: Translating Science to Practice. Edited by: Brownson RC, Colditz GA, Proctor EK. 2012, New York: Oxford University Press, 128-153.

43. Glisson C, Landsverk J, Schoenwald S, Kelleher K, Hoagwood KE, Mayberg S, Green P: Assessing the organizational social context (OSC) of mental health services: implications for research and practice. Adm Policy Ment Health. 2008, 35: 98-113.

44. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O: Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004, 82: 581-629.

45. Glisson C: Assessing and changing organizational culture and climate for effective services. Res Soc Work Pract. 2007, 17: 736-747.

46. Glisson C, Schoenwald SK, Kelleher K, Landsverk J, Hoagwood KE, Mayberg S, Green P: Therapist turnover and new program sustainability in mental health clinics as a function of organizational culture, climate, and service structure. Adm Policy Ment Health. 2008, 35: 124-133.

47. Glisson C, Green P: The effects of organizational culture and climate on the access to mental health care in child welfare and juvenile justice systems. Adm Policy Ment Health. 2006, 33: 433-448.

48. Glisson C, Hemmelgarn A: The effects of organizational climate and interorganizational coordination on the quality and outcomes of children’s service systems. Child Abuse Negl. 1998, 22: 401-421.

49. Glisson C, Hemmelgarn A, Green P, Williams NJ: Randomized trial of the availability, responsiveness and continuity (ARC) organizational intervention for improving youth outcomes in community mental health programs. J Am Acad Child Adolesc Psychiatry. 2013, 52: 493-500.

50. Aarons GA, Sawitzky AC: Organizational culture and climate and mental health provider attitudes toward evidence-based practice. Psychol Serv. 2006, 3: 61-72.

51. Flottorp SA, Oxman AD, Krause J, Musila NR, Wensing M, Godycki-Cwirko M, Baker R, Eccles MP: A checklist for identifying determinants of practice: a systematic review and synthesis of frameworks and taxonomies of factors that prevent or enable improvements in healthcare professional practice. Implement Sci. 2013, 8: 1-11.

52. Bradley EH, Curry LA, Ramanadhan S, Rowe L, Nembhard IM, Krumholz HM: Research in action: using positive deviance to improve quality of health care. Implement Sci. 2009, 4: 1-11.

53. Pascale R, Sternin J, Sternin M: The Power of Positive Deviance: How Unlikely Innovators Solve the World’s Toughest Problems. 2010, Boston, MA: Harvard Business Press.

54. Aarons GA, Palinkas LA: Implementation of evidence-based practice in child welfare: service provider perspectives. Adm Policy Ment Health. 2007, 34: 411-419.

55. Pressman JL, Wildavsky A: Implementation: How Great Expectations in Washington Are Dashed in Oakland; or, Why It’s Amazing That Federal Programs Work at All, This Being a Saga of the Economic Development Administration as Told by Two Sympathetic Observers Who Seek to Build Morals on a Foundation of Ruined Hopes. 1984, Berkeley: University of California Press, 3.

56. Stake RE: Multiple Case Study Analysis. 2005, New York: Guilford Press.

57. Yin RK: Case Study Research: Design and Methods. 2009, Thousand Oaks, CA: Sage [Applied Social Research Methods Series], 4.

58. Wensing M, Eccles M, Grol R: Observational evaluations of implementation strategies. Improving Patient Care: The Implementation of Change in Clinical Practice. Edited by: Grol R, Wensing M, Eccles M. 2005, Edinburgh: Elsevier, 248-255.

59. Aarons GA, Fettes DL, Sommerfeld DH, Palinkas LA: Mixed methods for implementation research: application to evidence-based practice implementation and staff turnover in community-based organizations providing child welfare services. Child Maltreat. 2012, 17: 67-79.

60. Berwick DM: The science of improvement. JAMA. 2008, 299: 1182-1184.

61. Eccles MP, Armstrong D, Baker R, Cleary K, Davies H, Davies S, Glasziou P, Ilott I, Kinmonth AL, Leng G, Logan S, Marteau T, Michie S, Rogers H, Rycroft-Malone J, Sibbald B: An implementation research agenda. Implement Sci. 2009, 4: 1-7.

62. Institute of Medicine: The State of Quality Improvement and Implementation Research: Workshop Summary. 2007, Washington, DC: The National Academies Press.

63. Landsverk J, Brown CH, Reutz JR, Palinkas LA, Horwitz SM: Design elements in implementation research: a structured review of child welfare and child mental health studies. Adm Policy Ment Health. 2011, 38: 54-63.

64. Palinkas LA, Aarons GA, Horwitz S, Chamberlain P, Hurlburt M, Landsverk J: Mixed methods designs in implementation research. Adm Policy Ment Health. 2011, 38: 44-53.

65. Glisson C, Proctor EK: Testing an Organizational Implementation Strategy in Children’s Mental Health. 2009, National Institute of Mental Health (R01 MH084855). http://projectreporter.nih.gov/project_info_description.cfm?aid=8460576&icde=17345134&ddparam=&ddvalue=&ddsub=&cr=1&csb=default&cs=ASC

66. Glisson C, Hemmelgarn A, Green P, Dukes D, Atkinson S, Williams NJ: Randomized trial of the availability, responsiveness, and continuity (ARC) organizational intervention with community-based mental health programs and clinicians serving youth. J Am Acad Child Adolesc Psychiatry. 2012, 51: 780-787.

67. Small ML: “How many cases do I need?”: on science and the logic of case selection in field-based research. Ethnography. 2009, 10: 5-38.

68. Marshall MN: Sampling for qualitative research. Fam Pract. 1996, 13: 522-525.

69. Golden BR: The past is the past - or is it? The use of retrospective accounts as indicators of past strategy. Acad Manage J. 1992, 35: 848-860.

70. Guest G, Bunce A, Johnson L: How many interviews are enough?: an experiment with data saturation and variability. Field Methods. 2006, 18: 59-82.

71. Denzin NK, Lincoln YS: The SAGE Handbook of Qualitative Research. 2011, Thousand Oaks, CA: Sage, 4.

72. Bernard HR: Research Methods in Anthropology: Qualitative and Quantitative Approaches. 2011, Lanham, Maryland: AltaMira Press, 5.

73. Barbour R: Doing Focus Groups. 2007, Thousand Oaks, CA: Sage.

74. Proctor EK, Silmere H, Raghavan R, Hovmand P, Aarons GA, Bunger A, Griffey R, Hensley M: Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011, 38: 65-76.

75. Rabin BA, Purcell P, Naveed S, Moser RP, Henton MD, Proctor EK, Brownson RC, Glasgow RE: Advancing the application, quality and harmonization of implementation science measures. Implement Sci. 2012, 7: 1-11.

76. SIRC Measures Project. http://www.seattleimplementation.org/sirc-projects/sirc-instrument-project/

77. Forman J, Damschroder L: Qualitative content analysis. Empirical Methods for Bioethics: A Primer. Edited by: Jacoby L, Siminoff LA. 2008, Amsterdam: Elsevier, 11: 39-62. [Advances in Bioethics]

78. Hsieh H-F, Shannon SE: Three approaches to qualitative content analysis. Qual Health Res. 2005, 15: 1277-1288.

79. Kohlbacher F: The use of qualitative content analysis in case study research. Forum: Qualitative Social Research. 2006, 7: http://www.qualitative-research.net/index.php/fqs/article/view/75/154

80. Mayring P: Qualitative content analysis. Forum: Qualitative Social Research. 2000, 1: http://www.qualitative-research.net/index.php/fqs/article/view/1089/2386

81. Forsner T, Hansson J, Brommels M, Wistedt AA, Forsell Y: Implementing clinical guidelines in psychiatry: a qualitative study of perceived facilitators and barriers. BMC Psychiatry. 2010, 10: 1-10.

82. Hysong SJ, Best RG, Pugh JA: Clinical practice guideline implementation strategy patterns in veterans affairs primary care clinics. Health Serv Res. 2007, 42: 84-103.

83. Magnabosco JL: Innovations in mental health services implementation: a report on state-level data from the U.S. evidence-based practices project. Implement Sci. 2006, 1: 1-11.

84. Miles MB, Huberman AM: Qualitative Data Analysis. 1994, Thousand Oaks, CA: Sage, 2.

85. Krippendorff K: Content Analysis: An Introduction to Its Methodology. 2003, Thousand Oaks, CA: Sage Publications, 2.

86. Bowman C, Ambrosini V: Using single respondents in strategy research. Br J Manag. 1997, 8: 119-131.

87. Voss C, Tsikriktsis N, Frohlich M: Case research in operations management. International Journal of Operations & Production Management. 2002, 22: 195-219.

88. Creswell JW, Plano Clark VL: Designing and Conducting Mixed Methods Research. 2011, Thousand Oaks, CA: Sage, 2.

89. Killaspy H, Johnson S, Pierce B, Bebbington P, Pilling S, Nolan F, King M: Successful engagement: a mixed methods study of the approaches of community treatment and community mental health teams in the REACT trial. Soc Psychiatry Psychiatr Epidemiol. 2009, 44: 532-540.

90. Isett KR, Burnam MA, Coleman-Beattie B, Hyde PS, Morrissey JP, Magnabosco J, Rapp CA, Ganju V, Goldman HH: The state policy context of implementation issues for evidence-based practices in mental health. Psychiatr Serv. 2007, 58: 914-921.

91. Chambers DA: Foreword. Dissemination and Implementation Research in Health: Translating Science to Practice. Edited by: Brownson RC, Colditz GA, Proctor EK. 2012, New York: Oxford University Press, vii-x.

92. Scheirer MA: Linking sustainability research to intervention types. Am J Public Health. 2013, 103 (4): e73-e80.

93. Cook JM, O’Donnell C, Dinnen S, Coyne JC, Ruzek JI, Schnurr PP: Measurement of a model of implementation for health care: toward a testable theory. Implement Sci. 2012, 7: 1-15.

94. Clancy CM, Slutsky JR: Guidelines for guidelines: we’ve come a long way. CHEST. 2007, 132: 746-747.


Acknowledgements

Funding for this study has been provided by the National Institute of Mental Health (NIMH) through a National Research Service Award (NRSA) Individual Pre-Doctoral Fellowship (NIMH F31 MH098478; Powell, PI), the Doris Duke Charitable Foundation through a Fellowship for the Advancement of Child Well-Being (administered by Chapin Hall at the University of Chicago), the Fahs-Beck Fund for Research and Experimentation at the New York Community Trust, and the larger randomized clinical trial that is providing the sample of organizations and measure of OSC (NIMH R01 MH084855; Glisson, PI). This project was also made possible by training support from an NIMH NRSA Institutional Pre-Doctoral Fellowship (NIMH T32 MH19960; Proctor, PI) and a National Institutes of Health Pre-Doctoral Institutional Training Fellowship through the Washington University School of Medicine (NIH TL1 RR024995, UL1 RR024992; Polonsky, PI). The protocol was strengthened by the receipt of feedback on preliminary versions presented at the NIMH-funded Seattle Implementation Research Conference on October 13, 2011, and Knowledge Translation Canada’s Summer Institute funded by the Canadian Institute for Health Research on June 5, 2012.

Author information

Authors and affiliations

Brown School, Washington University in St. Louis, Campus Box 1196, One Brookings Drive, St. Louis, MO, 63130, USA

Byron J Powell, Enola K Proctor, Patricia L Kohl, Ramesh Raghavan & Ross C Brownson

College of Social Work, University of Tennessee, Knoxville, TN, 37996, USA

Charles A Glisson

Department of Psychiatry, Washington University School of Medicine, St. Louis, MO, 63130, USA

Ramesh Raghavan

Department of Surgery and Siteman Cancer Center, Washington University School of Medicine, St. Louis, MO, 63130, USA

Ross C Brownson

Department of Anthropology, Washington University in St. Louis, St. Louis, MO, 63130, USA

Bradley P Stoner

Department of Internal Medicine, Division of Infectious Diseases, Washington University School of Medicine, St. Louis, MO, 63130, USA

Department of Emergency Medicine, Washington University School of Medicine, St. Louis, MO, 63130, USA

Christopher R Carpenter

School of Social Work, University of Southern California, Los Angeles, California, 90089, USA

Lawrence A Palinkas


Corresponding author

Correspondence to Byron J Powell.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

BJP is the principal investigator of the study. BJP generated the idea and designed the study, drafted the manuscript, and approved all changes. EKP is the primary mentor on BJP’s F31 award from the National Institute of Mental Health and the award from the Doris Duke Charitable Foundation. CAG is the principal investigator and EKP is the co-principal investigator of the ARC RCT that provides the context for the current study. CAG is the developer of the OSC survey, and he and his colleagues from the Children’s Mental Health Services Research Center at the University of Tennessee, Knoxville will assist with the analysis and interpretation of that data. EKP, CAG, PLK, RR, RCB, BPS, CRC, and LAP provided input into the design of the study. All authors reviewed and provided feedback for this manuscript. The final version of this manuscript was vetted and approved by all authors.

Electronic supplementary material

Additional file 1: Semi-structured interview guide. (PDF 112 KB)

Additional file 2: Focus group interview guide. (PDF 77 KB)

Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License ( http://creativecommons.org/licenses/by/2.0 ), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Powell, B.J., Proctor, E.K., Glisson, C.A. et al. A mixed methods multiple case study of implementation as usual in children’s social service organizations: study protocol. Implementation Sci 8, 92 (2013). https://doi.org/10.1186/1748-5908-8-92

Download citation

Received: 08 July 2013

Accepted: 19 August 2013

Published: 20 August 2013

DOI: https://doi.org/10.1186/1748-5908-8-92


Keywords

  • Implementation strategies
  • Mental health
  • Children and adolescents
  • Mixed methods
  • Multiple case study



The Growing Importance of Mixed-Methods Research in Health

Sharada Prasad Wasti

1,2 School of Human and Health Sciences, University of Huddersfield, United Kingdom

Padam Simkhada

3 Centre for Midwifery, Maternal and Perinatal Health, Bournemouth University, Bournemouth, United Kingdom

Edwin R. van Teijlingen

Brijesh Sathian

4 Geriatrics and long term care Department, Rumailah Hospital, Hamad Medical Corporation, Doha, Qatar

Indrajit Banerjee

5 Sir Seewoosagur Ramgoolam Medical College, Belle Rive, Mauritius

All authors have made substantial contributions to all of the following: (1) the conception and design of the study, (2) drafting the article or revising it critically for important intellectual content, and (3) final approval of the version to be submitted.

There is no conflict of interest for any author of this manuscript.

This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sector.

This paper illustrates the growing importance of mixed-methods research to many health disciplines, ranging from nursing to epidemiology. Mixed-methods approaches require not only the skills of the individual quantitative and qualitative methods but also a skill set for bringing the two methods/datasets/findings together in the most appropriate way. Health researchers need to pay careful attention to the ‘best’ approach to designing, implementing, analysing, and integrating both quantitative (number) and qualitative (word) information, and to writing this up in a way that offers greater insights and enhances its applicability. This paper highlights the strengths and weaknesses of mixed-methods approaches, as well as some of the common mistakes made by researchers applying mixed-methods for the first time.

Quantitative and qualitative research methods each address different types of questions, collect different kinds of data and deliver different kinds of answers. Each set of methods has its own inherent strengths and weaknesses, and each offers a particular approach to address specific types of research questions (and agendas). Health disciplines such as dentistry, nursing, speech and language therapy, and physiotherapy often use either quantitative or qualitative research methods on their own. However, there is a steadily growing literature showing the advantages of mixed-methods research in the health care and health services field [ 1-2 ]. Although we advocated the use of mixed-methods in this journal eight years ago [ 3 ], there is still not enough mixed-methods research training in the health research field, particularly for health care practitioners, such as nurses, physiotherapists, midwives, and doctors, wanting to do research. Mixed-methods research has been popular in the social sciences since the twentieth century [ 4 ], and it has been growing in popularity among healthcare professionals [ 5 ], although it is still underdeveloped in disciplines such as nursing and midwifery [ 6 , 7 ].

Underpinning philosophies

To help understand that mixed-methods research is not simply employing two different methods in the same study, one needs to consider their underpinning research philosophies (also called paradigms). First, quantitative research is usually underpinned by positivism. This includes most epidemiological studies; such research is typically based on the assumption that there is one single real world out there that can be measured. For example, quantitative research would address the question “What proportion of the population of India drinks coffee?” Second, qualitative research is more likely to be based on interpretivism. This includes research based on interviews and focus groups, which is typically based on the assumption that we all experience the world differently. Since we all live in a slightly different world in our heads, the task of qualitative research is to analyse the interpretations of the people in the sample. For example, qualitative research would address the questions “How do people experience drinking coffee in India?” and “What does drinking coffee mean to them?”

Mixed-methods research brings together questions from two different philosophies in what has been referred to as the third path [ 8 ], the third research paradigm [ 9 , 10 ], the third methodology movement [ 11 , 12 ], and pragmatism [ 5 ]. The two paradigms differ in key underlying assumptions that ultimately lead to different choices of research methodology and methods, and combining them often adds breadth by answering more complicated research questions [ 4 ]. The roles of mixed methods are clear: understanding both the situation (the what) and the meaning, norms, and values (the why or how) within a single research question, which combines the strengths of two different methods and offers multiple ways of looking at the research question [ 13 ]. Epidemiology sits strongly in the quantitative research corner, with a strong emphasis on large data sets and sophisticated statistical analysis. Although the use of mixed methods in health research has been discussed widely, researchers have raised concerns about the lack of explanation of why and how mixed methods are used to address a single research question [ 5 ].

The relevance of mixed-methods in health research

The overall goal of the mixed-methods research design is to provide a better and deeper understanding, by providing a fuller picture that can enhance description and understanding of the phenomena [ 4 ]. Mixed-methods research has become popular because it uses quantitative and qualitative data in one single study, which provides stronger inference than using either approach on its own [ 4 ]. In other words, a mixed-methods paper helps to understand the holistic picture, from meanings obtained from interviews or observation to the prevalence of traits in a population obtained from surveys, which adds depth and breadth to the study. For example, a survey questionnaire will include a limited number of structured questions; adding qualitative methods can capture other unanticipated facets of the topic that may be relevant to the research problem and help in the interpretation of the quantitative data. A good example of a mixed-methods study is one conducted in Australia to understand nursing care in public hospitals and to explore what factors influence adherence to nursing care [ 14 ]. Another example is a mixed-methods study that explores the relationship between nursing care practices and patient satisfaction. This study started with a quantitative survey to understand the general nursing services, followed by qualitative interviews. A logistic regression analysis was performed to quantify the associations between general nursing practice variables and patient satisfaction, supplemented with a thematic analysis of the interviews [ 15 ]. These research questions could not be answered if the researchers had used either qualitative or quantitative methods alone. Overall, this fits well with the development of evidence-based practice.
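
To make the quantitative strand of that example concrete, ‘quantifying the associations’ typically means fitting a logistic regression of the binary satisfaction outcome on the nursing-practice variables. The sketch below uses simulated data and invented variable names; it is not the cited study’s actual analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated stand-in for the survey strand: two invented Likert-scale
# predictors and a binary patient-satisfaction outcome.
rng = np.random.default_rng(42)
df = pd.DataFrame({
    "timely_response": rng.integers(1, 6, 300),
    "care_explained": rng.integers(1, 6, 300),
})
linear = -3.0 + 0.4 * df["timely_response"] + 0.3 * df["care_explained"]
df["satisfied"] = (rng.random(300) < 1 / (1 + np.exp(-linear))).astype(int)

X = sm.add_constant(df[["timely_response", "care_explained"]])
result = sm.Logit(df["satisfied"], X).fit(disp=0)
print(np.exp(result.params))  # odds ratios per one-point increase in each predictor
```

The qualitative strand, a thematic analysis of the interviews, would then explain why the influential predictors matter to patients, which is the kind of integration the editorial describes.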

Despite the strengths of mixed-methods research, there is not much of it in nursing and other fields [ 7 ]. A recent review shows that the prevalence of mixed-methods studies in nursing was only 1.9% [ 7 ]. Similarly, systematic reviews that synthesised a total of 20 papers [ 16 ] and 16 papers [ 17 ] on nursing-related research identified only one mixed-methods paper between them. Worse, two further mixed-methods reviews recently revealed that among 48 synthesised nursing research papers [ 18 , 19 ], not one single mixed-methods paper was identified. This clearly shows that mixed-methods research is still in its infancy in nursing, but there is huge scope to apply it to understand research questions from both sides of the coin [ 4 ]. Therefore, there is a great need for mixed-methods training to enhance evidence-based decision-making in health and nursing practice.

Strengths and weaknesses of mixed-methods

There are several challenges, including finding expertise in both sets of methods and working with a multidisciplinary, interdisciplinary, or transdisciplinary team [ 20 ]. Mixed-methods research increases costs and resource requirements and takes longer to complete, as the design often involves multiple stages of data collection and separate data analyses [ 4 , 5 ]. Moreover, conducting mixed-methods research does not necessarily guarantee an improvement in the quality of health research. Therefore, mixed methods are only appropriate when the research questions call for them [ 4 , 6 ].

Identifying an appropriate journal can also be challenging when writing mixed-methods papers [ 21 ]. Mixed-methods papers need considerably more words than single-method papers, as well as sympathetic editors who understand the underlying philosophy of a mixed-methods approach. Such papers simply require more words: the mixed-methods researcher must report two separate methods, each with its own characteristics, samples, and ways of analysing, and therefore needs more words to describe both methods as well as both sets of findings. Researchers need to find journals that accept longer articles to help broaden existing evidence-based practice and promote its applicability in the nursing field [ 22 ].

Common mistakes in applying mixed-methods

Not all applied researchers have insight into the underlying philosophy and/or the skills to apply each set of methods appropriately. Younas and colleagues’ review identified that around one-third (29%) of mixed-methods studies did not provide an explicit label of the study design, and 95% of studies did not identify the research paradigm [ 7 ], whilst several mixed-methods publications did not provide clear research questions covering both quantitative and qualitative approaches. Another common issue is failing to state clearly in the methods section whether data were collected concurrently or sequentially, and what priority (equal or dominant) was given to each approach within the study. Similarly, a commonly overlooked aspect is how to integrate both sets of findings in a paper. The responsibility lies with the researcher to ensure that findings are sufficiently plausible and credible [ 4 ]. Therefore, intensive mixed-methods research training is required for nursing and other health practitioners to ensure its appropriate use.

The way forward

Despite the recognised strengths and benefits of doing mixed-methods research, there is still only a limited number of nursing and related health research publications using this approach. Researchers need training in how to design, conduct, analyse, synthesise and disseminate mixed-methods research. Most importantly, they need to consider appropriate research questions that can be addressed using a mixed-methods approach to add to our knowledge in evidence-based practice. In short, we need more training on mixed-methods research for a range of health researchers and health professionals.



  10. Combining Mixed Methods and Case Study Research (MM+CSR) to Give Mixed

    However, the increased use of mixed methods to address complex research prob- lems provides an opportunity to combine mixed methods research (MMR) and CSR in a single investigation.

  11. Intersecting Mixed Methods and Case Study Research: Design

    Methods This study is an observational mixed-methods single case study of improvement efforts at a Swedish acute care hospital, which triangulates control chart analysis of process performance ...

  12. Single-Case Designs and Qualitative Methods: Applying a Mixed Methods

    Abstract. The purpose of this conceptual paper is to describe a design that mixes single-case (sometimes referred to as single-subject) and qualitative methods, hereafter referred to as a single-case mixed methods design (SCD-MM). Minimal attention has been given to the topic of applying qualitative methods to SCD work in the literature.

  13. Variations of mixed methods reviews approaches: A case study

    This case study analyzed 30 mixed methods reviews conducted in one organization and revealed five key dimensions on which reviews varied: types of questions answered and pur-poses of the mixed methods questions, types of evidence and sources, integration strategies and reasons for using a mixed methods approach.

  14. Intersecting Mixed Methods and Case Study Research: Design

    This research was done intentionally by the census method and case study research design and data collection by interview. The results showed four marketing channels for tilapia, namely I: cultivators and collectors, II: cultivators, collectors, and retailers, III: cultivators, collectors and wholesalers and IV: farmers, traders, wholesalers ...

  15. Combining qualitative and quantitative research within mixed method

    When qualitative and quantitative methods are mixed in a single study, one method is usually given priority over the other. In such cases, the aim of the study, the rationale for employing mixed methods, and the weighting of each method determine whether, and how, the empirical findings will be integrated. ... K., Moravan, V., 2008. Using mixed ...

  16. A mixed methods multiple case study of implementation as usual in

    Overview. This study employs a mixed methods multiple case study design, in which each participating organization (n = 7) is conceptualized as a 'case' [56, 57].Case studies are particularly helpful in understanding the internal dynamics of change processes, and including multiple cases capitalizes on organizational variation and permits an examination of how contextual factors influence ...

  17. Two Methodological Approaches to the Integration of Mixed Methods and

    Both case study and mixed methods, alone, are popular approaches to research and evaluation. Mixed methods research combines qualitative and quantitative research in a study, or closely related series of studies, through the collection, analysis, and integration of qualitative and quantitative data (Creswell, 2015a).Mixed methods is growing in its adoption worldwide as evidenced by continual ...

  18. Sustainability

    This research adopts a mixed-methods methodology that combines spatial mapping and field observations. A diverse range of data sources was utilized to calculate the five categories of dynamic indicators and establish connections with observed environmental behaviors. ... A Case Study in Shanghai Utilizing Mixed-Methods Approach and Artificial ...

  19. The Growing Importance of Mixed-Methods Research in Health

    The relevance of mixed-methods in health research. The overall goal of the mixed-methods research design is to provide a better and deeper understanding, by providing a fuller picture that can enhance description and understanding of the phenomena [].Mixed-methods research has become popular because it uses quantitative and qualitative data in one single study which provides stronger inference ...

  20. Development of Pipeline Transient Mixed Flow Model with Smoothed ...

    The accurate modeling and understanding of complex transient mixed pipe flows is crucial for the optimal design and safe and efficient operation in pipeline systems such as urban drainage systems. Currently, the predominant approach for modeling free-surface-pressurized flows relies on grid-based numerical schemes, with comparatively limited capability for exploring its complex phenomena.

  21. Variations of mixed methods reviews approaches: A case study

    This case study analyzed 30 mixed methods reviews conducted in one organization and revealed five key dimensions on which reviews varied: types of questions answered and purposes of the mixed methods questions, types of evidence and sources, integration strategies and reasons for using a mixed methods approach. ... 2 METHODS. A single case ...