
1. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health and Mental Health Services Research. 2011;38:4–23.

2. Aarons GA, Palinkas LA. Implementation of evidence-based practice in child welfare: service provider perspectives. Administration and Policy in Mental Health and Mental Health Services Research. 2007;34:411–419.

3. Bachman MO, O’Brien M, Husbands C, Shreeve A, Jones N, Watson J, Reading R, Thoburn J, Mugford M; the National Evaluation of Children’s Trusts Team. Integrating children’s services in England: national evaluation of children’s trusts. Child: Care, Health and Development. 2009;35:257–265.

4. Bernard HR. Research methods in anthropology: Qualitative and quantitative approaches. 3rd ed. AltaMira Press; Walnut Creek, CA: 2002.

5. Bloom HS, Michalopoulos C. When is the story in the subgroups? Strategies for interpreting and reporting intervention effects for subgroups. Prevention Science. 2013;14:179–188.

6. Brown CH, Wyman PA, Guo J, Peña J. Dynamic wait-listed designs for randomized trials: New designs for prevention of youth suicide. Clinical Trials. 2006;3:259–271.

7. Brown CH, Wang W, Kellam SG, Muthén B, and the Prevention Science and Methodology Group. Methods for testing theory and evaluating impact in randomized field trials: Intent-to-treat analyses for integrating the perspectives of person, place, and time. Drug and Alcohol Dependence. 2008;95(Suppl 1):S74–S104.

8. Brown C, Ten Have T, Jo B, Dagne G, Wyman P, Muthén B, et al. Adaptive designs for randomized trials in public health. Annual Review of Public Health. 2009;30:1–25.

9. Brunette MF, Asher D, Whitley R, Lutz WJ, Weider BL, Jones AM, McHugo GJ. Implementation of integrated dual disorders treatment: a qualitative analysis of facilitators and barriers. Psychiatric Services. 2008;59:989–995.

10. Creswell JW, Plano Clark VL. Designing and conducting mixed methods research. 2nd ed. Sage; Thousand Oaks, CA: 2011.

11. Curran GM, Bauer M, Mittman B, Pyne JM, Stetler C. Effectiveness-implementation hybrid designs: Combining elements of clinical effectiveness and implementation research to enhance public health impact. Medical Care. 2012;50:217–226.

12. Denzin NK. The research act: A theoretical introduction to sociological methods. 2nd ed. McGraw-Hill; New York: 1978.

13. Duan N, Bhaumik DK, Palinkas LA, Hoagwood K. Purposeful sampling and optimal design. Administration and Policy in Mental Health and Mental Health Services Research. this issue.

14. Glaser BG. Theoretical sensitivity. Sociology Press; Mill Valley, CA: 1978.

15. Glaser BG, Strauss AL. The discovery of grounded theory: Strategies for qualitative research. Aldine de Gruyter; New York: 1967.

16. Glasgow R, Magid D, Beck A, Ritzwoller D, Estabrooks P. Practical clinical trials for translating research to practice: design and measurement recommendations. Medical Care. 2005;43(6):551.

17. Green AE, Aarons GA. A comparison of policy and direct practice stakeholder perceptions of factors affecting evidence-based practice implementation using concept mapping. Implementation Science. 2011;6:104.

18. Guest G, Bunce A, Johnson L. How many interviews are enough? An experiment with data saturation and variability. Field Methods. 2006;18(1):59–82.

19. Henke RM, Chou AF, Chanin JC, Zides AB, Scholle SH. Physician attitude toward depression care interventions: implications for implementation of quality improvement initiatives. Implementation Science. 2008;3:40.

20. Hoagwood KE, Vogel JM, Levitt JM, D’Amico PJ, Paisner WI, Kaplan SJ. Implementing an evidence-based trauma treatment in a state system after September 11: the CATS Project. Journal of the American Academy of Child and Adolescent Psychiatry. 2007;46(6):773–779.

21. Kemper EA, Stringfield S, Teddlie C. Mixed methods sampling strategies in social science research. In: Tashakkori A, Teddlie C, editors. Handbook of mixed methods in the social and behavioral sciences. Sage; Thousand Oaks, CA: 2003. pp. 273–296.

22. Kramer TF, Burns BJ. Implementing cognitive behavioral therapy in the real world: a case study of two mental health centers. Implementation Science. 2008;3:14.

23. Landsverk J, Brown CH, Chamberlain P, Palinkas LA, Horwitz SM. Design and analysis in dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and implementation research in health: Translating science to practice. Oxford University Press; New York: 2012. pp. 225–260.

24. Marshall T, Rapp CA, Becker DR, Bond GR. Key factors for implementing supported employment. Psychiatric Services. 2008;59:886–892.

25. Marty D, Rapp C, McHugo G, Whitley R. Factors influencing consumer outcome monitoring in implementation of evidence-based practices: results from the National EBP Implementation Project. Administration and Policy in Mental Health and Mental Health Services Research. 2008;35:204–211.

26. Miles MB, Huberman AM. Qualitative data analysis: An expanded sourcebook. 2nd ed. Sage; Thousand Oaks, CA: 1994.

27. Minkler M, Wallerstein N, editors. Community-based participatory research for health. Jossey-Bass; San Francisco, CA: 2003.

28. Morgan DL. Focus groups as qualitative research. Sage; Newbury Park, CA: 1997.

29. Morse JM, Niehaus L. Mixed method design: Principles and procedures. Left Coast Press; Walnut Creek, CA: 2009.

30. Padgett DK. Qualitative methods in social work research. 2nd ed. Sage; Los Angeles, CA: 2008.

31. Palinkas LA, Aarons GA, Horwitz SM, Chamberlain P, Hurlburt M, Landsverk J. Mixed method designs in implementation research. Administration and Policy in Mental Health and Mental Health Services Research. 2011;38:44–53.

32. Palinkas LA, Ell K, Hansen M, Cabassa LJ, Wells AA. Sustainability of collaborative care interventions in primary care settings. Journal of Social Work. 2011;11:99–117.

33. Palinkas LA, Fuentes D, Garcia AR, Finno M, Holloway IW, Chamberlain P. Inter-organizational collaboration in the implementation of evidence-based practices among agencies serving abused and neglected youth. Administration and Policy in Mental Health and Mental Health Services Research. Epub ahead of print, August 12, 2012. doi:10.1007/s10488-012-0437-5.

34. Palinkas LA, Holloway IW, Rice E, Fuentes D, Wu Q, Chamberlain P. Social networks and implementation of evidence-based practices in public youth-serving systems: A mixed methods study. Implementation Science. 2011;6:113.

35. Palinkas LA, Soydan H. Translation and implementation of evidence-based practice. Oxford University Press; New York: 2012.

36. Patton MQ. Qualitative research and evaluation methods. 3rd ed. Sage; Thousand Oaks, CA: 2002.

37. Proctor EK, Knudsen KJ, Fedoravicius N, Hovmand P, Rosen A, Perron B. Implementation of evidence-based practice in community behavioral health: agency director perspectives. Administration and Policy in Mental Health and Mental Health Services Research. 2007;34:479–488.

38. Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C, Mittman B. Implementation research in mental health services: an emerging science with conceptual, methodological, and training challenges. Administration and Policy in Mental Health and Mental Health Services Research. 2009;36:24–34.

39. Rapp CA, Etzel-Wise D, Marty D, Coffman M, Carlson L, Asher D, Callaghan J, Holter M. Barriers to evidence-based practice implementation: results of a qualitative study. Community Mental Health Journal. 2010;46:112–118.

40. Raudenbush S, Liu X. Statistical power and optimal design for multisite randomized trials. Psychological Methods. 2000;5:199–213.

41. Slade M, Gask L, Leese M, McCrone P, Montana C, Powell R, Stewart M, Chew-Graham CA. Failure to improve appropriateness of referrals to adult community mental health services: lessons from a multi-site cluster randomized controlled trial. Family Practice. 2008;25:181–190.

42. Spradley JP. The ethnographic interview. Holt, Rinehart & Winston; New York: 1979.

43. Swain K, Whitley R, McHugo GJ, Drake RE. The sustainability of evidence-based practices in routine mental health agencies. Community Mental Health Journal. 2010;46:119–129.

44. Teddlie C, Tashakkori A. Major issues and controversies in the use of mixed methods in the social and behavioral sciences. In: Tashakkori A, Teddlie C, editors. Handbook of mixed methods in the social and behavioral sciences. Sage; Thousand Oaks, CA: 2003. pp. 3–50.

45. Tunis SR, Stryer DB, Clancy CM. Increasing the value of clinical research for decision making in clinical and health policy. Journal of the American Medical Association. 2003;290:1624–1632.

46. Wisdom JP, Cavaleri MC, Onwuegbuzie AJ, Green CA. Methodological reporting in qualitative, quantitative, and mixed methods health services research articles. Health Services Research. 2011;47:721–745.

47. Woltmann EM, Whitley R, McHugo GJ, et al. The role of staff turnover in the implementation of evidence-based practices in health care. Psychiatric Services. 2008;59:732–737.

48. Zazzali JL, Sherbourne C, Hoagwood KE, Greene D, Bigley MF, Sexton TL. The adoption and implementation of an evidence based practice in child and family mental health services organizations: a pilot study of functional family therapy in New York State. Administration and Policy in Mental Health and Mental Health Services Research. 2008;35:38–49.



Purposeful sampling strategies in implementation research (each strategy is listed with its objective, an example, and considerations)

Emphasis on similarity

Criterion-i
Objective: To identify and select all cases that meet some predetermined criterion of importance.
Example: Selection of consultant trainers and program leaders at study sites to identify facilitators and barriers to EBP implementation (Marshall et al., 2008).
Considerations: Can be used to identify cases from standardized questionnaires for in-depth follow-up (Patton, 2002).

Criterion-e
Objective: To identify and select all cases that exceed or fall outside a specified criterion.
Example: Selection of directors of agencies that failed to move to the next stage of implementation within an expected period of time.

Typical case
Objective: To illustrate or highlight what is typical, normal, or average.
Example: A child undergoing treatment for trauma (Hoagwood et al., 2007).
Considerations: The purpose is to describe and illustrate what is typical to those unfamiliar with the setting, not to make generalized statements about the experiences of all participants (Patton, 2002).

Homogeneity
Objective: To describe a particular subgroup in depth, to reduce variation, simplify analysis, and facilitate group interviewing.
Example: Selecting Latino/a directors of mental health services agencies to discuss challenges of implementing evidence-based treatments for mental health problems with Latino/a clients.
Considerations: Often used for selecting focus group participants.

Snowball
Objective: To identify cases of interest by sampling people who know people who generally have similar characteristics and who, in turn, know other people, also with similar characteristics.
Example: Asking recruited program managers to identify clinicians, administrative support staff, and consumers for project recruitment (Green & Aarons, 2011).
Considerations: Begins by asking key informants or well-situated people “Who knows a lot about…” (Patton, 2002).

Emphasis on variation

Extreme or deviant case
Objective: To illuminate both the unusual and the typical.
Example: Selecting clinicians from state mental health agencies with the best and worst performance records or implementation outcomes.
Considerations: Extreme successes or failures may be discredited as being too extreme or unusual to yield useful information, leading one to select cases that manifest sufficient intensity to illuminate the nature of success or failure, but not in the extreme.

Intensity
Objective: Same objective as extreme case sampling but with less emphasis on extremes.
Example: Clinicians providing usual care and clinicians who dropped out of a study prior to consent, contrasted with clinicians who provided the intervention under investigation (Kramer & Burns, 2008).
Considerations: Requires the researcher to do some exploratory work to determine the nature of the variation in the situation under study, then to sample intense examples of the phenomenon of interest.

Maximum variation
Objective: To identify important shared patterns that cut across cases and derive their significance from having emerged out of heterogeneity.
Example: Sampling mental health services programs in urban and rural areas in different parts of the state (north, central, south) to capture maximum variation in location (Bachman et al., 2009).
Considerations: Can be used to document unique or diverse variations that have emerged in adapting to different conditions (Patton, 2002).

Critical case
Objective: To permit logical generalization and maximum application of information because, if it is true in this one case, it is likely to be true of all other cases.
Example: Investigation of a group of agencies that decided to stop using an evidence-based practice, to identify reasons for lack of EBP sustainment.
Considerations: Depends on recognition of the key dimensions that make for a critical case. Particularly important when resources may limit the study to only one site (program, community, population) (Patton, 2002).

Theory-based
Objective: To find manifestations of a theoretical construct so as to elaborate and examine the construct and its variations.
Example: Sampling therapists based on academic training to understand the impact of CBT training versus psychodynamic training in graduate school on acceptance of EBPs.
Considerations: Sample on the basis of potential manifestation or representation of important theoretical constructs, or on the basis of emerging concepts, with the aim of exploring the dimensional range or varied conditions along which the properties of concepts vary.

Confirming and disconfirming case
Objective: To confirm the importance and meaning of possible patterns and to check the viability of emergent findings with new data and additional cases.
Example: Once trends are identified, deliberately seeking examples that run counter to the trend.
Considerations: Usually employed in later phases of data collection. Confirmatory cases are additional examples that fit already emergent patterns and add richness, depth, and credibility. Disconfirming cases are a source of rival interpretations as well as a means of placing boundaries around confirmed findings.

Stratified purposeful
Objective: To capture major variations rather than to identify a common core, although the latter may emerge in the analysis.
Example: Combining typical case sampling with maximum variation sampling by taking a stratified purposeful sample of above-average, average, and below-average cases of health care expenditures for a particular problem.
Considerations: This represents less than the full maximum variation sample, but more than simple typical case sampling.

Purposeful random
Objective: To increase the credibility of results.
Example: Selecting for interviews a random sample of providers to describe experiences with EBP implementation.
Considerations: Not as representative of the population as a probability random sample.

Nonspecific emphasis

Opportunistic or emergent
Objective: To take advantage of circumstances, events, and opportunities for additional data collection as they arise.
Considerations: Usually employed when it is impossible to identify the sample, or the population from which a sample should be drawn, at the outset of a study. Used primarily in conducting ethnographic fieldwork.

Convenience
Objective: To collect information from participants who are easily accessible to the researcher.
Example: Recruiting providers attending a staff meeting for study participation.
Considerations: Although commonly used, it is neither purposeful nor strategic.
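
Several of these strategies reduce, at the case-selection step, to simple filtering and drawing rules over a sampling frame. The Python sketch below is a minimal illustration of three of them (criterion-i, purposeful random, and stratified purposeful). The provider pool, attribute names, and stratum cut points are hypothetical stand-ins rather than anything prescribed by the sources above; in practice each mechanical draw would be guided by the substantive considerations listed in the table.

```python
import random

# Hypothetical sampling frame: provider records with illustrative attributes.
rng = random.Random(42)  # seeded so the toy frame is reproducible
providers = [
    {"id": i,
     "role": rng.choice(["clinician", "supervisor", "director"]),
     "fidelity_score": rng.uniform(0.0, 100.0)}
    for i in range(200)
]


def criterion_i(pool, predicate):
    """Criterion-i sampling: select ALL cases meeting a predetermined criterion."""
    return [case for case in pool if predicate(case)]


def purposeful_random(pool, k, seed=None):
    """Purposeful random sampling: a small random draw made to add credibility,
    not to achieve statistical representativeness."""
    return random.Random(seed).sample(pool, k)


def stratified_purposeful(pool, key, k_per_stratum, seed=None):
    """Stratified purposeful sampling: draw cases from below-average, average,
    and above-average strata to capture major variations."""
    draw = random.Random(seed)
    ranked = sorted(pool, key=key)
    n = len(ranked)
    strata = {
        "below average": ranked[: n // 3],
        "average": ranked[n // 3 : 2 * n // 3],
        "above average": ranked[2 * n // 3 :],
    }
    return {name: draw.sample(members, min(k_per_stratum, len(members)))
            for name, members in strata.items()}


# Criterion-i: every supervisor in the frame.
supervisors = criterion_i(providers, lambda p: p["role"] == "supervisor")

# Purposeful random: up to five supervisors drawn at random for interviews.
interviewees = purposeful_random(supervisors, k=min(5, len(supervisors)), seed=7)

# Stratified purposeful: three cases per fidelity stratum.
by_fidelity = stratified_purposeful(
    providers, key=lambda p: p["fidelity_score"], k_per_stratum=3, seed=7)
```

Seeding each draw keeps the selection auditable, which matters when the credibility of a purposeful random sample is the point of using it.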