Original Article

A hierarchy of evidence for assessing qualitative health research
Introduction
In the medical and health literature, there has been a steady rise in the number of papers reporting studies that use qualitative research methods. Although some of these papers aim to develop theory or to advance method, a substantial proportion report on issues directly relevant to clinical practice or health policy. What may not be clear to decision makers is how useful qualitative research is for generating the evidence that underlies evidence-based medicine.
The Cochrane Collaboration has struggled to find ways of incorporating evidence from qualitative research into systematic reviews. Articles reporting qualitative systematic reviews are now making their appearance [1], [2]. The challenge that these “courageous” researchers face, Jennie Popay tells us, is that the most difficult problem remains unresolved: that of defining clear criteria for selecting high-quality qualitative studies for inclusion in reviews [3]. Criteria for judging the quality of quantitative studies [4] are well known, as is the hierarchy of designs based on the strength of the evidence for treatment decisions [5], but what might be the criteria for studies that are based on narrative argument rather than on measurement and control over variables?
The Evidence-based Medicine Working Group has generated a description of an ideal qualitative study [5], and this echoes the “appraisal tool” produced by the Critical Appraisal Skills Programme in the Public Health Resource Unit of the National Health Service [6], a substantial document from the United Kingdom's Government Chief Social Researcher's Office [7], and the more succinct but thoughtful account of the Medical Sociology Group of the British Sociological Association [8].
A major problem is that these guidelines tend to treat qualitative research as a single method despite a diversity that ranges from discourse analysis to ethnography, with data collected in personal interviews or focus groups (with samples of varying sizes), by participant observation, or through documentary analysis. Some guidelines include discussion of the ethics of research with human participants, ways of reporting back to research participants, and ways of including communities in the actual conduct of a study. Commonly there is discussion of the need for “critical reflexivity” concerning the participative role of the researcher in the actual conduct of the research, and consideration may be given to the importance of poetics or esthetic narratives in reporting a study [9].
Not surprisingly, advice is inconsistent between guidelines. It is the flexibility of qualitative method, its capacity for adaptation to a variety of research settings, that is seen as one of its strengths [10] but it is this same flexibility that generates a range of study designs not easily captured in a single set of quality criteria. It is also true that quality criteria are viewed with concern by qualitative researchers. There is doubt about whether a “checklist” approach can capture the nuances and intricacies of the approach and concern that an emphasis on evidence will undermine the insight that can flow from a qualitative study [9]. A useful typology of qualitative studies has been reported for qualitative “metasynthesis,” [11] and “meta-ethnography” is seen as promising [12] but these methods require further development and, again, are undermined by failure to define clear criteria for judging the quality of included studies.
We are not in any doubt that experienced qualitative researchers can tell a good study from a poor one without the need for guidelines, but readers of health journals, and even some health researchers, approach qualitative research papers with varying degrees of confidence, skill, and experience. A novice, faced with reviewing a paper that uses qualitative research methods, might find the methodological intricacies overwhelming and focus instead on that which most readily distinguishes qualitative research: quotations from interviews.
Words are seductive. The emotive quality of quotations from qualitative studies can draw a sympathetic response from reviewers and readers—and these quotations do enliven an otherwise dull research report. But quotations are not self-validating and require analysis. If qualitative research is to be used as the basis for health care practice and health policy decisions, we need to identify studies that give moving insights into the lives of participants but then, in addition, report sound methods and defensible conclusions that apply to more than the immediate group of research participants.
This paper started as an exercise in critical appraisal of recent qualitative studies in the health literature. We found little difficulty in identifying a large number of sound qualitative studies but then faced the difficulty that some of these studies clearly provide better evidence-for-practice than others. What we report here is a ranking of qualitative study designs in order of the quality of evidence that a well-conducted study offers when practical decisions have to be made. As in quantitative research, study designs range from limited but insight-provoking single case studies to more complex studies that control for bias.
Glasziou et al. [13] acknowledge that the quantitative evidence hierarchy for assessing the effect of interventions suffers from the problem of classifying many different aspects of research quality under a single grade, but we agree with these authors that there is a need “to broaden the scope by which evidence is assessed.” Explicit criteria for assessing qualitative research would assist in transparency in peer review of research papers and a qualitative hierarchy of evidence would help practitioners identify that research which provides the strongest basis for action.
Here, a word of caution is appropriate. The evidence that is likely to emerge from a qualitative study will look quite different from evidence that is generated by a randomized controlled trial. Although qualitative studies may illuminate treatment issues, for example, indicating why some patients respond in a particular way to treatment, it is also common for a qualitative study to generate critique of current practice, indicating where standard practice may not be beneficial to one or more groups of people. Under the term evidence, we also include evidence for or against public health or prevention programs and evidence relevant to the formulation of better health policy.
The qualitative research task
We take for granted that there are standard processes for conducting any research study: matching method to research problem, ethical considerations, and reporting requirements. Our focus in this paper is on the central part of the research process, the conduct and reporting of a qualitative research study. We confine ourselves to interview studies as the most commonly used in health research. First, we describe standard qualitative research procedures and then show how these contribute to a hierarchy of evidence-for-practice.
A qualitative hierarchy of evidence-for-practice
The hierarchy we are proposing is summarized in Fig. 1 and Table 1. The emphasis in this hierarchy is on the capacity of reported research to provide evidence-for-practice or policy. In common with the quantitative hierarchies of method, research using methods lower in the hierarchy can be well worth publishing because it contributes to our understanding of a problem. Sometimes, especially in new or difficult research contexts, studies are constrained and the conclusions tend to be hypothesis-generating.
Discussion
There are risks in reducing a complex set of professional research procedures to a simple code and, in common with our colleagues in evidence-based medicine, we recognize that hierarchies of evidence-for-practice can be used and abused [9]. Our focus here is not on papers that have the primary aim of developing theory or method. We have located four distinct qualitative research designs for interview studies in a hierarchy that reflects the validity of conclusions for clinical practice and policy.
Conclusion
In defining the essential features of each stage of the central methodological task of a qualitative research study, we are setting in place a model of the ideal research project for developing qualitative evidence-for-practice. If the ideal generalizable study is realized, we should have a research study that provides evidence that is secure, evidence that a reader can trust, and evidence that a policy maker or practitioner can use with confidence as the basis for decision making and policy development.
References (20)
- et al. Systematic review of qualitative studies exploring parental beliefs and attitudes toward childhood vaccination identifies common barriers to vaccination. J Clin Epidemiol (2005).
- et al. Systematically reviewing qualitative studies complements survey design: an exploratory study of barriers to paediatric immunisations. J Clin Epidemiol (2005).
- Moving beyond floccinaucinihilipilification: enhancing the utility of systematic reviews. J Clin Epidemiol (2005).
- et al. Evaluating meta-ethnography: a synthesis of qualitative research on lay experiences of diabetes and diabetes care. Soc Sci Med (2003).
- Doing health, doing gender: teenagers, diabetes and asthma. Soc Sci Med (2000).
- et al. Preventing hepatitis C: ‘Common sense’, ‘the bug’ and other perspectives from the risk narratives of people who inject drugs. Soc Sci Med (2004).
- Checklists for reviewing articles.
- Critical Appraisal Skills Programme. Milton Keynes Primary Care Trust;...
- Spencer L, Ritchie J, Lewis J, Dillon L. Quality in Qualitative Evaluation: a framework for assessing research...