Beyond Polemics: The Merits and Challenges of Assessing Intercoder Agreement in a Qualitative Research Study
Karen Andes, PhD, Department of Global Health, Emory University, Atlanta, GA
Amy Cassidy, Research Department, Family Planning Council, Philadelphia, PA
Mark M. Macauda, MPH, PhD, Center for Health, Intervention and Prevention, University of Connecticut, Storrs, CT
Dugeidy Ortiz, MA, Anthropology, University of Connecticut, Storrs, CT
Linda Hock-Long, PhD, Research Department, Family Planning Council, Philadelphia, PA
Pamela Erickson, DrPH, PhD, Anthropology, University of Connecticut, Storrs, CT
Most researchers agree that assessing intercoder agreement is a key element of qualitative data analysis. However, there is little agreement about how best to assess it. The use of quantitative measures to assess intercoder reliability has been criticized on a variety of grounds, yet few alternatives exist for ensuring consistent and reliable coding of complex qualitative data. This paper describes the merits and challenges of assessing intercoder agreement using quantitative means (e.g., kappa scores) for the complex qualitative data from the PHRESH.comm project and offers additional practical tools for conducting such assessments. We describe the processes of developing the codebook, applying codes, assessing reliability, and reconciling coding differences, including the project's development of a typology of coding disagreements during the process of reconciliation. We also present strategies to address coder disagreements, such as limiting the number and nature of codes, reading and discussing text related to disagreements in order to refine code definitions and develop a common understanding of them, and discussing the nature of segmentation and the relative importance of including context versus potentially "missing" text. Finally, we outline our experience with consensus coding and adjunct coding review as alternative means to maintain coding rigor when complex data yield less-than-optimal kappa scores.
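The abstract cites kappa scores as one quantitative measure of intercoder agreement but does not detail how they were computed. For orientation only, the sketch below illustrates Cohen's kappa for two coders deciding whether a single code applies to each text segment; the coder data are hypothetical and not drawn from the PHRESH.comm project.

# Minimal illustration of Cohen's kappa for two coders' binary decisions
# (1 = code applied to a segment, 0 = not applied). Data are hypothetical.
from collections import Counter

coder_a = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
coder_b = [1, 0, 0, 1, 0, 1, 1, 0, 1, 0]
n = len(coder_a)

# Observed agreement: proportion of segments where the two coders match.
p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n

# Expected (chance) agreement, based on each coder's marginal rates.
counts_a, counts_b = Counter(coder_a), Counter(coder_b)
p_e = sum((counts_a[k] / n) * (counts_b[k] / n) for k in set(coder_a) | set(coder_b))

kappa = (p_o - p_e) / (1 - p_e)
print(f"Observed agreement: {p_o:.2f}, expected: {p_e:.2f}, kappa: {kappa:.2f}")

With these illustrative data, the coders agree on 70% of segments, but because half of that agreement is expected by chance, kappa is only 0.40; this gap between raw agreement and kappa is part of what makes such scores contentious for complex qualitative coding.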
Learning Objectives:
1) Discuss the historical underpinnings of the current debate over the use of quantitative measures to assess intercoder reliability in qualitative research.
2) Describe the PHRESH.comm approach to codebook development, applying codes, assessing reliability, and reconciling coding differences.
3) Name three strategies to address and reconcile coding disagreements.
Presenting author's disclosure statement: Qualified on the content I am responsible for because: I have a PhD in medical anthropology and have worked in the field of MCH/reproductive health at the CDC since 1991. I have five peer-reviewed publications and have previously presented at national public health and disciplinary conferences on related topics.
Any relevant financial relationships? No
I agree to comply with the American Public Health Association Conflict of Interest and Commercial Support Guidelines, and to disclose to the participants any off-label or experimental uses of a commercial product or service discussed in my presentation.