How do respondents interpret the questions in a survey questionnaire? What cognitive interviewing techniques can help researchers frame questions appropriately? These are important considerations when designing survey questions, to ensure that each question clearly communicates its intent and elicits responses aligned with the intended interpretation.
To look at these concepts more closely, IFPRI—an implementing partner of DataDENT—and partners conducted the workshop “Using Cognitive Interviewing to Improve Survey Questions: Bridging the gap between the intent and interpretation” on September 14, 2020, during the third India-focused implementation research conference, “Delivering for Nutrition”. Participants learned qualitative techniques for conducting cognitive interviews and how cognitive interviewing can improve data quality and study validity. The workshop featured three case studies of cognitive interviewing from the nutrition, health, and gender fields.
What is the purpose of cognitive interviewing?
The aim of cognitive interviewing (CI) is to assess how respondents understand survey questions and whether their responses capture the information the researcher intends to elicit. CI is grounded in a model of the cognitive process of answering a question, in which errors may arise at any of four stages: comprehension, retrieval, judgement, and response (Fig 1). The workshop also covered how to draft open-ended cognitive questions to test for potential errors at each of these stages. For instance, anticipated, spontaneous, conditional, and/or emergent probes may be used to conduct cognitive interviewing, depending on the specific needs, constraints, and aims of the study in question.
Workshop Case Studies
The first case study was presented by Dr. Kerry Scott (Johns Hopkins Bloomberg School of Public Health) and her colleague Osama Ummer (Oxford Policy Management). They presented findings from a study on respectful maternity care in Madhya Pradesh, India. Kerry discussed a typology of survey problems ranging from length, sensitivity, and inappropriate vocabulary to questions that fail to resonate with respondents’ worldview and reality. Their findings cautioned against the use of Likert scales because respondents interpreted the scales poorly: a Likert scale with 5 response options proved too broad, and respondents could not differentiate between, for example, strongly agreeing and agreeing. She also raised pertinent questions about triangulating data through hybrid models of data collection, including qualitative interviews and different respondent types, to understand multiple perspectives.
In the second case study, Sattvika Ashok (IFPRI) presented insights from using cognitive interviewing to assess survey questions on maternal and child nutrition interventions in Madhya Pradesh, India. She highlighted recurring challenges, including misinterpretation of key terms, difficulty understanding technical terms, temporal confusion, and problems with questions containing multiple elements. Her examples demonstrated the value of CI in identifying plausible errors and improving survey questions, especially in large-scale household surveys.
In the third case study, Emily Myers (IFPRI) shared work from a CI conducted in Malawi. She demonstrated how to collect data using anticipated probes and offered guidance on analyzing and interpreting the data. She explained that, in her experience, using Excel to organize verbatim responses facilitates qualitative coding and enables researchers to look for patterns in participants’ responses by sex, age, marital status, language group, and/or geographic area. Uncovering such patterns is key to building a more robust questionnaire: they reveal how sub-groups of respondents interpret questions, and therefore which survey questions should be revised, and how, to enhance accuracy and validity.
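The same subgroup tabulation that Emily described doing in Excel can be sketched in a few lines of code. The sketch below is purely illustrative and assumes a hypothetical dataset: each record pairs a respondent’s attributes with an analyst-assigned interpretation code for one survey item (“intended” versus a named misinterpretation), and a helper counts codes within each level of a chosen attribute to surface subgroup patterns.

```python
from collections import Counter, defaultdict

# Hypothetical coded CI data (not from the actual Malawi study): one record
# per respondent, with subgroup attributes and the interpretation code the
# analyst assigned to that respondent's answer for a single survey item.
responses = [
    {"sex": "F", "age_group": "18-29", "code": "intended"},
    {"sex": "F", "age_group": "30-49", "code": "temporal_confusion"},
    {"sex": "M", "age_group": "18-29", "code": "intended"},
    {"sex": "M", "age_group": "30-49", "code": "temporal_confusion"},
    {"sex": "F", "age_group": "30-49", "code": "temporal_confusion"},
]

def code_counts_by(records, attribute):
    """Tabulate interpretation codes within each level of one attribute."""
    table = defaultdict(Counter)
    for record in records:
        table[record[attribute]][record["code"]] += 1
    # Convert to plain dicts for easy inspection/printing.
    return {level: dict(counts) for level, counts in table.items()}

# Cross-tabulating by age group shows the misinterpretation is concentrated
# in one subgroup, flagging the item for revision for that audience.
print(code_counts_by(responses, "age_group"))
```

In this toy data, every 30–49-year-old respondent shows temporal confusion while younger respondents interpret the item as intended, which is exactly the kind of subgroup signal that would prompt revising the question’s time reference.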
Interactive polls were conducted on Slido. Participants raised several questions, including how to allocate resources for conducting CI, how Likert scales performed in the case studies presented, and issues related to translation from English to local languages. The presenters answered these questions and pointed to further reading and resources about CI. They also directed participants to a Google Group for Cognitive Interviewing in Global Health (CIGH), a platform created to discuss this emerging and valuable method in survey design and research. To join the group, email Kerry at kerry.e.scott@gmail.com.
Alignment with DataDENT activities
DataDENT’s overall aim is to strengthen the nutrition data value chain through strategic work to advance nutrition measurement. With DataDENT’s support, IFPRI conducted cognitive interviews to test mothers’ perceptions and understanding of a set of 25 survey questions on maternal and child nutrition intervention coverage, including questions proposed for DHS-8. The workshop was designed to discuss cognitive interviewing as a critical component of survey design, using case studies and the experiences of researchers from the fields of nutrition, health, and gender.