Design and Analysis of Cognitive Interviews for Comparative Multinational Testing

Field Methods, 2011, Vol. 23(4), pp. 379-396
Author(s): Kristen Miller, Rory Fitzgerald, José-Luis Padilla, Stephanie Willson, Sally Widdop, ...

This article summarizes the work of the Comparative Cognitive Testing Workgroup, an international coalition of survey methodologists interested in developing an evidence-based methodology for examining the comparability of survey questions within cross-cultural or multinational contexts. To meet this objective, it was necessary to ensure that the cognitive interviewing (CI) method itself did not introduce method bias. Therefore, the workgroup first identified specific characteristics inherent in CI methodology that could undermine the comparability of CI evidence. The group then developed and implemented a protocol addressing those issues. In total, 135 cognitive interviews were conducted by participating countries. Through the process, the group identified various interpretive patterns resulting from sociocultural and language-related differences among countries as well as other patterns of error that would impede comparability of survey data.

Field Methods, 2011, Vol. 23(4), pp. 362-378
Author(s): Patricia L. Goerman, Matthew Clifton

Cognitive interviewing (CI) is a pretesting technique that elicits respondents’ interpretations of survey questions as a means to evaluate and revise them. Vignettes are sometimes used as a part of the cognitive testing method. There has been little research on using vignettes in the testing of survey translations. This article examines the use of vignettes in two Spanish and English pretesting projects at the U.S. Census Bureau. The authors examine findings across English and Spanish cases in the two studies and discuss areas for future research.


2021, pp. 004912412110312
Author(s): Cornelia E. Neuert, Katharina Meitinger, Dorothée Behr

The method of web probing integrates cognitive interviewing techniques into web surveys and is increasingly used to evaluate survey questions. In a typical web probing scenario, probes are administered immediately after the question to be tested (concurrent probing), usually as open-ended questions. Probes can also be administered in a closed format, with response categories developed during previously conducted qualitative cognitive interviews. Closed probes offer several benefits, such as reduced cost and greater time efficiency, because they do not require manual coding of open-ended responses. In this article, we investigate whether the insights into item functioning gained from closed probes are comparable to those gained from open-ended probes, and whether closed probes are equally suited to capturing the cognitive processes that open-ended probes are traditionally intended to capture. The findings reveal statistically significant differences in the variety of themes, the patterns of interpretation, the number of themes per respondent, and nonresponse. No differences in the number of themes across formats by sex or educational level were found.
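The comparison the authors describe is essentially quantitative: coded themes per respondent and nonresponse rates under each probe format. A minimal sketch of such an analysis follows, in Python; the counts and the choice of a Mann-Whitney U test are illustrative assumptions, not the authors' actual data or procedure.

# Illustrative comparison of open vs. closed web probes.
# All counts are hypothetical; a real analysis would use coded probe responses.
from scipy.stats import mannwhitneyu

# Distinct themes each respondent produced under each format (hypothetical).
open_probe_themes = [3, 2, 4, 1, 3, 2, 5, 0, 2, 3]
closed_probe_themes = [1, 2, 1, 3, 2, 1, 2, 1, 2, 1]

# Nonparametric test: do the formats differ in themes per respondent?
stat, p = mannwhitneyu(open_probe_themes, closed_probe_themes,
                       alternative="two-sided")
print(f"Mann-Whitney U = {stat:.1f}, p = {p:.3f}")

# Item nonresponse per format (hypothetical counts out of 100 respondents).
print(f"Nonresponse: open {12 / 100:.0%}, closed {3 / 100:.0%}")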


SAGE Open, 2016, Vol. 6(4), pp. 215824401667177
Author(s): Jennifer Edgar, Joe Murphy, Michael Keating

Cognitive interviewing is a common method used to evaluate survey questions. This study compares traditional cognitive interviewing methods with crowdsourcing, or “tapping into the collective intelligence of the public to complete a task.” Crowdsourcing may give researchers access to a diverse pool of potential participants in a timely and cost-efficient way. Exploratory work found that crowdsourced participants completing self-administered protocols may be a viable alternative, or addition, to traditional pretesting methods. Using three crowdsourcing designs (TryMyUI, Amazon Mechanical Turk, and Facebook), we compared participant characteristics, costs, and the quantity and quality of data with those of traditional laboratory-based cognitive interviews. Results suggest that crowdsourcing with self-administered protocols may be a viable way to collect survey pretesting information, as participants were able to complete the tasks and provide useful information; however, complex tasks may require an interviewer's skills to administer unscripted probes.
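The cost side of such a comparison reduces to simple arithmetic once fielding costs and completed-interview counts are known. The sketch below, in Python, shows the cost-per-complete calculation; every figure is an invented placeholder, not a number reported in the article.

# Hypothetical cost-per-complete comparison across pretesting approaches.
# All figures are placeholders; the study reports its own costs and yields.
approaches = {
    "Lab cognitive interviews": {"total_cost": 4000.0, "completes": 20},
    "Amazon Mechanical Turk":   {"total_cost": 150.0,  "completes": 50},
    "TryMyUI":                  {"total_cost": 500.0,  "completes": 25},
    "Facebook":                 {"total_cost": 300.0,  "completes": 30},
}

for name, a in approaches.items():
    per_complete = a["total_cost"] / a["completes"]
    print(f"{name}: ${per_complete:,.2f} per completed pretest")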


2021, Vol. 45(1), pp. 81-94
Author(s): Julie M. Maier, Kristen N. Jozkowski, Danny Valdez, Brandon L. Crawford, Ronna C. Turner, ...

Objectives: Salient belief elicitations (SBEs), informed by the Reasoned Action Approach (RAA), are used to identify three sets of beliefs (behavioral, control, and normative) that influence attitudes toward a health behavior. SBEs ask participants about their own beliefs through open-ended questions. We adapted an SBE to focus on abortion, a topic infrequently examined through SBEs; we also included a survey version that asked participants what a hypothetical woman would do if contemplating an abortion. Given these deviations from traditional SBEs, the purpose of this study was to assess, through cognitive interviewing, whether participants understood the adapted SBE in English and Spanish. Methods: We examined participants' interpretations of SBE items about abortion to determine whether they aligned with the corresponding RAA construct. We administered SBE surveys and conducted cognitive interviews with US adults in both English and Spanish. Results: Participants comprehended the SBE questions as intended. Participants' interpretations of most questions were also in line with the respective RAA construct. Conclusions: SBE survey questions were well comprehended by participants. We discuss ways in which SBE questions can be modified to improve alignment with the underlying RAA construct when assessing abortion beliefs.
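The Results hinge on whether each participant's interpretation of an item matched the RAA construct the item was written to measure. A hedged sketch of that alignment check follows, in Python; the item names, intended constructs, and coder judgments are all hypothetical, not the study's instrument or data.

# Hypothetical alignment check between participants' interpretations of SBE
# items and the RAA construct each item was intended to measure.
intended = {
    "Q1_advantages": "behavioral",  # assumed item-construct mapping
    "Q2_approval":   "normative",
    "Q3_barriers":   "control",
}

# Coder judgments from cognitive interviews (hypothetical).
interpretations = [
    {"item": "Q1_advantages", "coded_as": "behavioral"},
    {"item": "Q2_approval",   "coded_as": "normative"},
    {"item": "Q3_barriers",   "coded_as": "behavioral"},  # a misalignment
]

aligned = sum(1 for r in interpretations
              if intended[r["item"]] == r["coded_as"])
print(f"Construct alignment: {aligned}/{len(interpretations)} "
      f"({aligned / len(interpretations):.0%})")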


Field Methods, 2011, Vol. 23(4), pp. 331-341
Author(s): Gordon B. Willis, Kristen Miller

Cognitive interviewing (CI) has emerged as a key qualitative method for the pretesting and evaluation of self-report survey questionnaires. This article defines CI, describes its key features, and outlines the data analysis techniques that are commonly used. The authors then consider recent extensions of cognitive testing to cross-cultural survey research, where the major practical objectives are: (1) to facilitate the inclusion of a range of cultural and linguistic groups and (2) for purposes of comparative analysis, to produce survey questionnaire items that exhibit comparability of measurement across groups. Challenges presented by this extension to cross-cultural and multilingual settings are discussed. Finally, the authors introduce the articles in this special issue of Field Methods (2011), which apply cognitive testing in specific cross-cultural survey projects and identify, and suggest solutions to, the problems that questionnaire designers and researchers face as survey pretesting and evaluation methods extend across the sociocultural spectrum.


Methodology, 2013, Vol. 9(3), pp. 123-128
Author(s): Gordon Willis, Hennie Boeije

Based on the experiences of three research groups using and evaluating the Cognitive Interviewing Reporting Framework (CIRF), we draw conclusions about the utility of the CIRF as a guide to creating cognitive testing reports. Authors generally found the CIRF checklist usable and felt that it led to a more complete description of the key steps involved. However, despite the CIRF's explicit direction to fully explain major steps and features (e.g., research objectives and research design), the three cognitive testing reports tended simply to state what was done, without further justification. Authors varied in their judgments about whether the CIRF requires the appropriate level of detail. Overall, we believe that current cognitive interviewing practice will benefit from including, within cognitive testing reports, the 10 categories of information specified by the CIRF. Future use of the CIRF may serve to direct an overall research project from the start and to further the goal of evaluating specific cognitive interviewing procedures.
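Used as a checklist, the CIRF lends itself to a mechanical completeness check of a testing report. The sketch below, in Python, assumes only the two categories the abstract names (research objectives and research design); the remaining eight CIRF categories are deliberately left as a placeholder rather than invented here.

# Completeness check of a cognitive testing report against a CIRF-style
# checklist. Only the two categories named in the abstract are listed; the
# other eight should be taken from the CIRF itself (Willis & Boeije).
CIRF_CATEGORIES = [
    "research objectives",
    "research design",
    # ... eight further CIRF categories complete the list of ten.
]

def missing_sections(report_sections):
    """Return checklist categories not covered by the report."""
    return [c for c in CIRF_CATEGORIES if c not in report_sections]

report = {"research objectives", "findings"}
print("Missing:", missing_sections(report))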

