The Turnout Gap in Surveys: Explanations and Solutions

2018 ◽  
Vol 49 (4) ◽  
pp. 1133-1162 ◽  
Author(s):  
Matthew DeBell ◽  
Jon A. Krosnick ◽  
Katie Gera ◽  
David S. Yeager ◽  
Michael P. McDonald

Postelection surveys regularly overestimate voter turnout by 10 points or more. This article provides the first comprehensive documentation of the turnout gap in three major ongoing surveys (the General Social Survey, Current Population Survey, and American National Election Studies), evaluates explanations for it, interprets its significance, and suggests means to continue evaluating and improving survey measurements of turnout. Accuracy was greater in face-to-face than telephone interviews, consistent with the notion that the former mode engages more respondent effort with less social desirability bias. Accuracy was greater when respondents were asked about the most recent election, consistent with the hypothesis that forgetting creates errors. Question wordings designed to minimize source confusion and social desirability bias improved accuracy. Rates of reported turnout were lower with proxy reports than with self-reports, which may suggest greater accuracy of proxy reports. People who do not vote are less likely to participate in surveys than voters are.

2022 ◽  
Vol 19 (1) ◽  
Author(s):  
Katja Isaksen ◽  
Ingvild Sandøy ◽  
Joseph Zulu ◽  
Andrea Melberg ◽  
Sheena Kabombwe ◽  
...  

Abstract

Background: Numerous studies have documented inconsistent reporting of sexual behaviour by adolescents. Self-reported data on issues considered sensitive, incriminating, or embarrassing are prone to social desirability bias. Some studies have found that Audio Computer-Assisted Self-Interviewing (ACASI), which removes the personal interaction involved in face-to-face interviews, decreases item non-response and increases reporting of sensitive behaviours, but others have found inconsistent or contradictory results. To reduce social desirability bias in the reporting of sensitive behaviours, face-to-face interviews were combined with ACASI in a cluster-randomized trial involving adolescents in Zambia.

Methods: To explore adolescent girls’ experiences and opinions of being interviewed about sexual and reproductive health, we combined focus group discussions with girl participants and individual semi-structured interviews with teachers. This study was done after the participants had been interviewed for the sixth time since recruitment. Young female research assistants who had conducted interviews for the trial were also interviewed for this study.

Results: Respondents explained that they often felt shy, embarrassed, or uncomfortable when asked face-to-face questions about sex, pregnancy, and abortion. Questions on sexual activity elicited feelings of shame, and teachers, research assistants, and girls alike noted that direct questions about sexual activities limit what the participating girls may be willing to share. Responding to more indirect questions framed within the context of a romantic relationship was slightly easier. Efforts by interviewers to signal that they did not judge the participants for their behaviour, together with growing familiarity with the interviewer, reduced discomfort over time. Although some girls appreciated the opportunity to respond to questions on their own, the privacy offered by ACASI also provided an opportunity to give false answers. Answering on tablets could be challenging, but participants were reluctant to ask for assistance for fear of being judged as not conversant with technology.

Conclusion: Strategies that avoid overly direct language and descriptive words, ask questions within the context of a romantic relationship, and focus on establishing familiarity and trust can reduce reporting bias. For the use of ACASI, consideration must be given to the context and characteristics of the study population.


Author(s):  
Mary Kay Gugerty ◽  
Dean Karlan

Without high-quality data, even the best-designed monitoring and evaluation systems will collapse. Chapter 7 introduces some of the basics of collecting high-quality data and discusses how to address challenges that frequently arise. High-quality data must be clearly defined and have an indicator that validly and reliably measures the intended concept. The chapter then explains how to avoid common biases and measurement errors such as anchoring, social desirability bias, the experimenter demand effect, unclear wording, long recall periods, and translation context. It then guides organizations on how to find indicators, test data collection instruments, manage surveys, and train staff appropriately for data collection and entry.


2021 ◽  
pp. 1-18
Author(s):  
Endra Iraman ◽  
Yoshikuni Ono ◽  
Makoto Kakinaka

Abstract

Identifying taxpayers who engage in noncompliant behaviour is crucial for tax authorities seeking to design appropriate taxation schemes. However, because taxpayers have an incentive to conceal their true income, it is difficult for tax authorities to uncover such behaviour through direct questioning (social desirability bias). Our study mitigates this bias in responses to sensitive questions by employing the list experiment technique, which allows us to identify the characteristics of taxpayers who engage in tax evasion. Using a dataset obtained from a tax office in Jakarta, Indonesia, we conducted a computer-assisted telephone interviewing survey in 2019. Our results revealed that 13% of the taxpayers had reported lower income than their true income on their tax returns, and that underreporting was concentrated among older taxpayers, men, corporate employees, and members of a certain ethnic group. These findings suggest that our research design can be a useful tool for understanding tax evasion and for developing effective taxation schemes that promote tax compliance.
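The list experiment (item count) technique mentioned above can be illustrated with a minimal sketch. A control group counts how many of J innocuous statements apply to them; a treatment group counts the same statements plus the sensitive one (here, underreporting income). Because respondents report only a count, no individual ever admits the sensitive behaviour, yet the difference in group means estimates its prevalence. The data below are fabricated for illustration only, not from the study.

```python
# List experiment difference-in-means estimator (illustrative sketch).
# Control respondents see J = 3 baseline items; treatment respondents see
# the same 3 items plus a sensitive item ("I reported lower income than
# my true income"). Each value is one respondent's count of applicable items.
from statistics import mean

control = [2, 1, 3, 2, 2, 1, 3, 2, 2, 1]    # counts over 3 baseline items
treatment = [2, 2, 3, 3, 2, 1, 4, 2, 3, 2]  # counts over 3 baseline + 1 sensitive item

# Estimated prevalence of the sensitive behaviour: mean(T) - mean(C).
prevalence = mean(treatment) - mean(control)
print(f"Estimated prevalence of sensitive behaviour: {prevalence:.2f}")
```

With these fabricated counts the estimator yields 0.50, i.e. an estimated 50% prevalence; the study's 13% figure would come from the same logic applied to its survey data, typically with regression-based extensions to recover subgroup characteristics.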

