Survey Nonresponse
Recently Published Documents

Total documents: 79 (last five years: 14)
H-index: 18 (last five years: 1)

2021, Vol. 37(4), pp. 931-953
Author(s): Corinna König, Joseph W. Sakshaug, Jens Stegmaier, Susanne Kohaut

Abstract: Evidence from the household survey literature shows a declining response-rate trend in recent decades, but whether a similar trend exists for voluntary establishment surveys is understudied. This article examines trends in nonresponse rates and nonresponse bias over a period of 17 years in the annual cross-sectional refreshment samples of the IAB Establishment Panel in Germany. In addition, rich administrative data on establishment and employee composition are used to examine changes over time in nonresponse bias and its two main components, refusal and noncontact. Our findings show that response rates dropped by nearly a third, from 50.2% in 2001 to 34.5% in 2017. Over the same period nonresponse bias increased, driven mainly by rising refusal bias, whereas noncontact bias fluctuated around a fairly stable level. With few exceptions, nonresponse biases for individual establishment and employee characteristics showed no distinct pattern over time. Notably, larger establishments participated less frequently than smaller establishments throughout the period, which implies that survey organizations may need to put more effort into recruiting larger establishments to counteract nonresponse bias.
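
The decomposition the abstract relies on can be written down directly: because the full-frame mean is a mixture of the respondent, refusal, and noncontact means, the bias of the respondent-only mean splits into a refusal term and a noncontact term. A minimal pandas sketch of that arithmetic, assuming a frame with hypothetical "status" and "size" columns (the paper's actual administrative variables are not reproduced here):

```python
import pandas as pd

def nonresponse_bias_components(frame: pd.DataFrame, y: str = "size"):
    """Bias of the respondent mean relative to the full-frame mean,
    split as: bias = p_ref*(ybar_r - ybar_ref) + p_nc*(ybar_r - ybar_nc).
    'status' takes values 'respondent', 'refusal', 'noncontact'."""
    n = len(frame)
    stats = frame.groupby("status")[y].agg(["mean", "size"])
    ybar_r = stats.loc["respondent", "mean"]
    # Each component weights the mean gap by that group's frame share.
    refusal = (stats.loc["refusal", "size"] / n) * (ybar_r - stats.loc["refusal", "mean"])
    noncontact = (stats.loc["noncontact", "size"] / n) * (ybar_r - stats.loc["noncontact", "mean"])
    return refusal, noncontact, refusal + noncontact
```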


2021
Author(s): Samantha Estrada

There is a wealth of literature on nonresponse bias, as well as on sampling weights and other methods of adjusting for survey nonresponse; however, there is little research in applied settings such as higher education. Surveys administered to non-enrolled admitted students suffer from nonresponse; specifically, students who are not planning to enroll at a given institution may be less likely to respond to its survey. To fill this gap in the literature, this study uses data from a higher education institution that administers the Confirmed and Regretted Admitted Students Questionnaire (CRASQ) to examine the effects of using sampling weights to correct for nonresponse bias.
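
The CRASQ variables are not public, but the weighting approach being evaluated is standard inverse response-propensity weighting. A hedged sketch under that assumption, with hypothetical "responded" and covariate columns:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def add_nonresponse_weights(frame: pd.DataFrame, covariates: list[str]) -> pd.DataFrame:
    # Fit a response-propensity model on the full admitted-student frame.
    model = LogisticRegression(max_iter=1000)
    model.fit(frame[covariates], frame["responded"])
    p_hat = model.predict_proba(frame[covariates])[:, 1]
    # Respondents are up-weighted by the inverse of their estimated
    # response propensity; nonrespondents get weight 0.
    return frame.assign(weight=np.where(frame["responded"] == 1, 1.0 / p_hat, 0.0))
```

A weighted estimate (e.g., of enrollment intent) is then a weighted average over respondents, so students resembling likely non-enrollees count for more.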


2020
Author(s): Devin Caughey, Adam J. Berinsky, Sara Chatfield, Erin Hartman, Eric Schickler, ...

Healthcare, 2020, Vol. 8(3), Article 100451
Author(s): Joseph A. Simonetti, Walter L. Clinton, Leslie Taylor, Alaina Mori, Stephan D. Fihn, ...

2020
Author(s): Christoph Zangger

Patterns of nonresponse are often geographically correlated. Existing research has recognized this spatial dependence, using tools to map nonresponse or making nonresponse adjustments that draw on geographically aggregated auxiliary information to predict survey participation. Against this background, this contribution asks whether spatial econometric models can improve nonresponse adjustments and provide better propensity weights. Spatial models have the advantage of explicitly modeling the underlying spatial process, for example by including the mutual dependence of neighboring units' response status. Results from Monte Carlo simulations demonstrate that they can indeed yield better estimates of population characteristics, especially when extreme weights are trimmed and when aggregated population shares of unobserved, spatially correlated confounders are available.
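
Two ingredients the abstract highlights can be sketched in a few lines: a spatial lag of neighbors' response status as a propensity-model predictor, and trimming of extreme propensity weights. This is an illustrative sketch, not the paper's estimator; W is assumed to be a row-standardized spatial weights matrix:

```python
import numpy as np

def spatial_lag(W: np.ndarray, response: np.ndarray) -> np.ndarray:
    # Average response status of each unit's neighbours, given a
    # row-standardised spatial weights matrix W.
    return W @ response

def trim_weights(w: np.ndarray, upper_quantile: float = 0.99) -> np.ndarray:
    # Cap extreme inverse-propensity weights at a chosen quantile,
    # trading a little bias for a large reduction in variance.
    return np.minimum(w, np.quantile(w, upper_quantile))
```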


2020, Vol. 36(3), pp. 469-487
Author(s): Annemieke Luiten, Joop Hox, Edith de Leeuw

Abstract: For more than three decades, declining response rates have concerned both survey methodologists and practitioners, yet international comparative studies remain scarce. In one of the first international trend analyses, covering 1980-1997, De Leeuw and De Heer (2002) showed that response rates declined over the years and that countries differed in both response rates and nonresponse trends. In this article, we continue where De Leeuw and De Heer (2002) left off and present trend data from National Statistical Institutes for the period 1998-2015. In this new data set, response rates are still declining. Furthermore, nonresponse trends differ across countries but not across surveys: some countries show a steeper decline in response than others, but all types of surveys show the same downward trend. The differences in (non)response trends across countries can be partly explained by differences in survey design between countries. Finally, cost indicators were available for some countries; these showed that costs increased over the years and are negatively correlated with noncontact rates.
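
The cross-country comparison amounts to fitting a trend per country and comparing slopes. A toy sketch of that step, assuming a hypothetical country-year table (not the authors' data or code):

```python
import numpy as np
import pandas as pd

def response_rate_trends(rates: pd.DataFrame) -> pd.Series:
    # rates: one row per country-year with 'country', 'year', and
    # 'response_rate' columns. Returns each country's fitted linear
    # change in response rate per year (negative = decline).
    return rates.groupby("country").apply(
        lambda g: float(np.polyfit(g["year"], g["response_rate"], 1)[0])
    )
```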


2020, Vol. 36(1), pp. 117-136
Author(s): Oliver Lipps, Marieke Voorpostel

Abstract: Interviewers often assess, after the interview, the respondent's ability and reluctance to participate. Prior research has shown that this evaluation is associated with next-wave response behavior in face-to-face surveys. Our study adds to this research by examining the association in telephone surveys, where an interviewer typically has less information on which to base an assessment. We looked at next-wave participation, non-contact, and refusal, as well as longer-term participation patterns. We found that interviewers were better able to anticipate refusal than non-contact relative to participation, especially in the next wave but also in the longer term. Our findings confirm that interviewer evaluations, in particular of the respondent's reluctance to participate, can help predict response at later waves, even after controlling for commonly used predictors of survey nonresponse. Beyond predicting nonresponse in the short term, interviewer evaluations thus provide useful information for the long term as well, which may be used to improve nonresponse adjustment and responsive designs in longitudinal surveys.
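
The prediction task described here is a standard discrete-outcome model. A minimal sketch, assuming hypothetical column names for the interviewer's rating and the next-wave outcome (participation, refusal, non-contact); the paper's actual specification is not reproduced:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

def fit_next_wave_model(df: pd.DataFrame, predictors: list[str]) -> LogisticRegression:
    # Model next-wave outcome on the interviewer's post-interview
    # evaluation plus commonly used nonresponse predictors; with a
    # three-category outcome, sklearn fits a multinomial model.
    model = LogisticRegression(max_iter=1000)
    model.fit(df[predictors + ["interviewer_reluctance_rating"]],
              df["next_wave_outcome"])
    return model
```

Comparing this model's fit against one without the rating is one way to quantify the evaluations' added predictive value.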


2020, Vol. 243, pp. 823-838
Author(s): Kerry Ratigan, Leah Rabin

Abstract: Has survey nonresponse caused scholars to overestimate political trust in China? We analyse item nonresponse for sensitive questions on trust in government from our original survey of villagers conducted in China in 2012, as well as in four other comparable surveys conducted in China between 1993 and 2014. We examine the association between nonresponse to politically sensitive questions and individual characteristics such as sex, level of education, Party membership and cosmopolitanism. We find that less privileged groups may be underrepresented in survey data generally, and we find mixed results regarding the association between cosmopolitanism and nonresponse. We conclude that our understanding of political trust in China has been compromised by high rates of item nonresponse, leading to artificially high estimates of trust in the centre and exaggerated accounts of the gap between trust in central and local leaders.
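
A back-of-the-envelope way to see the inflation mechanism (not the authors' method): compare the conventional respondent-only trust share with a lower bound that codes all item nonrespondents as non-trusting.

```python
import numpy as np

def trust_estimates(trust: np.ndarray) -> tuple[float, float]:
    # trust: 1 = expresses trust, 0 = does not, np.nan = item nonresponse.
    respondent_only = np.nanmean(trust)          # conventional estimate
    lower_bound = np.nansum(trust) / len(trust)  # nonrespondents coded 0
    return float(respondent_only), float(lower_bound)
```

The wider the gap between the two numbers, the more sensitive the headline trust estimate is to the assumption that item nonrespondents resemble respondents.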

