A Meta-Analysis of How Country-Level Factors Affect Web Survey Response Rates

2021 ◽  
pp. 147078532110509
Author(s):  
Jessica Daikeler ◽  
Henning Silber ◽  
Michael Bošnjak

A major challenge in web-based cross-cultural data collection is varying response rates, which can result in low data quality and non-response bias. Country-specific factors, such as political and demographic, economic, and technological conditions as well as the socio-cultural environment, may affect response rates to web surveys. This study evaluates web survey response rates using meta-analytic methods based on 110 experimental studies from seven countries. Three dependent variables (effect sizes) are used: the web survey response rate, the response rate of the comparison survey mode, and the difference between the two. The meta-analysis indicates that all four country-specific factor groups (political and demographic, economic, technological, and socio-cultural) affect the magnitude of web survey response rates. Specifically, web surveys achieve high response rates in countries with high population growth, high internet coverage, and a high survey participation propensity. On the other hand, web surveys are at a disadvantage in countries with a high population age and high cell phone coverage. The study concludes that web surveys can be a reliable alternative to other survey modes due to their consistent response rates and are expected to be used more frequently in both national and international settings.

2019 ◽  
Vol 8 (3) ◽  
pp. 513-539 ◽  
Author(s):  
Jessica Daikeler ◽  
Michael Bošnjak ◽  
Katja Lozar Manfreda

Abstract Do web surveys still yield lower response rates compared with other survey modes? To answer this question, we replicated and extended a meta-analysis done in 2008 which found that, based on 45 experimental comparisons, web surveys had an 11 percentage points lower response rate compared with other survey modes. Fundamental changes in internet accessibility and use since the publication of the original meta-analysis would suggest that people’s propensity to participate in web surveys has changed considerably in the meantime. However, in our replication and extension study, which comprised 114 experimental comparisons between web and other survey modes, we found almost no change: web surveys still yielded lower response rates than other modes (a difference of 12 percentage points in response rates). Furthermore, we found that prenotifications, the sample recruitment strategy, the survey’s solicitation mode, the type of target population, the number of contact attempts, and the country in which the survey was conducted moderated the magnitude of the response rate differences. These findings have substantial implications for web survey methodology and operations.
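The pooled mode difference reported in such meta-analyses is, at its core, an inverse-variance weighted average of per-study response-rate differences. A minimal sketch with invented comparison counts (not the study's actual data) illustrates the computation:

```python
import math

# Illustrative (invented) mode comparisons: each tuple is
# (web respondents, web sample n, other-mode respondents, other-mode n)
studies = [
    (120, 400, 180, 400),
    (90, 300, 130, 300),
    (200, 500, 260, 500),
]

weights, diffs = [], []
for r_web, n_web, r_other, n_other in studies:
    p_web, p_other = r_web / n_web, r_other / n_other
    d = p_web - p_other  # per-study response-rate difference
    # variance of a difference of two independent proportions
    var = p_web * (1 - p_web) / n_web + p_other * (1 - p_other) / n_other
    diffs.append(d)
    weights.append(1 / var)  # inverse-variance weight

pooled = sum(w * d for w, d in zip(weights, diffs)) / sum(weights)
se = math.sqrt(1 / sum(weights))
print(f"pooled difference: {pooled * 100:.1f} pp "
      f"(95% CI {100 * (pooled - 1.96 * se):.1f} to "
      f"{100 * (pooled + 1.96 * se):.1f})")
```

With these invented counts the web mode trails by roughly 13 percentage points, in the same direction as the meta-analytic finding; a full analysis would additionally model between-study heterogeneity and the moderators listed above.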


Author(s):  
Chatpong Tangmanee ◽  
Phattharaphong Niruttinanon

Researchers have increasingly adopted web surveys for data collection. Previous studies have examined factors leading to a web survey’s success. However, virtually no empirical work has examined the effects of three levels of forced responses, or of two styles of question-item display, on a web survey’s response rate. The current study attempted to fill this void. Using a quasi-experimental approach, we obtained 778 unique responses to six comparable web questionnaires of identical content. The analysis confirmed that (1) there were statistically significant differences in response rates across surveys with 100%-, 50%-, and 0%-forced responses, and (2) there was no significant difference in response rates between surveys with scrolling and those with paging styles. In addition to extending theoretical insight into factors contributing to a web survey’s response rate, the findings offer recommendations to enhance the response rate of a web survey project.


Cephalalgia ◽  
2021 ◽  
pp. 033310242110181
Author(s):  
Florian Frank ◽  
Hanno Ulmer ◽  
Victoria Sidoroff ◽  
Gregor Broessner

Background: The approval of monoclonal antibodies for the prevention of migraine has revolutionized treatment for patients. Oral preventatives are still considered first-line treatments, as head-to-head trials comparing them with antibodies are lacking. Methods: The main purpose of this study was to provide a comparative overview of the efficacy of three commonly prescribed classes of migraine preventative medication. For this systematic review and meta-analysis, we searched the CENTRAL, EMBASE, and MEDLINE databases up to 20 March 2020. We included RCTs reporting 50% response rates for topiramate, botulinum toxin type A, and monoclonal antibodies against CGRP or its receptor. Studies were excluded if response rates were not reported, treatment allocation was unclear, or study quality was insufficient. The primary outcome measure was the 50% response rate. Pooled odds ratios with 95% confidence intervals were calculated with a random-effects model. The study was registered at PROSPERO (CRD42020222880). Findings: We identified 6552 reports, of which thirty-two were eligible for our review. Studies assessing monoclonal antibodies included 13,302 patients and yielded a pooled odds ratio for the 50% response rate of 2.30 (CI: 2.11–2.50). Topiramate had an overall effect estimate of 2.70 (CI: 1.97–3.69) with 1989 included patients, and botulinum toxin type A achieved 1.28 (CI: 0.98–1.67) with 2472 patients included. Interpretation: Topiramate, botulinum toxin type A, and monoclonal antibodies showed higher odds ratios for achieving a 50% response rate compared with placebo. Topiramate numerically demonstrated the greatest effect size but also the highest drop-out rate.
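The pooled odds ratios above come from a random-effects model. A minimal DerSimonian-Laird sketch, using illustrative (invented) per-trial log odds ratios and standard errors rather than the review's data, shows how such a pooled OR and confidence interval are obtained:

```python
import math

# Illustrative (invented) trial effects, not data from the review:
# each study contributes a log odds ratio and its standard error.
log_ors = [math.log(1.5), math.log(3.2), math.log(1.8), math.log(2.8)]
ses = [0.15, 0.20, 0.25, 0.18]

# fixed-effect (inverse-variance) estimate, needed for the Q statistic
w_fixed = [1 / se**2 for se in ses]
theta_fixed = sum(w * y for w, y in zip(w_fixed, log_ors)) / sum(w_fixed)

# DerSimonian-Laird estimate of the between-study variance tau^2
q = sum(w * (y - theta_fixed) ** 2 for w, y in zip(w_fixed, log_ors))
c = sum(w_fixed) - sum(w**2 for w in w_fixed) / sum(w_fixed)
tau2 = max(0.0, (q - (len(log_ors) - 1)) / c)

# random-effects weights add tau^2 to each study's sampling variance
w_re = [1 / (se**2 + tau2) for se in ses]
theta_re = sum(w * y for w, y in zip(w_re, log_ors)) / sum(w_re)
se_re = math.sqrt(1 / sum(w_re))

or_pooled = math.exp(theta_re)
ci_low = math.exp(theta_re - 1.96 * se_re)
ci_high = math.exp(theta_re + 1.96 * se_re)
print(f"pooled OR {or_pooled:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```

Pooling on the log scale and exponentiating at the end keeps the confidence interval asymmetric around the OR, as in the intervals reported above; when the studies are homogeneous (Q below its degrees of freedom), tau^2 truncates to zero and the estimate reduces to the fixed-effect one.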


2018 ◽  
Vol 37 (6) ◽  
pp. 750-765 ◽  
Author(s):  
Joseph W. Sakshaug ◽  
Basha Vicari ◽  
Mick P. Couper

Identifying strategies that maximize participation rates in population-based web surveys is of critical interest to survey researchers. While much of this interest has focused on surveys of persons and households, there is growing interest in surveys of establishments. However, there is a lack of experimental evidence on strategies for optimizing participation rates in web surveys of establishments. To address this research gap, we conducted a contact mode experiment in which establishments selected for a web survey were randomized to receive the survey invitation (with login details) and a subsequent reminder using a fully crossed sequence of paper and e-mail contacts. We find that a paper invitation followed by a paper reminder achieves the highest response rate and the smallest aggregate nonresponse bias across all possible paper/e-mail contact sequences, but a close runner-up was the e-mail invitation and paper reminder sequence, which achieved a similarly high response rate and low aggregate nonresponse bias at about half the per-respondent cost. Following up undeliverable e-mail invitations with supplementary paper contacts yielded further reductions in nonresponse bias and costs. Finally, for establishments without an available e-mail address, we show that enclosing an e-mail address request form with a prenotification letter is not effective from a response rate, nonresponse bias, or cost perspective.


2004 ◽  
Vol 95 (2) ◽  
pp. 432-434 ◽  
Author(s):  
Keith A. King ◽  
Jennifer L. Vaughan

This study examined whether survey response rate differed based on the color of the paper the survey was printed on (blue vs green) and presence of a monetary incentive. A 4-page survey on eating disorders was mailed to Division 1A and 1AA college head athletic trainers ( N = 223) with half of the surveys on blue paper and half on green paper. Half of the athletic trainers ( n = 111) received a $1.00 monetary incentive, and half ( n = 112) received no monetary incentive. A total of 166 (71%) athletic trainers returned completed surveys. Response rates did not differ based on survey color but did differ based on presence of a monetary incentive. Athletic trainers who received a monetary incentive were significantly more likely than those who did not to return completed surveys (86% vs 63%, respectively).
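The reported incentive effect (86% vs. 63%) can be checked with a standard two-proportion z-test. The counts below are inferred from the reported percentages and group sizes (an assumption, not figures taken from the paper); they reproduce the 166 total returns:

```python
import math

# Counts inferred from the reported rates (assumption): roughly
# 95 of 111 incentive recipients and 71 of 112 non-recipients responded.
x1, n1 = 95, 111   # returned surveys, $1.00 incentive group
x2, n2 = 71, 112   # returned surveys, no-incentive group

p1, p2 = x1 / n1, x2 / n2
p_pool = (x1 + x2) / (n1 + n2)  # pooled proportion under H0: p1 == p2
z = (p1 - p2) / math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
print(f"{p1:.0%} vs {p2:.0%}, z = {z:.2f}")  # |z| > 1.96 -> significant at 5%
```

With these inferred counts the test statistic is well beyond the 1.96 critical value, consistent with the significant incentive effect the study reports.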


2020 ◽  
pp. 089443932091831 ◽  
Author(s):  
Fabian Kalleitner ◽  
Monika Mühlböck ◽  
Bernhard Kittel

Traditional survey research faces declining response rates due to changing cultural habits and technological developments. Researchers have developed novel approaches to increase respondents’ likelihood of participating in web surveys. However, we lack information about whether these methods indeed increase response rates and, if so, whether they bias the resulting data. This article focuses on the use of nonmaterial incentives in the form of a video that provides the invitees with information tailored to their life situation. Analysis of our experimental data shows that instead of increasing respondents’ probability of starting the survey, the video treatments actually decrease it. We provide evidence that the lower salience of the intrinsic benefits of survey participation in the invitation email significantly contributes to this reduction. Additionally, the effect of the nonmaterial incentive differs across subgroups, affecting nonresponse biases in line with employment status, gender, and migration background. We therefore conclude that using additional information in the form of a video as a nonmaterial survey incentive is only suitable under specific conditions.


1980 ◽  
Vol 17 (4) ◽  
pp. 498-502 ◽  
Author(s):  
Chris T. Allen ◽  
Charles D. Schewe ◽  
Gösta Wijk

A field experiment conducted in Sweden compared the effectiveness of two types of telephone pre-calls in influencing response rates in a mail survey. Response rates for a questioning foot-in-the-door manipulation were evaluated against responses produced by a simple solicitation call and a blind mailing control. The results demonstrate that pre-calling in general enhances response rate. However, the results furnish, at best, qualified support for a self-perception theory prediction. Alternative explanations for the lack of the self-perception foot effect are offered. Conclusions are drawn for the practitioner and academic researcher.


2003 ◽  
Vol 33 (1) ◽  
pp. 29-40 ◽  
Author(s):  
James Hartley ◽  
Andrew Rutherford

Many people have speculated over the last 80 years or so about the possibility of using colored paper to boost response rates to surveys and questionnaires, and several studies have been carried out. Most of these enquiries report no significant effects of using colored paper, although there have been some exceptions. In this investigation we pooled the results from all of the experimental studies known to us on the topic and carried out a meta-analysis to see whether there might be a positive effect for colored paper overall. The results indicated that this was not the case: we found no significant differences between the response rates to white and to colored paper in general. However, when we considered the most common colors separately, it appeared that pink paper had the greatest effect.


2020 ◽  
Vol 30 (6) ◽  
pp. 1763-1781
Author(s):  
Louisa Ha ◽  
Chenjie Zhang ◽  
Weiwei Jiang

Purpose: Low response rates in web surveys and the use of different devices to enter web survey responses are the two main challenges to web survey response quality. The purpose of this study is to compare the effects of using interviewers to recruit participants in computer-assisted self-administered interviews (CASI) vs computer-assisted personal interviews (CAPI), and of smartphones vs computers, on participation rate and web survey response quality.
Design/methodology/approach: Two field experiments using two similar media use studies of US college students were conducted to compare response quality across survey modes and response devices.
Findings: Response quality of computer entry was better than smartphone entry in both studies for open-ended and closed-ended question formats. The device effect was only significant on overall completion rate when interviewers were present.
Practical implications: Survey researchers are given guidance on how to conduct online surveys using different devices and question formats to maximize response quality. The benefits and limitations of using an interviewer to recruit participants and of smartphones as web survey response devices are discussed.
Social implications: The study shows how computer-assisted self-interviews and smartphones can improve response quality and participation for underprivileged groups.
Originality/value: This is the first study to compare response quality in different question formats between CASI, e-mail-delivered online surveys, and CAPI. It demonstrates the importance of the human factor in creating a sense of obligation that improves response quality.

