Effects of Forced Responses and Question Display Styles on Web Survey Response Rates

Author(s):  
Chatpong Tangmanee ◽  
Phattharaphong Niruttinanon

Researchers have increasingly adopted web surveys for data collection. Previous studies have examined factors leading to a web survey’s success, but virtually no empirical work has examined the effects of three levels of forced responses, or of the two styles of question display, on a web survey’s response rate. The current study attempted to fill this void. Using a quasi-experimental approach, we obtained 778 unique responses to six comparable web questionnaires of identical content. The analysis confirmed that (1) there were statistically significant differences in response rates across the surveys with 100%-, 50%- and 0%-forced responses, and (2) there was no significant difference in response rates between surveys using a scrolling style and those using a paging style. In addition to extending theoretical insight into the factors contributing to a web survey’s response rate, the findings offer recommendations for enhancing the response rate of a web survey project.
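
As a rough illustration of how such a comparison can be tested, the sketch below runs a chi-square test of independence on response counts across the three forced-response levels. The per-condition counts are invented placeholders, not the study's data, and SciPy is assumed to be available.

```python
# Hypothetical sketch: do response rates differ across the three forced-response levels?
# The counts below are illustrative placeholders, not the study's actual data.
from scipy.stats import chi2_contingency

# rows: forced-response level (100%, 50%, 0%); columns: [responded, did not respond]
counts = [
    [90, 210],   # 100% forced
    [130, 170],  # 50% forced
    [168, 132],  # 0% forced
]

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```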

1979 ◽  
Vol 16 (3) ◽  
pp. 429-431 ◽  
Author(s):  
Terry L. Childers ◽  
O. C. Ferrell

A 2 × 2 factorial experiment was designed to test the impact of variations in paper trim size and number of printed pages on mail survey response rate. ANOVA findings suggest an 8½ × 11″ paper trim size produces a better response rate than an 8½ × 14″ trim size. Use of a one-sheet (front and back) versus a two-sheet (front only) questionnaire did not cause a significant difference in response rate, and a hypothesized interaction effect was not found to be statistically significant.
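
For readers who want to reproduce this kind of analysis, the sketch below fits a 2 × 2 factorial ANOVA on a per-recipient returned/not-returned indicator using statsmodels. The data frame and cell response probabilities are fabricated for illustration; this is not the original data set or necessarily the exact model the authors used.

```python
# Illustrative 2x2 factorial analysis of a returned (1) / not returned (0) outcome.
# All data below are simulated for demonstration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)
n_per_cell = 100
rows = []
for trim in ("11in", "14in"):
    for sheets in ("one", "two"):
        p = 0.45 if trim == "11in" else 0.35  # assumed cell response probabilities
        rows.append(pd.DataFrame({
            "trim": trim,
            "sheets": sheets,
            "returned": rng.binomial(1, p, n_per_cell),
        }))
data = pd.concat(rows, ignore_index=True)

model = smf.ols("returned ~ C(trim) * C(sheets)", data=data).fit()
print(anova_lm(model, typ=2))  # main effects of trim and sheets, plus their interaction
```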


2021 ◽  
pp. 147078532110509
Author(s):  
Jessica Daikeler ◽  
Henning Silber ◽  
Michael Bošnjak

A major challenge in web-based cross-cultural data collection is varying response rates, which can result in low data quality and non-response bias. Country-specific factors such as political and demographic, economic, and technological conditions, as well as the socio-cultural environment, may affect response rates to web surveys. This study evaluates web survey response rates using meta-analytical methods based on 110 experimental studies from seven countries. Three dependent variables, or effect sizes, are used: the web response rate, the response rate of the comparison survey mode, and the difference between the two response rates. The meta-analysis indicates that four groups of country-specific factors (political and demographic, economic, technological, and socio-cultural) affect the magnitude of web survey response rates. Specifically, web surveys achieve high response rates in countries with high population growth, high internet coverage, and a high propensity to participate in surveys. On the other hand, web surveys are at a disadvantage in countries with a high population age and high cell phone coverage. This study concludes that web surveys can be a reliable alternative to other survey modes because of their consistent response rates and are expected to be used more frequently in national and international settings.
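
A minimal sketch of one standard way to pool such effect sizes, a DerSimonian-Laird random-effects model applied to web-versus-comparison response-rate differences, is given below. The study-level counts are invented, and the code is not the meta-analytic pipeline used by the authors.

```python
# DerSimonian-Laird random-effects pooling of response-rate differences.
# Study-level numbers are invented for illustration only.
import numpy as np

# per study: (web responses, web invitees, comparison responses, comparison invitees)
studies = [(120, 1000, 300, 1000), (80, 500, 150, 500), (45, 300, 90, 300)]

d, v = [], []
for rw, nw, rc, nc in studies:
    pw, pc = rw / nw, rc / nc
    d.append(pw - pc)                                  # response-rate difference
    v.append(pw * (1 - pw) / nw + pc * (1 - pc) / nc)  # sampling variance
d, v = np.array(d), np.array(v)

w = 1 / v                                              # fixed-effect weights
q = np.sum(w * (d - np.sum(w * d) / np.sum(w)) ** 2)   # Cochran's Q
tau2 = max(0.0, (q - (len(d) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
w_re = 1 / (v + tau2)                                  # random-effects weights
pooled = np.sum(w_re * d) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
print(f"pooled rate difference = {pooled:.3f} "
      f"(95% CI {pooled - 1.96*se:.3f} to {pooled + 1.96*se:.3f})")
```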


2020 ◽  
Author(s):  
Elise Braekman ◽  
Stefaan Demarest ◽  
Rana Charafeddine ◽  
Sabine Drieskens ◽  
Finaba Berete ◽  
...  

BACKGROUND Web data collection holds potential for population health surveys due to its cost-effectiveness, ease of implementation, and increasing internet penetration. Nonetheless, web modes may lead to lower and more selective unit response rates than traditional modes and hence may increase bias in the measured indicators. OBJECTIVE This research assesses the unit response rates and costs of a web versus F2F study. METHODS Alongside the F2F Belgian Health Interview Survey of 2018 (BHIS2018; gross sample used: n=7,698), a web survey (BHISWEB; gross sample: n=6,183) was organized. Socio-demographic data on invited individuals were obtained from national register and census linkages. Unit response rates accounting for the different sampling probabilities of the two surveys were calculated. Logistic regression analyses examined the association of mode system (web vs. F2F) and socio-demographic characteristics with unit non-response. The cost per completed web questionnaire was compared with that of a completed F2F questionnaire. RESULTS The unit response rate is lower in BHISWEB (18.0%) than in BHIS2018 (43.1%). A lower web response is found among all socio-demographic groups, but the difference is larger among people older than 65, people with a low education level, people with a non-Belgian nationality, people living alone, and those living in the Brussels-Capital Region. The socio-demographic characteristics associated with non-response also differ between the two studies. Having another European (OR (95% CI): 1.60 (1.20-2.13)) or a non-European nationality (OR (95% CI): 2.57 (1.79-3.70)) (compared with Belgian nationality) and living in the Brussels-Capital (OR (95% CI): 1.72 (1.41-2.10)) or Walloon (OR (95% CI): 1.47 (1.15-1.87)) region (compared with the Flemish region) are associated with higher non-response only in BHISWEB. In BHIS2018, younger people (OR (95% CI): 1.31 (1.11-1.54)) are more likely to be non-respondents than older people; this was not found in BHISWEB. In both studies, people with a low education level have a higher chance of being non-respondents, but this effect is more pronounced in BHISWEB (OR low vs. high education level (95% CI): web 2.71 (2.21-3.39); F2F 1.70 (1.48-1.95)). The BHISWEB study has a considerable cost advantage: the total cost per completed questionnaire is almost three times lower (€41) than for the F2F data collection (€111). CONCLUSIONS The F2F unit response rate is generally higher, yet for certain groups the difference between web and F2F is more limited. A considerable cost advantage of web data collection was found. It is therefore worthwhile to experiment with adaptive mixed-mode designs to optimize financial resources without increasing selection bias, for example, inviting socio-demographic groups that are more eager to participate online to a web survey while continuing to focus on increasing F2F response rates for other groups. CLINICALTRIAL The studies were approved by the Ethics Committee of the University Hospital of Ghent.
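
The sketch below shows the general shape of such a logistic regression of unit non-response on socio-demographic characteristics, reporting odds ratios with 95% CIs via statsmodels. The variable names, categories, and simulated data are illustrative assumptions, not the BHIS2018 or BHISWEB files.

```python
# Illustrative logistic regression of unit non-response on socio-demographics.
# Variable names and data are assumptions for demonstration; not the BHIS data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000
data = pd.DataFrame({
    "nonresponse": rng.binomial(1, 0.6, n),
    "age_group": rng.choice(["15-34", "35-64", "65+"], n),
    "education": rng.choice(["low", "mid", "high"], n),
    "nationality": rng.choice(["Belgian", "other_EU", "non_EU"], n),
    "region": rng.choice(["Flanders", "Brussels", "Wallonia"], n),
})

fit = smf.logit(
    "nonresponse ~ C(age_group) + C(education, Treatment('high')) "
    "+ C(nationality, Treatment('Belgian')) + C(region, Treatment('Flanders'))",
    data=data,
).fit(disp=False)

# exponentiate coefficients to get odds ratios and their 95% CIs
odds_ratios = pd.DataFrame({
    "OR": np.exp(fit.params),
    "2.5%": np.exp(fit.conf_int()[0]),
    "97.5%": np.exp(fit.conf_int()[1]),
})
print(odds_ratios.round(2))
```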


2004 ◽  
Vol 94 (2) ◽  
pp. 444-448 ◽  
Author(s):  
James H. Price ◽  
Faith Yingling ◽  
Eileen Walsh ◽  
Judy Murnan ◽  
Joseph A. Dake

This study assessed differences in response rates to a series of three-wave mail surveys when amiable or insistently worded postcards constituted the third wave of the mailing. Three studies were conducted: one with a sample of 600 health commissioners, one with a sample of 680 vascular nurses, and one with a sample of 600 elementary school secretaries. The combined response rates for the first- and second-wave mailings were 65.8%, 67.6%, and 62.4%, respectively. A total of 308 amiable and 308 insistent postcards were sent randomly to nonrespondents as the third-wave mailing. Overall, 41 amiable and 52 insistent postcards were returned, a difference that was not significant by chi-square test. However, a separate chi-square test for one of the three studies, the nurses' study, did find a significant difference in favor of the insistently worded postcards.
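
A quick way to reproduce the combined comparison is a chi-square test on the 2 × 2 table built from the reported counts (41 of 308 amiable vs 52 of 308 insistent postcards returned); the sketch below does this with SciPy and, consistent with the abstract, yields a non-significant result.

```python
# Chi-square test on the published combined postcard counts.
from scipy.stats import chi2_contingency

table = [
    [41, 308 - 41],   # amiable: returned, not returned
    [52, 308 - 52],   # insistent: returned, not returned
]
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")  # non-significant, consistent with the abstract
```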


2004 ◽  
Vol 95 (2) ◽  
pp. 432-434 ◽  
Author(s):  
Keith A. King ◽  
Jennifer L. Vaughan

This study examined whether survey response rate differed based on the color of the paper the survey was printed on (blue vs green) and the presence of a monetary incentive. A 4-page survey on eating disorders was mailed to Division 1A and 1AA college head athletic trainers (N = 223), with half of the surveys on blue paper and half on green paper. Half of the athletic trainers (n = 111) received a $1.00 monetary incentive, and half (n = 112) received no monetary incentive. A total of 166 (71%) athletic trainers returned completed surveys. Response rates did not differ based on survey color but did differ based on presence of a monetary incentive. Athletic trainers who received a monetary incentive were significantly more likely than those who did not to return completed surveys (86% vs 63%, respectively).
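
The incentive effect can be checked with a two-proportion test, as sketched below. The group counts (95 of 111 with the incentive, 71 of 112 without) are reconstructed from the reported percentages and sample sizes, so they are approximate rather than the study's exact figures.

```python
# Two-proportion z-test for incentive vs no-incentive return rates.
# Counts are reconstructed from the reported 86% / 63% figures (approximate).
from statsmodels.stats.proportion import proportions_ztest

returned = [95, 71]   # incentive, no incentive
mailed = [111, 112]
z, p = proportions_ztest(returned, mailed)
print(f"z = {z:.2f}, p = {p:.4f}")
```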


1980 ◽  
Vol 17 (4) ◽  
pp. 498-502 ◽  
Author(s):  
Chris T. Allen ◽  
Charles D. Schewe ◽  
Gösta Wijk

A field experiment conducted in Sweden compared the effectiveness of two types of telephone pre-calls in influencing response rates in a mail survey. Response rates for a questioning foot-in-the-door manipulation were evaluated against responses produced by a simple solicitation call and a blind mailing control. The results demonstrate that pre-calling in general enhances response rate. However, the results furnish, at best, qualified support for a self-perception theory prediction. Alternative explanations for the lack of the self-perception foot effect are offered. Conclusions are drawn for the practitioner and academic researcher.


2013 ◽  
Vol 29 (2) ◽  
pp. 261-276 ◽  
Author(s):  
Katherine A. McGonagle ◽  
Robert F. Schoeni ◽  
Mick P. Couper

Since 1969, families participating in the U.S. Panel Study of Income Dynamics (PSID) have been sent a mailing asking them to update or verify their contact information in order to keep track of their whereabouts between waves. Having updated contact information prior to data collection is associated with fewer call attempts, less tracking, and lower attrition. Based on these advantages, two experiments were designed to increase response rates to the between-wave contact mailing. The first experiment implemented a new protocol that increased the overall response rate by 7-10 percentage points compared with the protocol in place for decades on the PSID. This article provides results from the second experiment, which examines the basic utility of the between-wave mailing, investigates how incentives affect cooperation with the update request and field effort, and attempts to identify an optimal incentive amount. Recommendations for the use of contact update strategies in panel studies are made.


2017 ◽  
Author(s):  
Ryuhei So ◽  
Kiyomi Shinohara ◽  
Takuya Aoki ◽  
Yasushi Tsujimoto ◽  
Aya M Suganuma ◽  
...  

BACKGROUND Low participation rates are one of the most serious disadvantages of Web-based studies. It is necessary to develop effective strategies to improve participation rates to obtain sufficient data. OBJECTIVE The objective of this trial was to investigate the effect of emphasizing the incentive in the subject line of the invitation email and the day of the week of sending the invitation email on the participation rate in a Web-based trial. METHODS We conducted a 2×2 factorial design randomized controlled trial. We contacted 2000 primary care physicians from members of the Japan Primary Care Association in January 2017 and randomly allocated them to 1 of 4 combinations of 2 subject lines (presence or absence of an emphasis on a lottery for an Amazon gift card worth 3000 yen or approximately US $30) and 2 delivery days (sending the invitation email on Tuesday or Friday). The primary outcome was the response rate defined as the number of participants answering the first page of the questionnaire divided by the number of invitation emails delivered. All outcomes were collected between January 17, 2017, and February 8, 2017. RESULTS We analyzed data from 1943 out of 2000 participants after excluding those whose email addresses were invalid. The overall response rate was 6.3% (123/1943). There was no significant difference in the response rates between the 2 groups regarding incentive in the subject line: the risk ratio was 1.12 (95% CI 0.80 to 1.58) and the risk difference was 0.7% (95% CI –1.5% to 2.9%). Similarly, there was no significant difference in the response rates between the 2 groups regarding sending the email on Tuesday or Friday: the risk ratio was 0.98 (95% CI 0.70 to 1.38) and the risk difference was –0.1% (95% CI –2.3% to 2.1%). CONCLUSIONS Neither emphasizing the incentive in the subject line of the invitation email nor varying the day of the week the invitation email was sent led to a meaningful increase in response rates in a Web-based trial with primary care physicians. CLINICALTRIAL University Hospital Medical Information Network Clinical Trials Registry UMIN000025317; https://upload.umin.ac.jp/cgi-open-bin/ctr_e/ctr_view.cgi?recptno=R000029121 (Archived by WebCite at http://www.webcitation.org/6wOo1jl9t)
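
The sketch below shows how a risk ratio and risk difference with Wald 95% CIs can be computed from two-arm counts of the kind reported here. The arm-level numbers are invented for illustration, since the abstract reports only the overall response rate.

```python
# Risk ratio and risk difference with Wald 95% CIs for a two-arm comparison.
# The arm-level counts are hypothetical; the abstract gives only overall figures.
import math

def rr_rd(x1, n1, x0, n0):
    p1, p0 = x1 / n1, x0 / n0
    rr = p1 / p0
    se_log_rr = math.sqrt(1/x1 - 1/n1 + 1/x0 - 1/n0)
    rr_ci = (rr * math.exp(-1.96 * se_log_rr), rr * math.exp(1.96 * se_log_rr))
    rd = p1 - p0
    se_rd = math.sqrt(p1 * (1 - p1) / n1 + p0 * (1 - p0) / n0)
    rd_ci = (rd - 1.96 * se_rd, rd + 1.96 * se_rd)
    return rr, rr_ci, rd, rd_ci

rr, rr_ci, rd, rd_ci = rr_rd(64, 972, 59, 971)  # hypothetical arm counts
print(f"RR = {rr:.2f} (95% CI {rr_ci[0]:.2f} to {rr_ci[1]:.2f})")
print(f"RD = {rd:.3f} (95% CI {rd_ci[0]:.3f} to {rd_ci[1]:.3f})")
```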


2016 ◽  
Vol 9 (1) ◽  
pp. 1-10 ◽  
Author(s):  
Jared Coopersmith ◽  
Lisa Klein Vogel ◽  
Timothy Bruursema ◽  
Kathleen Feeney
