

  1. When More Gets You Less: A Meta-Analysis of the Effect of Concurrent Web Options on Mail Survey Response Rates
  Jenna Fulton and Rebecca Medway
  Joint Program in Survey Methodology, University of Maryland
  May 19, 2012

  2. Background: Mixed-Mode Surveys
  • Growing use of mixed-mode surveys among practitioners
  • Potential benefits for cost, coverage, and response rate
  • One specific mixed-mode design – mail + Web – is often used in an attempt to increase response rates
  • Advantages: both are self-administered modes and likely have similar measurement error properties
  • Two strategies for administration:
    • “Sequential” mixed-mode: one mode in initial contacts, switch to the other in later contacts; benefits response rates relative to a mail-only survey
    • “Concurrent” mixed-mode: both modes offered simultaneously in all contacts; mixed effects on response rates relative to a mail-only survey

  3. Methods: Meta-Analysis
  • Given the mixed results in the literature, we conducted a meta-analysis to:
    • Estimate the effect of concurrent Web options on mail survey response rates
    • Evaluate whether study features influence the size of the effect
  • Search for studies:
    • Searched journals and conference abstracts
    • Posted messages to AAPOR and ASA listservs
  • Eligible studies:
    • Randomly assigned respondents to either a “mail-only” condition or a “mode choice” condition (offered mail and Web options concurrently)
    • Both conditions included the same survey items and the same incentive (if offered), made all contacts by mail, and did not encourage response by a particular mode

  4. Methods: Effect Size
  • Odds ratios (ORs) quantify the relationship between response rates in the mail-only and mode choice conditions
  • Used the response rate for the mail-only condition as the reference
  • To calculate the overall OR, we weighted study-level ORs by the inverse of their variance (see the sketch below)
  • Interpretation of ORs:
    • OR < 1: adding a Web option to a mail survey has a negative impact on response rates
    • OR = 1: adding a Web option has no effect
    • OR > 1: adding a Web option has a positive impact
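
A minimal sketch of the calculation described above, using hypothetical 2x2 response counts (none of the numbers below come from the eligible studies). It computes each study-level OR with the mail-only condition as the reference, the large-sample variance of the log OR, and a simple inverse-variance weighted (fixed-effect) pooled OR; a random-effects version, which the authors actually used, is sketched after the next slide.

```python
import math

# Hypothetical 2x2 counts for illustration only (not from the eligible studies).
# Each entry: (responded, did not respond) for the mode-choice and mail-only arms.
studies = [
    {"choice": (480, 520), "mail": (530, 470)},
    {"choice": (95, 105), "mail": (110, 90)},
    {"choice": (300, 700), "mail": (340, 660)},
]

log_ors, weights = [], []
for s in studies:
    a, b = s["choice"]  # responded / not responded in the mode-choice arm
    c, d = s["mail"]    # responded / not responded in the mail-only arm
    log_or = math.log((a * d) / (b * c))  # log OR, mail-only as reference
    var = 1 / a + 1 / b + 1 / c + 1 / d   # large-sample variance of the log OR
    log_ors.append(log_or)
    weights.append(1 / var)               # inverse-variance weight

# Fixed-effect pooled estimate: weighted mean of log ORs, then exponentiate.
pooled = sum(w * lo for w, lo in zip(weights, log_ors)) / sum(weights)
print(f"Pooled OR = {math.exp(pooled):.3f}")  # OR < 1: Web option hurts response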

  5. Methods: Moderator Analyses
  • Used moderator analyses to determine whether study characteristics affected the magnitude of the effect size
  • Adding a Web option was hypothesized to be more effective for studies with these characteristics:
    • Greater % of respondents selecting Web
    • Young people as the target population
    • Published study
  • These characteristics were hypothesized to increase motivation to complete the survey, reducing the difference between conditions:
    • Government sponsorship
    • Required participation
    • Incentive
    • Salient topic
  • Conducted in Comprehensive Meta-Analysis software using a random-effects model (a sketch of random-effects pooling follows below)
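
The slide names Comprehensive Meta-Analysis and a random-effects model without further detail. Below is a minimal sketch of DerSimonian-Laird random-effects pooling, one standard estimator for this kind of model; the slide does not say which estimator the software used, and the log ORs and variances in the example are hypothetical.

```python
import math

def random_effects_pool(log_ors, variances):
    """DerSimonian-Laird random-effects pooling of study-level log odds ratios."""
    k = len(log_ors)
    w = [1 / v for v in variances]  # fixed-effect (inverse-variance) weights
    fe = sum(wi * lo for wi, lo in zip(w, log_ors)) / sum(w)
    # Cochran's Q: weighted squared deviations from the fixed-effect mean.
    q = sum(wi * (lo - fe) ** 2 for wi, lo in zip(w, log_ors))
    # Between-study variance estimate tau^2, truncated at zero.
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)
    # Random-effects weights add tau^2 to each within-study variance.
    w_re = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * lo for wi, lo in zip(w_re, log_ors)) / sum(w_re)
    return math.exp(pooled), tau2

# Hypothetical log ORs and variances, for illustration only.
pooled_or, tau2 = random_effects_pool(
    [math.log(0.85), math.log(0.95), math.log(0.70)],
    [0.010, 0.020, 0.015],
)
print(f"Random-effects pooled OR = {pooled_or:.3f}, tau^2 = {tau2:.4f}")
```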

  6. Results: Eligible Studies
  • Search produced 19 eligible experimental comparisons
  • All studies conducted during or after 2000
  • Choice response rate lower than mail-only response rate for almost all comparisons

                                               Min       Max      Mean    Median
    Mail-only sample size                      139   212,072    17,547     1,107
    Choice sample size                         141    32,520     5,161     1,106
    Mail-only response rate                    16%       75%       51%       58%
    Choice response rate                       15%       74%       48%       53%
    Proportion using Web in choice condition    4%       52%       17%       10%

  7. Results: Effect Sizes
  • ORs ranged from 0.57 to 1.13
    • 17 of 19 ORs less than 1.00
    • 8 of 19 ORs significantly less than 1.00
    • Only 1 OR significantly greater than 1.00
  • Overall weighted OR was 0.87 (p < 0.001)
  • Providing a concurrent Web option in mail surveys decreases the odds of response by 12.8% compared to a mail-only survey (a worked example of this interpretation follows below)
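
As a worked example of what the pooled OR implies for response rates, take the mean mail-only response rate of 51% from the table on the previous slide as a baseline (our choice, for illustration):

$$\text{odds}_{\text{mail}} = \frac{0.51}{1 - 0.51} \approx 1.04, \qquad \text{odds}_{\text{choice}} \approx 0.87 \times 1.04 \approx 0.91, \qquad p_{\text{choice}} = \frac{0.91}{1 + 0.91} \approx 0.475$$

So a 51% mail-only response rate corresponds to roughly a 47.5% mode-choice response rate, in line with the observed mean choice rate of 48%. (The 12.8% figure implies an unrounded pooled OR of about 0.872.)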

  8. Results: Forest Plot of Effect Sizes
  • The number in parentheses is the odds ratio for each comparison.
  • The dot for each comparison represents the odds ratio value, while the bar spans the 95% confidence interval.
  [Forest plot; summary estimate followed by individual comparisons, sorted by odds ratio: Total (0.87); Schneider et al. (1.13); Brady et al. (1.01); Lesser et al. (a) (0.96); Friese et al. (0.96); Turner et al. (0.93); Brogger et al. (0.93); Lesser et al. (b) (0.90); Millar and Dillman (b) (0.89); Gentry and Good (b) (0.87); Millar and Dillman (a) (0.87); Werner and Forsman (0.84); Gentry and Good (a) (0.84); Israel (0.80); Griffin et al. (0.79); Hardigan et al. (0.72); Schmuhl et al. (0.72); Ziegenfuss et al. (0.70); Smyth et al. (0.66); Radon et al. (0.57). X-axis: Odds Ratio, 0.4 to 1.4. A plotting sketch follows below.]
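
For readers who want to reproduce a plot like this, here is a minimal matplotlib sketch that draws a forest plot from the point estimates listed above. The slide reports only the odds ratios, so the confidence-interval half-widths below are uniform placeholders, not the studies' actual intervals.

```python
import matplotlib.pyplot as plt

# Point estimates from the slide, ordered so "Total" lands at the top of the plot.
comparisons = [
    ("Radon et al.", 0.57), ("Smyth et al.", 0.66), ("Ziegenfuss et al.", 0.70),
    ("Schmuhl et al.", 0.72), ("Hardigan et al.", 0.72), ("Griffin et al.", 0.79),
    ("Israel", 0.80), ("Gentry and Good (a)", 0.84), ("Werner and Forsman", 0.84),
    ("Millar and Dillman (a)", 0.87), ("Gentry and Good (b)", 0.87),
    ("Millar and Dillman (b)", 0.89), ("Lesser et al. (b)", 0.90),
    ("Brogger et al.", 0.93), ("Turner et al.", 0.93), ("Friese et al.", 0.96),
    ("Lesser et al. (a)", 0.96), ("Brady et al.", 1.01), ("Schneider et al.", 1.13),
    ("Total", 0.87),
]
labels = [name for name, _ in comparisons]
ors = [odds_ratio for _, odds_ratio in comparisons]
ci_half = [0.1] * len(ors)  # placeholder CI half-widths, for layout only

fig, ax = plt.subplots(figsize=(6, 8))
ax.errorbar(ors, range(len(ors)), xerr=ci_half, fmt="o", capsize=3)
ax.axvline(1.0, linestyle="--")  # OR = 1: no effect of the Web option
ax.set_yticks(range(len(labels)))
ax.set_yticklabels(labels)
ax.set_xlabel("Odds Ratio")
plt.tight_layout()
plt.show()
```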

  9. Results: Moderator Analyses
  • None of the moderator analyses were significant at the 0.05 level.
  [Bar chart of subgroup odds ratios (roughly 0.82 to 0.90) for three moderators: Percent Choosing Web, 15%+ (n=7) vs. <15% (n=12); Age of Target Pop., Youth (n=5) vs. All ages (n=14); Published, No (n=8) vs. Yes (n=11).]

  10. Results: Moderator Analyses
  • Surveys with a government sponsor see a smaller difference between mail-only and mode choice condition response rates (significant at the 0.10 level; a sketch of this kind of subgroup test follows below).
  [Bar chart of subgroup odds ratios (roughly 0.83 to 0.96) for four moderators: Government Sponsor, Yes (n=10) vs. No (n=9); Required, Yes (n=3) vs. No (n=16); Incentive, Yes (n=9) vs. No (n=10); Topic Salience, High (n=12) vs. Regular (n=7). * p < 0.10]
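
To illustrate how a single moderator contrast such as government sponsorship can be tested, here is a minimal sketch of a Wald-type comparison of two subgroup pooled log odds ratios. All inputs are hypothetical, and the slide does not state the exact test Comprehensive Meta-Analysis performs, so treat this as an assumption about the general approach rather than the authors' exact procedure.

```python
import math

def pooled(log_ors, variances):
    """Inverse-variance pooled log OR and its variance for one subgroup."""
    w = [1 / v for v in variances]
    est = sum(wi * lo for wi, lo in zip(w, log_ors)) / sum(w)
    return est, 1 / sum(w)

# Hypothetical subgroup data for illustration only.
gov, gov_var = pooled([math.log(0.95), math.log(0.90)], [0.010, 0.020])
oth, oth_var = pooled([math.log(0.80), math.log(0.86), math.log(0.75)],
                      [0.015, 0.020, 0.010])

# Wald z-test on the difference between the two subgroup estimates.
z = (gov - oth) / math.sqrt(gov_var + oth_var)
p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # two-sided p-value
print(f"OR(gov)={math.exp(gov):.2f}, OR(other)={math.exp(oth):.2f}, "
      f"z={z:.2f}, p={p:.3f}")
```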

  11. Discussion
  • Across 19 experimental comparisons, we find that offering a concurrent Web option in a mail survey results in a significant reduction in the response rate.
  • As demonstrated by the moderator analyses, the study characteristics we examined largely do not influence the magnitude of the effect.
  • Potentially due to: the small number of eligible studies; variation in design characteristics, sample sizes, and response rate calculations

  12. Discussion
  Three hypotheses to explain the negative effect of concurrent Web options:
  1. Making a choice between two modes
    • Increases the complexity and burden of responding
    • Weighing the pros and cons of each mode may cause both to appear less attractive
  2. Replying by Web involves a break in the response process
    • Sample members receive the survey invitation in the mail and likely open it as part of a larger task of sorting through and responding to mail
    • If they choose to complete the survey on the Web, they must transition to a different category of behavior
  3. Implementation problems with the Web instrument
    • Sample members who attempt to complete the survey online may abandon the effort due to frustration with the computerized instrument or Internet connection

  13. Discussion
  • Our findings are only generalizable to the specific type of concurrent Web option included in this meta-analysis.
  • Concurrent Web options may also be offered in mail surveys in other ways, such as:
    • Adding email or telephone contacts to the design
    • Offering incentives that are conditional on Web response
  • Further research is needed to determine whether these types of designs can be used to increase response rates and improve research quality.

  14. Thank you!
  • For additional information:
    • JFulton@survey.umd.edu
    • RMedway@survey.umd.edu

  15. Additional Slides

  16. Discussion
  • Researchers may be interested in outcomes other than response rate, such as cost, nonresponse bias, timeliness, or data quality.
  • These outcomes were reported only occasionally in the studies included in this meta-analysis; as a result, we are unable to empirically evaluate the effect of concurrent Web options on them.
