Using Eye-Tracking to Evaluate New American Community Survey Mail Material Design Strategies
Alfred "Dave" Tuttle, Jon Schreiner, Elizabeth Nichols, Erica Olmsted-Hawala, and Rodney Terry
U.S. Census Bureau
American Association for Public Opinion Research Conference, May 17, 2019
Any views expressed are those of the authors and not necessarily those of the U.S. Census Bureau.
CBDRB-FY19-CED001-B0006
Motivation
• Evaluation of new letters prior to use in a survey experiment
• How do the visual design strategies affect reading behaviors?
• Are the new designs counterproductive?
• Eye-tracking used to try to answer these questions
Eye-tracking procedures
• Four letters
• Each letter shown individually
• "Please look at it as you would at home"
• Four treatments, with random assignment to vary the order of presentation
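The order-randomization step above could be implemented roughly as follows. This is a minimal sketch, not the authors' actual procedure: the slide does not say how the four order treatments were constructed, so the Latin-square rotation and the letter labels are assumptions.

```python
import random

def latin_square_orders(items):
    # One presentation order per treatment, built by rotation so that
    # each letter appears in each serial position exactly once.
    return [items[i:] + items[:i] for i in range(len(items))]

def assign_treatment(orders, rng=random):
    # Randomly assign a participant to one of the order treatments.
    return rng.choice(orders)

letters = ["Letter 1", "Letter 2", "Letter 3", "Letter 4"]  # hypothetical labels
orders = latin_square_orders(letters)
participant_order = assign_treatment(orders)
```

With four letters this yields four treatments, matching the design described on the slide.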
Sample
• n = 9
• Recruiting database
• Purposive sample
• Higher vs. lower education
  – Bachelor's and beyond
  – Less than bachelor's
• Demographic variety: race/ethnicity, age, gender
• ~30 minutes; $40 incentive
Test materials
Research questions
• How do visual elements (color, images, typographic cues) affect reading behavior?
  – Do they draw attention to important information?
  – Do they distract respondents from seeing important information?
• How do people read survey letters?
  – Systematic, skimming, non-linear?
  – Pre-attentive processing
Results
Eye-tracking output
• Videos and static images
• Gaze plots and heat maps
• Individual respondents
• Multiple respondents
• Not covering:
  – Quantitative metrics
  – Recall test results
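Heat maps like those mentioned above are typically built by accumulating fixation durations over screen coordinates. A minimal sketch follows; this is not the authors' actual pipeline, and the function name, input format, and bin counts are illustrative assumptions.

```python
import numpy as np

def fixation_heatmap(fixations, width, height, bins=(25, 40)):
    # Accumulate (x, y, duration) fixations into a duration-weighted
    # 2D histogram; rows index vertical position, columns horizontal.
    xs, ys, durations = zip(*fixations)
    heat, _, _ = np.histogram2d(ys, xs, bins=bins,
                                range=[[0, height], [0, width]],
                                weights=durations)
    return heat
```

The resulting grid can be rendered as a color overlay on the letter image; smoothing (e.g., a Gaussian blur) is usually applied before display.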
Typical reading patterns (2X speed)
Atypical (?) reading pattern (2X speed)
All respondents (n = 9), normal speed
Pre-attentive processing
Systematic reading
• "Do I have to respond?"
• "…your response is required by law…"
Systematic reading
• "…respond now…"
Takeaways
• People are noticing important information:
  – Census Bureau, American Community Survey
  – Call to action: Respond now, URL
  – Response is required by law
• Visual cues were effective at drawing attention:
  – Color
  – Bold print
  – FAQ/list format, sidebar
  – Simpler Census Bureau letterhead
  – During pre-attentive processing and during systematic reading
• Icons, images? More research needed
• Prevalence of systematic reading
  – Do respondents "act natural" in a lab?
  – Letters are a familiar format
Next steps
• Additional interviews (total of 20)
• Quantitative analysis of eye-tracking data
• Cognitive testing of mail packages
  – Clarity of language, impressions
  – How Rs open and process survey materials
• Survey experiment
Comments, questions?
Suggestions for facilitating more "normal" survey-letter processing behaviors in a lab setting?
Dave Tuttle
alfred.d.tuttle@census.gov