On Visual Studies I 707.031: Evaluation Methodology Winter 2015/16 Eduardo Veas
Research Projects
• Augmented Data
– RT assistance and instructions
– record/replay instructions from an expert
– assist non-experts with instructions
– LOD for real-time instructions
• Augmented Knowledge Spaces
– use space to organize and interact with technology
– investigate novel technologies
The Human Vision: how it works (when it does)
Model Human Processor (Source: Card et al., 1983)
Perception vs. Cognition
Perception:
• eye, optical nerve, visual cortex
• basic perception
• first processing (edges, planes)
• not conscious
• reflexes
Cognition:
• recognizing objects
• relations between objects
• drawing conclusions
• problem solving
• learning
Model Human Processor (3): Perception
• encodes input in a physical representation
• stored in temporary visual / auditory memory
• new frames in perceptual memory (PM) activate frames in working memory (WM) and possibly in long-term memory (LTM)
• Unit percept: inputs arriving faster than the perceptual cycle time Tp combine into a single percept
Human Visual System
• ~6.5 million cones
– dense in the center
– 3 cone types (RGB)
– 3 opponent color channels (bw, rg, by)
• Fovea: 27 times the receptor density
– responsible for sharp central vision
• ~118.5 million rods
– black/white
Color Perception
Color Reception, Composition
• We have three distinct color receptors (cones)
– hence the three-dimensionality of the color space
– that's why we have three primary colors
– also evident in color models, e.g., RGB and CMY
• Color composition
– additive (e.g., RGB): light; white results when all three cone types are stimulated with the same intensity at high brightness
– subtractive (e.g., CMYK): pigment
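The additive/subtractive distinction can be sketched in a few lines of code. This is a minimal illustration, not a colorimetric model: lights add channel-wise (clipping at white), while pigments act as filters that multiply the reflected fractions of light. The channel values in [0, 1] and the function names are assumptions for this sketch.

```python
def mix_additive(a, b):
    # Lights add: stimulating all three cone types equally gives white.
    return tuple(min(1.0, x + y) for x, y in zip(a, b))

def mix_subtractive(a, b):
    # Pigments filter: each surface reflects only part of the light.
    return tuple(x * y for x, y in zip(a, b))

RED, GREEN = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)
CYAN, YELLOW = (0.0, 1.0, 1.0), (1.0, 1.0, 0.0)

print(mix_additive(RED, GREEN))       # red + green light -> yellow (1, 1, 0)
print(mix_subtractive(CYAN, YELLOW))  # cyan + yellow pigment -> green (0, 1, 0)
```

Mixing red and green light yields yellow, while mixing cyan and yellow pigment yields green: the same primaries behave differently depending on whether light is added or absorbed.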
Color Perception
• 3 opponent color channels (bw, rg, by)
• 6 colors arranged perceptually as opponent pairs along 3 axes (Hering 1920)
• L = long, M = medium, S = short wavelength receptors
Simultaneous Brightness Contrast
• The perceived brightness of an object is relative to its background
Color
• Color vision is irrelevant to much of normal vision!
– does not help to perceive the layout of objects
– how they are moving
– what shape they are
• Color breaks camouflage
• Tells us about material properties (e.g., judging the quality of food)
Color Blindness
• 10% of males, 1% of females (probably due to X-chromosomal recessive inheritance)
• Most common: red-green weakness / blindness
• Reason: lack of medium or long wavelength receptors, or altered spectral sensitivity (most common: green shift)
[Figure: normal color perception vs. deuteranopia (no green receptors) vs. protanopia (no red receptors)]
Ishihara Color Blindness Test
Visual Perception: a construction site
Human Visual System
• Vision: a sequence of fixations and saccades
– fixations: maintaining gaze on a single location (200-600 ms)
– saccades: moving between different locations (20-100 ms)
• Vision is not like a camera
– more like a dynamic, ongoing construction project
What decides where we look? On inherited knowledge.
Visual Saliency: perceptual selection
• the perceptual quality that makes some items in the world stand out from their neighbors (Itti)
• bottom-up and top-down contributions
Computed Saliency vs. Eye-Tracked Attention
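Bottom-up saliency can be computed, and then compared against eye-tracked attention. The following is a minimal sketch of the center-surround idea behind Itti-style saliency maps, not the full model: crude box blurs at a fine and a coarse scale stand in for the Gaussian pyramid, and their absolute difference highlights regions that differ from their surroundings. All sizes and scales are illustrative assumptions.

```python
import numpy as np

def box_blur(img, k):
    """Crude box blur with kernel size 2k+1, edge-padded (no SciPy needed)."""
    padded = np.pad(img, k, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(-k, k + 1):
        for dx in range(-k, k + 1):
            out += padded[k + dy : k + dy + img.shape[0],
                          k + dx : k + dx + img.shape[1]]
    return out / (2 * k + 1) ** 2

def center_surround(img, center=1, surround=4):
    """Bottom-up contrast map: |fine scale - coarse scale|."""
    return np.abs(box_blur(img, center) - box_blur(img, surround))

# A dark field with one bright blob: the blob dominates the saliency map.
img = np.zeros((32, 32))
img[14:18, 14:18] = 1.0
sal = center_surround(img)
peak = np.unravel_index(np.argmax(sal), sal.shape)
print(peak)  # the peak lies inside the bright blob
```

In a real evaluation, such a computed map would be correlated with fixation density from an eye tracker to see how much of attention is explained bottom-up.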
Feature Integration: reconstructing objects
Feature Integration Theory
[Diagram: a preattentive stage identifies and combines primitives in parallel; focused attention then perceives and compares objects against memory. Perceptual factors: density, clutter, noticeability, legibility. Cognitive factors: navigation, spatial interpretation.]
Feature Integration Theory II
Preattentive Processing
• Properties detected by the low-level visual system
– very rapid (within 200-250 milliseconds)
– very accurate
– processed in parallel
• Independent of the number of distractors!
• Opposite: sequential search (processed serially)
Preattentive Features: fast attractors
Difference in Hue (examples online)
Difference in Curvature / Shape
Not Valid for Combinations
• Conjunction targets have no unique visual property
– target: a red circle
– the distractors share the target's properties (some are red, some are circles)
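The set-size behavior that separates pop-out from conjunction search can be sketched as a toy reaction-time model: a unique feature is found in roughly constant time, while a conjunction target forces item-by-item inspection, so reaction time grows with the number of distractors. The 200 ms base and 40 ms per-item slope are illustrative assumptions, not measured values.

```python
# Toy model contrasting parallel (preattentive) and serial search.

def rt_preattentive(n_distractors, base=0.2):
    # A unique feature "pops out": RT does not grow with set size.
    return base

def rt_serial(n_distractors, base=0.2, per_item=0.04):
    # Conjunction targets are inspected one by one: RT grows linearly.
    return base + per_item * n_distractors

for n in (1, 10, 50):
    print(n, rt_preattentive(n), round(rt_serial(n), 2))
```

Plotting search time against set size in this way (flat vs. linear slope) is the classic diagnostic for whether a display property is processed preattentively.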
Some Preattentive Properties
orientation, length, closure, size, curvature, density, hue, flicker, direction of motion
Tasks
• target detection
– detect the presence or absence of a target
• boundary detection
– detect a texture boundary between two groups of elements, where all elements in each group share a common visual property
• region tracking
– track one or more elements with a unique visual feature as they move in time and space
• counting and estimation
– count or estimate the number of elements with a unique visual feature
Tasks: boundary detection, number estimation
Hierarchy of Preattentive Features (examples online)
Theories of Preattentive Processing
• It is not known for certain how it works
• Several theories:
– http://www.csc.ncsu.edu/faculty/healey/PP/index.html
EYE TRACKING
Eye Tracking
• Measurement and study of eye movements
Functions of eye movement
• get the fovea to the interesting information (fixations)
• keep the image stationary despite movements of the object or of the head (smooth pursuit)
• prevent objects from perceptually fading (refresh, microsaccades)
Eye Tracking as a Measure of Human Behavior
• aim: identify patterns in the deployment of visual resources when performing a task
• combining features into a percept requires focused attention
• the more complicated, confusing, or interesting the stimulus, the longer it takes to "form a picture"
Tracked Eye Movements

Task               Mean fixation duration (ms)   Mean saccade size (degrees)
Silent reading     225-250                       2 (8-9 letter spaces)
Oral reading       275-325                       1.5 (6-7 letter spaces)
Scene perception   260-330                       4
Visual search      180-275                       3
Additional measurements: Eye Tracking
• Fixation: dwell time on a given area
• Saccade: quick movement of the eye between two points
• Scan path: directed path of saccades and fixations
• How to use these measurements?
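Raw eye trackers deliver gaze samples, not fixations and saccades, so the definitions above must be applied by a classifier. Below is a hedged sketch of a velocity-threshold approach (often called I-VT): samples moving slower than a threshold are labeled as fixation samples, the rest as saccade samples. The (t_seconds, x_deg, y_deg) sample format and the 30 deg/s threshold are illustrative assumptions.

```python
import math

def classify_ivt(samples, vmax=30.0):
    """Label each inter-sample interval as 'fixation' or 'saccade'."""
    labels = []
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        velocity = math.hypot(x1 - x0, y1 - y0) / (t1 - t0)  # deg/s
        labels.append("fixation" if velocity < vmax else "saccade")
    return labels

samples = [(0.00, 0.0, 0.0),
           (0.01, 0.1, 0.0),   # slow drift within a fixation
           (0.02, 3.0, 0.0),   # rapid jump: saccade onset
           (0.03, 6.0, 0.0),   # still moving fast
           (0.04, 6.1, 0.0)]   # settled on a new location
print(classify_ivt(samples))   # ['fixation', 'saccade', 'saccade', 'fixation']
```

Consecutive fixation-labeled samples would then be merged into fixation events, whose counts and durations feed the processing measures that follow.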
Additional measurements: Eye Tracking
Processing measures
– Fixation count
– Location of fixations
– Duration of fixations
– Cumulative fixation time
– Cluster analysis (attention sinks)
– Fixations per AOI, normalized by the total number of fixations
Additional measurements: Eye Tracking
Search measures
• Scanpath length (summed distance between gaze point samples, ending at the target)
• Spatial density (distribution of gaze points and scanpaths)
• Number of saccades
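Two of the search measures above are straightforward to compute from a list of gaze points. This sketch assumes (x, y) pixel coordinates and measures spatial density as the fraction of cells in a coarse grid that received at least one gaze point; the grid resolution is an illustrative choice.

```python
import math

def scanpath_length(points):
    """Summed Euclidean distance between consecutive gaze points."""
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))

def spatial_density(points, width, height, cells=4):
    """Fraction of grid cells covered by at least one gaze point."""
    visited = {(int(x * cells / width), int(y * cells / height))
               for x, y in points}
    return len(visited) / (cells * cells)

gaze = [(0, 0), (30, 40), (30, 40), (90, 120)]
print(scanpath_length(gaze))            # 50 + 0 + 100 = 150.0
print(spatial_density(gaze, 100, 160))  # 3 of 16 cells -> 0.1875
```

A long scanpath with low spatial density suggests repeated revisits of the same regions, whereas high density indicates widely distributed search.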
Eye Tracking Experiments
Eye Tracking Experiment: Phase I
1. Learn about eye tracking; find relevant previous ET studies
2. Research design: establish research question and analysis metrics
3. Pilot: run Phase II, and return to step 1 if needed
Eye Tracking Experiment: Phase II
• Prepare your data
• Test hardware
• Prepare room and setup
• Pilot (return to the design phase if needed)
• Execute
– Calibration
– Training
– Calibration (repeat if needed)
– Testing
• Analyze
Eye Tracker Experiment: research questions
• Noticeability: time until the first fixation on the object of interest
• Attention: fixation duration, dwell time on the object of interest
• Reaction: time between fixation and click
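The three research-question metrics can be computed directly from fixation events and an area of interest (AOI). The following is a minimal sketch: fixations are (t_start, t_end, x, y) tuples with times in milliseconds, the AOI is an axis-aligned rectangle, and all data values are made up for illustration.

```python
def in_aoi(x, y, aoi):
    x0, y0, x1, y1 = aoi
    return x0 <= x <= x1 and y0 <= y <= y1

def time_to_first_fixation(fixations, aoi):     # noticeability
    return next((t0 for t0, t1, x, y in fixations if in_aoi(x, y, aoi)), None)

def dwell_time(fixations, aoi):                 # attention
    return sum(t1 - t0 for t0, t1, x, y in fixations if in_aoi(x, y, aoi))

def reaction_time(fixations, aoi, click_time):  # reaction
    first = time_to_first_fixation(fixations, aoi)
    return None if first is None else click_time - first

aoi = (100, 100, 200, 200)      # x0, y0, x1, y1 in pixels
fix = [(0, 300, 50, 50),        # outside the AOI
       (400, 900, 150, 150),    # first fixation inside the AOI
       (1000, 1400, 160, 140)]  # revisit inside the AOI
print(time_to_first_fixation(fix, aoi))  # 400 ms
print(dwell_time(fix, aoi))              # 500 + 400 = 900 ms
print(reaction_time(fix, aoi, 1600))     # 1600 - 400 = 1200 ms
```

Returning None when the AOI is never fixated keeps "not noticed" trials distinguishable from fast ones in the analysis.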
Eye Tracker Experiment: preparations
• Set up the testing environment
• Design tasks and matched analysis metrics
• Design the study battery: intro, calibration, training, [calibration], test, questionnaires
Eye Tracker Calibration
• Device: characterize the user's eyes; match the internal model (cornea, fovea)
• User: look at and follow targets
• Experimenter: keep cool and repeat when needed
Example: Directing Attention and Influencing Memory with Visual Saliency Modulation
How do we highlight objects in AR? Can we be more subtle?
Directing Attention and Influencing Memory with Visual Saliency Modulation (Veas, Mendez, Feiner, Schmalstieg)
Eye Tracking Example: Directing Attention and Influencing Memory with Visual Saliency Modulation [Veas et al. 2011]
• Characterize bottom-up attention, driven by exogenous cues (100-250 ms)
• Measure the deployment of attention before and after applying a saliency modulation technique (SMT)
• Measure memory after experiencing the stimuli (modulated and unmodulated)