

  1. On Visual Studies I 707.031: Evaluation Methodology Winter 2015/16 Eduardo Veas

  2. Research Projects • Augmented Data • RT assistance and instructions • record/replay instructions from an expert • assist non-expert with instructions • LOD for real-time instructions • Augmented Knowledge Spaces • Use space to organize and interact with technology • Investigate novel technologies. 2

  3. The Human Vision How it works (when it does) 3

  4. Model Human Processor Source: Card et al 1983 4

  5. Perception vs. Cognition • Perception: eye, optical nerve, visual cortex • basic perception • first processing (edges, planes) • not conscious • reflexes • Cognition: recognizing objects • relations between objects • conclusion drawing • problem solving • learning 5

  6. Model Human Processor (3): Perception • encodes input in a physical representation • stored in temporary visual / auditory memory • new frames in PM activate frames in WM and possibly in LTM • unit percept: inputs arriving closer together than the perceptual cycle time Tp combine into a single percept 6

  7. Human Visual System • 6.5 million cones – dense in the center – 3 cone types (RGB) – 3 opponent color channels (bw, rg, by) • Fovea: 27 times the density – responsible for sharp central vision • 118.5 million rods – black/white 7

  8. Color Perception 8

  9. Color Reception, Composition • We have three distinct color receptors (cones) – three-dimensionality of the color space – that's why we have three primary colors – also evident in color models, e.g., RGB and CMY • Color composition – additive (e.g., RGB): light; white: all three cones stimulated with the same intensity, at high brightness – subtractive: pigment (e.g., CMYK) 9
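The additive/subtractive distinction above can be sketched in a few lines of Python. The helper names are illustrative, not from any color library:

```python
def rgb_to_cmy(r, g, b):
    """Convert additive RGB (components in 0..1) to subtractive CMY:
    each pigment absorbs the complement of its primary."""
    return (1.0 - r, 1.0 - g, 1.0 - b)

def additive_mix(*lights):
    """Additive mixing of light sources: channel intensities add, clipped to 1.0."""
    return tuple(min(1.0, sum(c[i] for c in lights)) for i in range(3))

# White light: all three primaries (all three cone types) at full intensity.
white = additive_mix((1, 0, 0), (0, 1, 0), (0, 0, 1))   # -> (1.0, 1.0, 1.0)
cyan = rgb_to_cmy(1.0, 0.0, 0.0)                        # red pigment absorbs red
```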

  10. Color Perception 3 opponent color channels (bw, rg, by). There are 6 colors arranged perceptually as opponent pairs along 3 axes (Hering 1920): L = long, M = medium, S = short wavelength receptors 10

  11. Simultaneous Brightness Contrast • The perceived brightness of an object is relative to its background 11

  12. Color • Color vision is irrelevant to much of normal vision! – does not help to perceive the layout of objects – how they are moving – what shape they are • Color breaks camouflage • Tells about material properties (e.g., judging the quality of food) 12

  13. Color Blindness • 10% of males, 1% of females (probably due to X-chromosomal recessive inheritance) • Most common: red-green weakness / blindness • Reason: lack of medium or long wavelength receptors, or altered spectral sensitivity (most common: green shift) • Example images: normal color perception, deuteranopia (no green receptors), protanopia (no red receptors) 13
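Deuteranopia can be simulated roughly as a linear projection of RGB onto a reduced color space. The sketch below uses an illustrative 3x3 matrix, not a calibrated colorimetric model; the function name is ours:

```python
# Approximate deuteranopia (missing M cones) as a linear map on RGB.
# Matrix values are an illustrative approximation, not a calibrated model.
DEUTERANOPIA = [
    [0.625, 0.375, 0.0],
    [0.700, 0.300, 0.0],
    [0.000, 0.300, 0.700],
]

def simulate_deuteranopia(rgb):
    """Project an RGB color (components in 0..1) onto the reduced space."""
    return tuple(sum(m * c for m, c in zip(row, rgb)) for row in DEUTERANOPIA)

red = simulate_deuteranopia((1.0, 0.0, 0.0))     # pure red
green = simulate_deuteranopia((0.0, 1.0, 0.0))   # pure green
# Both collapse toward similar yellowish values: the classic red-green confusion.
```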

  14. Ishihara Color Blindness Test (test plates shown on slides 14-20) 14


  21. Visual Perception A construction site 21

  22. Human Visual System • 6.5 million cones – dense in the center – 3 cone types (RGB) – 3 opponent color channels (bw, rg, by) • Fovea: 27 times the density – responsible for sharp central vision • 118.5 million rods – black/white 22

  23. Human Visual System • Vision: a sequence of fixations and saccades – fixations: maintaining gaze on a single location (200-600 ms) – saccades: moving between different locations (20-100 ms) • Vision is not like a camera – more like a dynamic, ongoing construction project 23

  24. What decides where we look? On inherited knowledge 24

  25. Visual Saliency: perceptual selection • the perceptual quality that makes some items in the world stand out from their neighbors (Itti) • bottom-up and top-down contributions 25
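Bottom-up saliency in the Itti tradition is built from center-surround differences. A minimal single-channel sketch, using only intensity, box filters instead of Gaussian pyramids, and illustrative names throughout:

```python
import numpy as np

def box_blur(img, k):
    """Crude box blur of radius k (same-size output, edge-padded)."""
    padded = np.pad(img, k, mode='edge')
    out = np.zeros_like(img, dtype=float)
    for dy in range(-k, k + 1):
        for dx in range(-k, k + 1):
            out += padded[k + dy : k + dy + img.shape[0],
                          k + dx : k + dx + img.shape[1]]
    return out / (2 * k + 1) ** 2

def intensity_saliency(img, center=1, surround=4):
    """Center-surround difference: fine scale minus coarse scale, rectified."""
    return np.abs(box_blur(img, center) - box_blur(img, surround))

# A single bright spot on a dark field stands out from its neighbors.
img = np.zeros((9, 9))
img[4, 4] = 1.0
sal = intensity_saliency(img)   # peaks around the bright spot
```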

  26. Computed Saliency vs Eye Tracked Attention 26

  27. Feature Integration Reconstructing objects 27

  28. Feature Integration Theory (stage diagram) Preattentive state: identify primitives, combine primitives (perceptual). Focused attention: perceive object, compare with memory (cognitive). Associated factors: density, clutter, noticeability, legibility, navigation, spatial interpretation. 28

  29. Feature Integration Theory II 29

  30. Preattentive Processing • Properties detected by the low-level visual system – very rapid – very accurate – processed in parallel • 200-250 milliseconds • Independent of the number of distractors! • Opposite: sequential search (processed serially) 30
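The set-size independence of preattentive "pop-out" search, versus the linear growth of serial search, can be expressed as a toy response-time model. The constants are illustrative, not measured values:

```python
def search_rt(n_distractors, preattentive, base_ms=200.0, slope_ms=40.0):
    """Toy response-time model for visual search.
    Preattentive (parallel) search is flat in the number of distractors;
    serial search grows linearly, one item at a time."""
    if preattentive:
        return base_ms                            # independent of set size
    return base_ms + slope_ms * n_distractors     # scales with set size

flat = [search_rt(n, True) for n in (1, 10, 100)]     # -> [200.0, 200.0, 200.0]
serial = [search_rt(n, False) for n in (1, 10, 100)]  # -> [240.0, 600.0, 4200.0]
```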

  31. Preattentive features Fast attractors 31

  32. Difference in Hue Examples online! 32

  33. Difference in Curvature / Shape 33

  34. Not Valid for Combinations • Conjunction targets have no unique visual property – target: a red circle – distractors share both properties (some are red, some are circles) 34

  35. Some Preattentive Properties orientation length closure size curvature density hue flicker direction of motion 35

  36. Tasks • target detection – detect the presence or absence of a target • boundary detection – detect a texture boundary between two groups of elements, where all of the elements in each group have a common visual property • region tracking – track one or more elements with a unique visual feature as they move in time and space • counting and estimation: – users count or estimate the number of elements with a unique visual feature. 36

  37. Tasks Boundary Detection Number Estimation 37

  38. Hierarchy of Preattentive Features Examples online! 38

  39. Theories of Preattentive Processing • Not known for sure how it works • Several theories: – http://www.csc.ncsu.edu/faculty/healey/PP/index.html 39

  40. EYE TRACKING 40

  41. Eye-Tracking • Measurement and study of eye movements 41

  42. Functions of eye movement • get the fovea to the interesting information (fixations) • keep image stationary in spite of movements of the object or from the head (smooth pursuit) • prevent objects from perceptually fading (refresh, microsaccades) 42
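Separating the fixations and saccades mentioned above is typically done algorithmically. A minimal dispersion-threshold (I-DT style) sketch, with illustrative threshold values:

```python
def detect_fixations(samples, max_dispersion=1.0, min_duration=100.0):
    """Dispersion-threshold (I-DT style) fixation detection sketch.

    samples: list of (t_ms, x, y) gaze points, sorted by time.
    Returns fixations as (start_ms, end_ms, centroid_x, centroid_y).
    """
    def dispersion(window):
        xs = [p[1] for p in window]
        ys = [p[2] for p in window]
        return (max(xs) - min(xs)) + (max(ys) - min(ys))

    fixations, i = [], 0
    while i < len(samples):
        j = i
        # Grow a window spanning at least the minimum fixation duration.
        while j < len(samples) and samples[j][0] - samples[i][0] < min_duration:
            j += 1
        if j >= len(samples):
            break
        if dispersion(samples[i:j + 1]) <= max_dispersion:
            # Extend the window while the points stay tightly clustered.
            while j + 1 < len(samples) and dispersion(samples[i:j + 2]) <= max_dispersion:
                j += 1
            window = samples[i:j + 1]
            xs = [p[1] for p in window]
            ys = [p[2] for p in window]
            fixations.append((window[0][0], window[-1][0],
                              sum(xs) / len(xs), sum(ys) / len(ys)))
            i = j + 1
        else:
            i += 1
    return fixations

# Example: 300 ms at (0,0), then 300 ms at (10,10), sampled every 50 ms.
samples = ([(t, 0.0, 0.0) for t in range(0, 301, 50)]
           + [(t, 10.0, 10.0) for t in range(350, 651, 50)])
fixations = detect_fixations(samples)   # two fixations, one saccade between
```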

  43. Eye Tracking as Measure of Human Behavior • aim: identify patterns in the deployment of visual resources when performing a task • combining features into perception requires focus of attention • the more complicated, confusing or interesting, the longer it takes to “form a picture” 43

  44. Tracked Eye Movements – typical mean fixation duration and mean saccade size per task • Silent reading: 225-250 ms, 2 degrees (8-9 letter spaces) • Oral reading: 275-325 ms, 1.5 degrees (6-7 letter spaces) • Scene perception: 260-330 ms, 4 degrees • Visual search: 180-275 ms, 3 degrees 44

  45. Additional measurements: Eye Tracking • Fixation: dwell time on a given area • Saccade: quick movement of the eye between two points • Scan path: directed path of saccades and fixations • How to use these measurements? 45

  46. Additional measurements: Eye Tracking Processing measures – fixation count – location of fixations – duration of fixations – cumulative fixation time – cluster analysis (attention sinks) – AOI fixations, normalized by dividing by all fixations 46
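The processing measures listed above are straightforward to compute from a fixation list. The helper below is a hypothetical sketch; the metric names follow the slide, not a specific eye-tracking toolkit:

```python
def processing_measures(fixations, aoi):
    """Basic processing measures from a fixation list.

    fixations: (start_ms, end_ms, x, y) tuples.
    aoi: (x0, y0, x1, y1) rectangle for the area of interest.
    """
    durations = [end - start for start, end, _, _ in fixations]
    in_aoi = [f for f in fixations
              if aoi[0] <= f[2] <= aoi[2] and aoi[1] <= f[3] <= aoi[3]]
    total = sum(durations)
    return {
        "fixation_count": len(fixations),
        "mean_duration_ms": total / len(fixations) if fixations else 0.0,
        "cumulative_ms": total,
        # AOI share: fixations on the area of interest over all fixations.
        "aoi_share": len(in_aoi) / len(fixations) if fixations else 0.0,
    }

m = processing_measures(
    [(0, 250, 5, 5), (300, 520, 5, 6), (600, 800, 50, 50)],
    aoi=(0, 0, 10, 10))
```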

  47. Additional measurements: Eye Tracking Search measures • Scanpath length (distance between gaze point samples, ending at target) • Spatial density (distribution of gazepoints and scanpaths) • Number of saccades 47
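Scanpath length, the first search measure above, is just the summed distance between successive gaze points; a minimal sketch:

```python
import math

def scanpath_length(gaze_points):
    """Total distance travelled along the gaze path:
    sum of point-to-point Euclidean steps between successive samples."""
    return sum(math.dist(a, b) for a, b in zip(gaze_points, gaze_points[1:]))

path = [(0, 0), (3, 4), (3, 4), (6, 8)]
length = scanpath_length(path)   # 5 + 0 + 5 = 10.0
```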

  48. Eye Tracking Experiments 48

  49. Eye Tracking Experiment: Phase I 1. Learn about eye tracking; find relevant previous ET studies 3. Research design: establish research question and analysis metrics 4. Pilot: [go to next page] and return to 1 if needed 49

  50. Eye Tracking Experiment: Phase II • Prepare your data • Test hardware • Prepare room and setup • Pilot [return to draft?] • Execute • Calibration • Training • Calibration ? • Testing • Analyze 50

  51. Eye Tracker Experiment: research questions • Noticeability: time until the first fixation on the object of interest • Attention: fixation duration, dwell time on the object of interest • Reaction: time between fixation and click 51
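These research questions map directly onto small computations over the fixation list. A hypothetical sketch (function names and AOI format are ours, not from a toolkit):

```python
def time_to_first_fixation(fixations, aoi):
    """Noticeability: onset time of the first fixation landing on the
    object of interest. fixations: (start_ms, end_ms, x, y) tuples;
    aoi: (x0, y0, x1, y1). Returns None if the AOI is never fixated."""
    for start, _, x, y in fixations:
        if aoi[0] <= x <= aoi[2] and aoi[1] <= y <= aoi[3]:
            return start
    return None

def reaction_time(first_fixation_ms, click_ms):
    """Reaction: delay between fixating the object and clicking it."""
    return click_ms - first_fixation_ms

fixations = [(0, 180, 100, 100), (220, 500, 12, 14)]
ttff = time_to_first_fixation(fixations, aoi=(10, 10, 20, 20))   # -> 220
rt = reaction_time(ttff, click_ms=640)                           # -> 420
```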

  52. Eye Tracker Experiment: preparations • Setup testing environment • Design tasks and matched analysis metrics • Design study battery: (intro, calibration, training, [calibration], test, questionnaires) 52

  53. Eye Tracker Calibration • Device • Characterize the user’s eyes • Match internal model (cornea, fovea) • User • Look and follow targets • Experimenter • keep cool and repeat when needed 53

  54. Example: DAIMsVSM 54

  55. How do we highlight objects in AR? Veas, Mendez, Feiner, Schmalstieg

  56. How do we highlight objects in AR? • Can we be more subtle? Directing attention and influencing memory with visual saliency modulation

  57. Eye Tracking: Example Directing Attention and Influencing Memory with Visual Saliency Modulation [Veas et al. 2011] • Characterize bottom-up attention: driven by exogenous cues (100-250 ms) • Measure deployment of attention before and after applying a saliency modulation technique (SMT) • Measure memory after experiencing stimuli (modulated and not) 57
