
Cognition in Visual Processing 707.031: Evaluation Methodology

  1. Cognition in Visual Processing 707.031: Evaluation Methodology Winter 2015/16 Eduardo Veas

  2. Research Projects @ KTI (email Eduardo)
  • Connected world: build a connected coffee machine; build sensing and intelligence into appliances
  • Augmented Data: how can we augment the real world with data? Investigate different display devices and different visual techniques
  • Augmented Knowledge Spaces: use space to organize and interact with technology; use natural mobility to interact with augmentations

  3. Project Topics
  • Glove Study: Granit, Arbenore, Santokh, Millot
  • AR Signs Study: Eduardo, Santokh, Rene, Millot
  • Collection Study: Cecilia
  • VisRec Study: Belgin, Millot, Santokh
  • AR navigation study:

  4. Model Human Processor (Source: Card et al., 1983)

  5. Perception vs. Cognition
  Perception:
  • Eye, optic nerve, visual cortex
  • Basic perception
  • First processing (edges, planes)
  • Not conscious
  • Reflexes
  Cognition:
  • Recognizing objects
  • Relations between objects
  • Drawing conclusions
  • Problem solving
  • Learning

  6. Shannon’s Information Theory for Vis
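The slide gives only the title, so as a reminder of the underlying idea: Shannon entropy quantifies how much information (in bits) a discrete signal such as a visual channel can convey. A minimal sketch (the helper name `entropy` is my own):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries 1 bit; four equally likely categories carry 2 bits.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.25] * 4))   # 2.0
```

Read for visualization: a channel that can show four reliably distinguishable states can transmit at most two bits per mark.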

  7. Visual Hypotheses
  • Separability vs. Integrality
  • Grouping: what perceptual factors help me group or categorize visual stimuli?

  8. Gestalt Laws
  • Understanding pattern perception
  • Wertheimer, Koffka, Köhler: 1912 Gestalt School of Psychology
  • The reasons they gave were wrong, but the OBSERVATIONS were CORRECT
  • Laws: Proximity, Similarity, Connectedness, Continuity, Symmetry, Closure, Relative Size, Common Fate

  9. Gestalt Laws: The whole is greater than the sum of the parts!

  10. Proximity

  11. Proximity Example

  12. Similarity

  13. Connectedness

  14. Continuity

  15. Symmetry

  16. Closure

  17. Closure Example

  18. Relative Size (same size)

  19. Figure & Ground

  20. Common Fate

  21. Data Attributes (scales, levels) [Institut für Maschinelles Sehen und Darstellen]
  • Qualitative
    – Nominal / Categorical: categories with no ordering (gender, nationality, blood type, etc.)
    – Ordinal: ordered categories (grades, ranks, etc.)
  • Quantitative
    – Interval: distances between entities matter, but not ratios; arbitrary zero (temperature in Celsius or Fahrenheit, time, etc.)
    – Ratio: meaningful zero (weight, age, length, temperature in Kelvin, etc.)
    – Absolute: count (number of students, number of lines of code, etc.)
  Eduardo Veas, Evaluation-Validation
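The scale hierarchy above determines which comparisons are meaningful on the data. A small sketch (the `SCALES` table and `meaningful` helper are my own illustration, not from the slides):

```python
# Which comparisons each measurement scale supports: nominal data only
# admits equality tests, ordinal adds ordering, interval adds meaningful
# differences, and ratio adds meaningful ratios (thanks to a true zero).
SCALES = {
    "nominal":  {"equal"},
    "ordinal":  {"equal", "order"},
    "interval": {"equal", "order", "difference"},
    "ratio":    {"equal", "order", "difference", "ratio"},
}

def meaningful(scale, operation):
    """Return True if `operation` is meaningful for data on `scale`."""
    return operation in SCALES[scale]

# 20 °C is not "twice as warm" as 10 °C (interval scale, arbitrary zero),
# but the 10-degree difference is meaningful; 300 K really is twice 150 K.
assert meaningful("interval", "difference")
assert not meaningful("interval", "ratio")
assert meaningful("ratio", "ratio")
```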

  22. VISUAL VARIABLES: based on Bertin's Visual Variables (1967), S. Carpendale's discussion, and Munzner's categorization of marks and channels

  23. Representation
  • How data is understood depends on its representation
  • Different representations have different benefits or deficiencies
  • Example: the number thirty-four
    – Decimal: 34
    – Roman: XXXIV
    – Binary: 100010
  • The degree of accessibility varies: abstraction is required
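The three encodings of thirty-four can be reproduced directly; the `roman` helper is a minimal hypothetical sketch (valid for positive integers below 40, enough for the example):

```python
def roman(n):
    """Convert a small positive integer (< 40) to Roman numerals."""
    out = []
    for value, symbol in [(10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]:
        while n >= value:
            out.append(symbol)
            n -= value
    return "".join(out)

n = 34
print(n)               # decimal: 34
print(roman(n))        # Roman:   XXXIV
print(format(n, "b"))  # binary:  100010
```

The same value, but tasks differ in difficulty per representation: adding two decimal numbers is routine, adding two Roman numerals is not.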

  24. Visual Encoding: propose a combination of visual variables (marks and channels) to describe data dimensions

  25. Visual Variables

  26. Characteristics of Visual Variables
  • Selective: is a mark distinct from other marks?
  • Associative: marks with associative visual variables can be perceived as a group.
  • Quantitative: the relationship between two marks can be read as numerical.
  • Order: a change in an ordered visual variable is perceived as more or less.
  • Length: the number of distinguishable steps that can be used (often limited by perception, i.e. just-noticeable differences, JND).

  27. Position

  28. Size

  29. Shape

  30. Select a shape?

  31. Value

  32. Select Value!

  33. Color

  34. Select Color!

  35. Orientation

  36. Effectiveness of visual variables, in decreasing order (Mackinlay 1986) [Schumann 2000, after Mackinlay 1986]
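Mackinlay's ranking for quantitative data, as commonly cited, can be written down as an ordered list; the list contents follow the usual reading of Mackinlay (1986), while the helper is my own sketch:

```python
# Perceptual channels ranked by accuracy for encoding quantitative data,
# most to least effective (after Mackinlay 1986).
QUANTITATIVE_RANKING = [
    "position", "length", "angle", "slope", "area",
    "volume", "density", "color saturation", "color hue",
]

def more_effective(a, b):
    """True if channel `a` encodes quantitative data more accurately than `b`."""
    return QUANTITATIVE_RANKING.index(a) < QUANTITATIVE_RANKING.index(b)

assert more_effective("position", "area")       # prefer position over area
assert not more_effective("color hue", "length")
```

Note that Mackinlay gives different rankings for ordinal and nominal data; only the quantitative ordering is sketched here.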

  37. Perception: Depth Perception

  38. Today's Agenda
  • Visual Variables: data scales, data representation, Bertin's visual variables
  • Depth Perception: oculomotor cues, pictorial cues

  39. Visual Perception Theories
  Gregory (1970):
  • Top-down
  • Perception is constructive
  • Perceptual hypotheses
  • 90% of sensory information is lost in the visual pathway
  Gibson (1966):
  • Bottom-up
  • Direct perception
  • Sensory information analyzed in a pipeline
  • No need for previous knowledge

  40. Bottom-up Perception, Components I
  • Light and environment: optic flow

  41. Bottom-up Perception, Components II
  • Invariants and texture

  42. Bottom-up Perception, Components III
  • Optical array
  • Relative brightness
  • Texture gradient
  • Affordances
  • Relative size
  • Superimposition
  • Height in the visual field

  43. Depth Perception: Cues
  • Oculomotor
    – Convergence
    – Accommodation

  44. Depth Perception: Cues
  • Binocular
    – Stereopsis / binocular disparity

  45. Depth Perception: Cues
  • Motion-based (video)
    – Motion parallax
    – Kinetic depth effect

  46. Structure from Motion
  • Kinetic depth effect
  • The assumption of rigidity allows us to infer shape as objects move or rotate

  47. PICTORIAL DEPTH CUES

  48. Perspective Cues
  • Parallel lines converge
  • Distant objects appear smaller
  • Textured elements become smaller with distance
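The cues above all follow from the pinhole projection model: projected size is inversely proportional to distance. A minimal sketch (function name and focal length are my own choices):

```python
# Pinhole projection: an object of height h at distance d, viewed with
# focal length f, projects to an image height of f * h / d.
def projected_height(h, d, f=1.0):
    return f * h / d

near = projected_height(h=2.0, d=5.0)   # the same 2 m object, 5 m away
far = projected_height(h=2.0, d=20.0)   # ... and 20 m away
assert far < near  # distant objects appear smaller
```

The same relation explains why evenly spaced texture elements shrink with distance and why parallel lines appear to converge toward a vanishing point.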

  49. Perspective and vanishing points

  50. Occlusion
  • The strongest depth cue
  • Under stereopsis, occlusion is different for each eye

  51. Depth of Focus
  • Strong depth cue
  • Must be coupled with user input (e.g. point of fixation)

  52. Shadows
  • Important cue for the height of an object above a plane
  • An indirect depth cue

  53. Atmospheric Depth
  • Reduction in contrast of distant objects
  • Exaggerated in 3D displays using what is called proximity luminance covariance

  54. Depth Perception: Pictorial Cues

  55. Depth Perception: Cues Summary

  56. Illustration Techniques: emphasizing height, width, or volume

  57. Visualization Tasks
  • Judging 3D surfaces
  • Relative position in 3D space
  • Finding 3D patterns of points
  • Judging relative movement of particles
  • Judging self-movement
  • Feeling a “sense of presence”

  58. Depth Perception: Volume Rendering

  59. Indoor vs. Outdoor Depth Perception for Mobile Augmented Reality (Livingston, Ai, Swan, Smallman)

  60. Indoor vs. Outdoor Depth Perception for Mobile Augmented Reality: Goals
  • Test depth perception indoors and outdoors
  • Test AR-based linear perspective

  61. Indoor vs. Outdoor Depth Perception for Mobile Augmented Reality: Task
  • Eight real referents identified by color
  • A colored target appeared at a random depth
  • The participant moved the target until it matched the corresponding referent

  62. Indoor vs. Outdoor Depth Perception for Mobile Augmented Reality
  Independent Variables:
  • Environment {indoor, outdoor}
  • Tramlines {on, off}
  • Grid points {on, off}
  • Distance {3.83, 9.66…}
  • Repetition {1, 2, 3, 4, 5}
  Dependent Variables:
  • Distance off
  • Time to complete
  • NASA TLX
  • Subjective questions
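The factorial design above is the cross product of the independent variables. A sketch of enumerating the conditions; note the slide elides further distance levels ("9.66…"), so only the two listed values are used here:

```python
from itertools import product

# Independent variables from the study design (distance levels beyond
# the two listed on the slide are elided and omitted here).
environment = ["indoor", "outdoor"]
tramlines = ["on", "off"]
grid_points = ["on", "off"]
distance = [3.83, 9.66]
repetition = [1, 2, 3, 4, 5]

# Every combination of factor levels is one experimental condition.
conditions = list(product(environment, tramlines, grid_points,
                          distance, repetition))
print(len(conditions))  # 2 * 2 * 2 * 2 * 5 = 80
```

Enumerating the design this way makes it easy to check the trial count and to randomize condition order per participant.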

  63. Indoor vs. Outdoor Depth Perception for Mobile Augmented Reality: Hypotheses
  • The outdoor environment results in greater error
  • Tramlines and grid points increase precision
  • Errors increase and precision decreases with distance

  64. Indoor vs. Outdoor Depth Perception for Mobile Augmented Reality. Results: effect of environment on normalized error.

  65. Indoor vs. Outdoor Depth Perception for Mobile Augmented Reality. Results: effect of repetition on normalized error.

  66. Indoor vs. Outdoor Depth Perception for Mobile Augmented Reality
  • No effect of Grid or Tramlines
  • No difference in NASA TLX measures

  67. SITUATION AWARENESS AND MENTAL LOAD

  68. Situation Awareness
  • Level 1: perception of elements in the environment
  • Level 2: comprehension of the current situation
  • Level 3: projection of future status
