LMU München — Medieninformatik — Andreas Butz, Julie Wagner — Mensch-Maschine-Interaktion II — WS2014/15

Reminder: Ted's talk
• Ted Selker – "what is a human computer input sensor?"
• 2.15 pm, BU101, Öttingenstrasse 67
Mobile Technologies
• context and task
• theory
• interaction techniques
• in/output technologies
Taxonomy of Gesture Styles
• sign language
• gesticulation
  – communicative gestures made in conjunction with speech
  – know how your users gesture naturally and design artificial gestures that have no cross-talk with natural gesturing
Image: http://thomas.baudel.name/Morphologie/These/images/VI11.gif
Literature: Baudel et al., "Charade: Remote Control of Objects Using Free-Hand Gestures", Communications of the ACM, 1993
Taxonomy of Gesture Styles
• manipulative
  – gestures whose movements are tightly related to the object being manipulated
  – 2D interaction: mouse or stylus
  – 3D interaction: free-hand movement to mimic manipulations of physical objects
• deictic gestures (aimed pointing)
  – establish the identity or spatial location of an object
• semaphoric gestures (signals sent to the computer)
  – stroke gestures: involve tracing a specific path (marking menu)
  – static gestures (poses): involve no movement
  – dynamic gestures: require movement
Taxonomy of Gesture Styles
• pantomimic gestures
  – demonstrate a specific task to be performed or imitated
  – performed without the object being present
• iconic gestures
  – communicate information about objects or entities (e.g. size, shape and motion path)
  – static
  – dynamic
[Figure: data miming walkthrough – the user performs gestures describing an object (Holz et al.)]
Literature: Aigner et al., "Understanding Mid-Air Hand Gestures: A Study of Human Preferences in Usage of Gesture Types for HCI", Microsoft Research Tech Report
Literature: Holz et al., "Data Miming: Inferring Spatial Object Descriptions from Human Gesture", CHI 2011
Taxonomy of Gesture Styles
[Figure 4 from Aigner et al.: the classification used to analyze gestures in their study. Gesture types may be combined, e.g. expressing "move the round block" by forming a round static sign (an "O" with index finger and thumb, meaning "circle") or a dynamic gesture (moving the hand in circles, meaning "circle").]
Literature: Aigner et al., "Understanding Mid-Air Hand Gestures: A Study of Human Preferences in Usage of Gesture Types for HCI", Microsoft Research Tech Report
Three Gesture Phases
• registration phase
• continuation
• termination
• easy to detect for touch-sensitive surfaces
• what about freehand gestures?
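On a touch surface the three phases line up directly with the down/move/up events the hardware already delivers, which is why they are easy to detect there; for freehand gestures there is no built-in equivalent of "down" and "up", so registration and termination have to be inferred some other way. The Python sketch below illustrates the mapping for the touch case; the GesturePhaseTracker class and the event names are hypothetical, not part of any particular toolkit.

# Minimal sketch (hypothetical API): mapping touch events to gesture phases.
class GesturePhaseTracker:
    def __init__(self):
        self.active = False

    def on_touch_event(self, event_type):
        """event_type is one of 'down', 'move', 'up'."""
        if event_type == "down":               # registration: the gesture starts
            self.active = True
            return "registration"
        if event_type == "move" and self.active:
            return "continuation"              # continuation: ongoing movement
        if event_type == "up" and self.active:
            self.active = False
            return "termination"               # termination: the gesture ends
        return None

tracker = GesturePhaseTracker()
for e in ["down", "move", "move", "up"]:
    print(e, "->", tracker.on_touch_event(e))
# down -> registration, move -> continuation (twice), up -> termination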
Gestural Input vs. Keyboard+Mouse
• losing the hover state
• gesture design
  – 'natural' gestures
    • dependent on culture
  – multi-finger chords (what does that remind you of?)
• memorability, learnability
  – short-term vs. long-term retention
• gesture discoverability
• missing standards
• difficult to write, keep track of, and maintain gesture recognition code
  – detect/resolve conflicts between gestures
• and how to communicate and document a gesture?
Proton++
• declarative multitouch framework
• enables multitouch gesture description as a regular expression over touch event symbols
• generates gesture recognizers and a static analysis of gesture conflicts
• notation:
  – "*" (Kleene star) indicates that a symbol can appear zero or more consecutive times
  – "|" denotes the logical OR of attribute values
  – "•" (wildcard) specifies that an attribute can take any value
Literature: Kin, K. et al., "Proton++: A Customizable Declarative Multitouch Framework", UIST 2012
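These three operators behave just like their counterparts in ordinary regular expressions. As a rough analogy (this is not the Proton++ implementation; the single-letter encoding is invented for illustration), the Python sketch below reduces one-finger touch events to an action letter plus a hit-target letter and matches gestures with the standard re module.

import re

# Rough analogy, not Proton++ itself: one-finger touch events encoded as an
# action letter (D = down, M = move, U = up) plus a hit-target letter
# (s = star, c = circle, b = background).

# Kleene star: a tap is one down, zero or more moves, one up.
# Here "." plays the role of the "•" wildcard: any hit-target.
tap = re.compile(r"D.(M.)*U.")
print(bool(tap.fullmatch("DsMsMsUs")))   # True: down, moves, up on the star
print(bool(tap.fullmatch("DsMsMs")))     # False: the finger was never lifted

# "|" (logical OR of attribute values): down on the star OR the circle.
tap_on_shape = re.compile(r"D(s|c)(M.)*U.")
print(bool(tap_on_shape.fullmatch("DcUc")))  # True: tap on the circle
print(bool(tap_on_shape.fullmatch("DbUb")))  # False: down on the background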
Proton++ - Formal Description Language
• touch event:
  – touch action (down, move, up)
  – touch ID (1st, 2nd, etc.)
  – series of touch attribute values
    • e.g. direction = NW, hit-target = circle
Literature: Kin, K. et al., "Proton++: A Customizable Declarative Multitouch Framework", UIST 2012
Proton++
• stream generator
  – converts each touch event into a touch symbol of the form E_TID^(A1:A2:A3...), where E ∈ {D, M, U} is the touch action, TID is the touch ID, and A1:A2:A3... are the attribute values (A1 corresponds to the first attribute, A2 to the second, etc.)
  – example: M_1^(s:W) means "move with the first touch, on the star object, in west direction"
Literature: Kin, K. et al., "Proton++: A Customizable Declarative Multitouch Framework", UIST 2012
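A minimal Python sketch of what such a stream generator could look like. The TouchEvent fields and the flat string encoding (e.g. "M1:s:W" standing in for M_1^(s:W)) are simplifications made up for this example, not the actual Proton++ API.

from dataclasses import dataclass

@dataclass
class TouchEvent:
    action: str      # "D" (down), "M" (move) or "U" (up)
    touch_id: int    # 1 for the first touch, 2 for the second, ...
    hit_target: str  # first attribute, e.g. "s" = star, "c" = circle
    direction: str   # second attribute, e.g. "W", "NW", ...

def to_symbol(event: TouchEvent) -> str:
    """Encode one touch event as a flat symbol string, e.g. 'M1:s:W'
    (a plain-text stand-in for the E_TID^(A1:A2) notation above)."""
    return f"{event.action}{event.touch_id}:{event.hit_target}:{event.direction}"

# "move with the first touch, on the star object, in west direction"
print(to_symbol(TouchEvent("M", 1, "s", "W")))   # -> M1:s:W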
Proton++ Gesture
• describe a gesture as a regular expression over these touch event symbols
  – touch symbol form: E_TID^(A1:A2:A3...), where E ∈ {D, M, U} is the touch action, TID is the touch ID, and A1 corresponds to the first attribute, etc.
• consider attributes: hit-target shape, direction
Literature: Kin, K. et al., "Proton++: A Customizable Declarative Multitouch Framework", UIST 2012
Proton++ Gesture
• 1 Minute Micro Task: create the regular expression for this gesture [gesture illustrated on the slide]
• consider attributes: hit-target shape, direction
Literature: Kin, K. et al., "Proton++: A Customizable Declarative Multitouch Framework", UIST 2012
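Not the solution to the micro task (that depends on the gesture drawn on the slide), but as an illustration: a hypothetical one-finger "drag the star westwards" gesture, written over the flat symbol strings produced by the to_symbol() sketch above and matched with Python's re module. The symbol encoding, the space-separated stream, and the gesture itself are all assumptions made for this example, not Proton++ syntax.

import re

# Hypothetical gesture "drag the star westwards with one finger":
#   down on the star (any direction), zero or more moves on the star heading
#   west, up on the star (any direction). Symbols are space-separated.
drag_star_west = re.compile(r"D1:s:\w+ (M1:s:W )*U1:s:\w+")

stream = "D1:s:W M1:s:W M1:s:W U1:s:W"
print(bool(drag_star_west.fullmatch(stream)))                 # True
print(bool(drag_star_west.fullmatch("D1:c:W M1:c:W U1:c:W"))) # False: circle, not star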