

SLIDE 1

Towards a Unified Gesture Description Language

Florian Echtler, Gudrun Klinker, Andreas Butz December 8th, 2010

SLIDE 2

2/9 Towards a Unified Gesture Description Language

Motivation

gesture-based input can mean many things

  • touchscreens ("multitouch")
  • input from multiple users
  • input with multiple fingers/hands
  • multiple conventional pointers
  • tangible interfaces
  • everyday items
  • mobile devices
  • fiducial markers
  • free-air gestures
    • assisted (Wiimote)
    • unassisted (Kinect)

Imagine porting an iPad app to a wall display with Wiimotes...

=> is there a common denominator for these interfaces?

SLIDE 3


Motivation

What is a gesture?

  • common gesture descriptions allow...
    • user customization
    • faster development
  • difficult question: what is a gesture?
  • answer within this context:

any motion(s) which the user executes to achieve a certain response

  • directly leads to the next question:

how can these motions be described?

SLIDE 4


Concepts

Abstract Description of Gestures

Three core elements: Regions, Gestures, Features

SLIDE 5


Concepts

Abstract Description of Gestures

  • Regions:
    • spatial areas defined in reference coordinates
    • extension of "traditional" WIMP-UI window objects
  • Gestures:
    • sequences of features, either ...
      • pre-defined by capability description or ...
      • customized by application
  • Features:
    • geometrical/mathematical properties of input data, e.g.:
      • motion vector
      • relative rotation
      • travelled path
    • further classification through filters and constraint values
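The three core elements above can be sketched as plain data structures. This is an illustrative model only; all class and field names here are assumptions, not the authors' actual API.

```python
from dataclasses import dataclass, field

@dataclass
class Feature:
    """One geometric/mathematical property of the input data (e.g. Motion, Rotation)."""
    name: str
    type_filter: int   # bitmask selecting which input types to match (fingers, objects, ...)
    lower: tuple       # lower constraint values
    upper: tuple       # upper constraint values

@dataclass
class Gesture:
    """A named sequence of features, optionally flagged (e.g. 'oneshot', 'default')."""
    name: str
    flags: set = field(default_factory=set)
    features: list = field(default_factory=list)

@dataclass
class Region:
    """Spatial area in reference coordinates; extends a 'traditional' WIMP window object."""
    outline: list                                  # list of (x, y) vertices
    gestures: list = field(default_factory=list)   # gestures active inside this region

# A region that listens for a two-finger swipe:
swipe = Gesture("swipe", {"oneshot"}, [
    Feature("Motion", 1, (100, 0, 0), (1000, 10, 10)),
    Feature("ObjectCount", 1, (2,), (2,)),
])
root = Region([(0, 0), (1, 0), (1, 1), (0, 1)], [swipe])
```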

SLIDE 6


Concepts

Examples of Available Features

  • Motion – average motion vector
  • Rotation – rotation around center of mass
  • Scale – scaling w.r.t. center of mass
  • Path – recognizes "shape-based" gestures
  • ObjectCount – number of objects inside region
  • ObjectDimensions – describes shape of object
  • ObjectOrientation – rotation relative to reference frame
  • ObjectPosition – absolute position of object
  • ObjectID – unique ID (e.g., fiducial marker) of an object
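As a sketch of how the first three features could be computed from two successive frames of contact points (illustrative math only, assuming points are paired across frames; not the authors' implementation):

```python
import math

def centroid(pts):
    """Center of mass of a set of (x, y) points."""
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

def motion(prev, curr):
    """Motion feature: average motion vector of all contacts between two frames."""
    n = len(curr)
    dx = sum(c[0] - p[0] for p, c in zip(prev, curr)) / n
    dy = sum(c[1] - p[1] for p, c in zip(prev, curr)) / n
    return dx, dy

def rotation(prev, curr):
    """Rotation feature: average angular change (radians) around the center of mass."""
    cp, cc = centroid(prev), centroid(curr)
    total = 0.0
    for p, c in zip(prev, curr):
        a0 = math.atan2(p[1] - cp[1], p[0] - cp[0])
        a1 = math.atan2(c[1] - cc[1], c[0] - cc[0])
        # normalize the difference into [-pi, pi) to avoid wrap-around
        total += (a1 - a0 + math.pi) % (2 * math.pi) - math.pi
    return total / len(prev)

def scale(prev, curr):
    """Scale feature: ratio of average distances to the center of mass."""
    cp, cc = centroid(prev), centroid(curr)
    d0 = sum(math.dist(p, cp) for p in prev) / len(prev)
    d1 = sum(math.dist(c, cc) for c in curr) / len(curr)
    return d1 / d0
```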

SLIDE 7


Examples

Rotation Gesture

  • simple example: "rotate" gesture (contains one Rotation feature)

    rotate default RelativeObjectRotation 255 0 6.28 0

    fields: Name, Flags, Feature, Mask, Lower limit, Upper limit, Result

  • result value generated through...
    • multi-finger rotation or
    • object rotation or
    • mouse wheel or ...
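A minimal parser for this one-line definition format might look as follows; the field layout is taken from the slide, but the function name and the dict representation are assumptions.

```python
def parse_gesture_def(line):
    """Parse a one-line gesture definition of the form
       <name> <flags> <feature> <mask> <lower> <upper> <result>."""
    name, flags, feature, mask, lower, upper, result = line.split()
    return {
        "name": name,
        "flags": flags,
        "feature": feature,
        "mask": int(mask),        # bitmask filtering input types
        "lower": float(lower),    # lower constraint limit
        "upper": float(upper),    # upper constraint limit
        "result": float(result),
    }

g = parse_gesture_def("rotate default RelativeObjectRotation 255 0 6.28 0")
# g["feature"] == "RelativeObjectRotation", g["mask"] == 255
```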
SLIDE 8


Examples

Composite Gesture

  • slightly more complex example: horizontal swipe with two fingers

    swipe oneshot
      Motion 1          // filters (bitmask, only match fingers)
        100 0 0         // lower limits
        1000 10 10      // upper limits
        0 0 0           // result (empty)
      ObjectCount 1 2 2 0

  • Result (only when constraints match): (3-vector, integer) = motion vector + object count
  • mapping is dependent on...
    • hardware capabilities
    • user preferences

=> application doesn't have to care
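The matching rule behind this example — the gesture fires only when every feature's measured value lies inside its lower/upper limits — can be sketched like this (mask and result fields omitted for brevity; all names are illustrative, not the authors' API):

```python
def feature_matches(values, lower, upper):
    """A feature matches when every component lies within its [lower, upper] range."""
    return all(lo <= v <= hi for v, lo, hi in zip(values, lower, upper))

def gesture_matches(measured, constraints):
    """A gesture triggers only when all of its feature constraints match."""
    return all(
        feature_matches(measured[name], lo, hi)
        for name, (lo, hi) in constraints.items()
    )

# Constraints from the swipe example: motion between 100 and 1000 units along x,
# at most 10 along y/z, and exactly two contacts inside the region.
swipe_constraints = {
    "Motion":      ((100, 0, 0), (1000, 10, 10)),
    "ObjectCount": ((2,), (2,)),
}

# Two fingers moving 300 units to the right:
measured = {"Motion": (300, 2, 0), "ObjectCount": (2,)}
```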

SLIDE 9


Thank you for your attention!

Questions & comments?