Auditory User Interface Design
Human-Computer Interaction
Myounghoon "Philart" Jeon
Mind Music Machine Lab, Center of Cyber-Human Systems
Cognitive Science, Computer Science
CS 1000 – October 13, 2015
Philart's Personal…
Background & Teaching

Educational Background
• PhD Engineering Psychology (HCI), Georgia Institute of Technology (2012)
• M.S. Engineering Psychology, Georgia Institute of Technology (2010)
• M.S. Cognitive Science, Yonsei University in Korea (2004)
• B.A. Sociology, Yonsei University in Korea (2000)
• B.A. Psychology, Yonsei University in Korea (2000)
• Film Scoring Expert Institute, Yonsei University in Korea (2007)

Teaching
• Human-Computer Interaction / HCD
• Affective Design and Computing
• Human Factors
• Human Factors II: Multimodal Design & Measure Studio

Experience wrt HCI
1. HCI Researcher @ Daum Comm.; UX/UI Designer & Sound Designer @ LG Elec.
2. Co-work with SS, H/K Motors, Toyota, GE, Panasonic, etc.
3. Best Papers (HFES, HCII), Ergonomic Design Award, iF Comm. Design Award
4. HFES, CHI, HCII, MobileHCI, ASSETS, CSUN, ICAD, AutomotiveUI, UbiComp, etc.
What type of products?
AUI, GUI, LUI
Academic Origin: Cognitive Sciences (Cognitive Engineering)
In fact, Affective Sciences
The tri-M Lab: Mind, Music, Machine
The tri-M Lab
6 + 2 Graduates (Human Factors + Computer Science)
8 Undergraduates (CS, CE, Psy, Sound Design, ME)
Center of Cyber-Human Systems, Institute of Computing and Cybersystems
Human-Centered Design: Designing systems of the users, by the users, and for the users.
We are interested in People, Art, Design, Technology, & eXperiences
• Auditory Displays & Sonification
• Affective Computing
• Augmented & Virtual Reality
• Assistive Technology
• Automotive UI
Human-Centered Computing
The tri-M Lab
Google "mind music machine lab" or email philart@gmail.com or mjeon@mtu.edu
Mind Music Machine
Sonification in VR
Goal: Expand artists' emotional expressions and aesthetic dimensions using visualization and sonification in the immersive virtual environment
System Configuration
• Vicon Tracker
  – 12 infrared cameras
  – 120 Hz
  – Sub-millimeter precision
• Display Wall
  – 24 × 42'' monitors
• OpenGL (C++)
• JFugue library for audio output
• ISML – GUI for customizing sonification parameters
System Configuration
Fig. 1. The Vicon tracker sends the signal to (1) the visualizer (head node), which distributes it to 8 tail nodes, each of which is connected to 3 multivisions; and (2) the sonifier via the scripting language.
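The slides name the JFugue library for audio output; the snippet below is only a minimal, hypothetical Java sketch (assuming JFugue 5's Staccato API) of how a tracked marker's height could be quantized to a pitch and played as a short phrase. The class name, scale, and normalization are illustrative assumptions, not the lab's ISML scripting layer.

```java
import org.jfugue.player.Player;  // JFugue 5 Staccato player

/** Hypothetical sketch: map a tracked marker's normalized height (0.0-1.0) to a pitch and play it. */
public class MotionSonifier {
    private static final String[] SCALE = {"C5", "D5", "E5", "G5", "A5", "C6"}; // illustrative scale

    /** Quantize a normalized height into a scale degree and return a Staccato quarter-note token. */
    static String heightToNote(double normalizedHeight) {
        int idx = (int) Math.min(SCALE.length - 1, Math.max(0, normalizedHeight * SCALE.length));
        return SCALE[idx] + "q";
    }

    public static void main(String[] args) {
        Player player = new Player();
        // Stand-in values for heights sampled from the motion-tracking stream.
        double[] sampledHeights = {0.1, 0.35, 0.6, 0.85, 0.6, 0.35};
        StringBuilder phrase = new StringBuilder("T120 I[Piano] "); // tempo and instrument tokens
        for (double h : sampledHeights) {
            phrase.append(heightToNote(h)).append(' ');
        }
        player.play(phrase.toString()); // blocking playback of the generated phrase
    }
}
```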
Interactive Map
Virtual Instrument
Tony Orrico…
Based in Chicago, creates large geometric pieces, "Penwald Drawings"
Embodied Penwald Drawings
"Orrico laid face down on a piece of paper holding graphite pencils in both hands. He pushed off a wall, jetting himself forward on top of the piece. He dragged his graphite pencils along with him; as he writhed his way back to the starting position over and over again, he left behind himself a pictorial history of his motion."
"He knelt on a large sheet of paper, striking it with graphite as he swung his arms in a pendular motion, and slowly revolved atop the mat."
Multiple Layers of Outcomes
The outcomes of our collaboration and Tony's works were displayed in the Finnish American Heritage Center in Hancock, MI.
Research in Progress
Creativity & Intentionality
Automotive User Interfaces & ITS
01. Warning Design
02. Social Car
03. Emotional Driving
Goal: Taking drivers' emotions and affect into account, improve road safety by estimating a driver's affective states and intervening with dynamic technologies
Driving Simulators in tri-M Lab
Results from 8 Experiments
Facial Expression Detection Systems
Our first system uses the Support Vector Machines (SVMs) algorithm, which could detect positive, negative, and neutral affective states. Our second system uses the Viola-Jones object detection framework, which could detect more specific affective states, including anger, happiness, and surprise.
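The slide names SVM classification and the Viola-Jones framework; as a rough sketch only (not the lab's actual system), the Java snippet below uses OpenCV 3's Java bindings to run a Viola-Jones Haar cascade that locates the face region an expression classifier such as an SVM would then label. The cascade file, image file, and class name are assumptions.

```java
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.MatOfRect;
import org.opencv.core.Rect;
import org.opencv.imgcodecs.Imgcodecs;
import org.opencv.imgproc.Imgproc;
import org.opencv.objdetect.CascadeClassifier;

/** Hypothetical sketch: Viola-Jones face detection as the front end of an expression classifier. */
public class FaceRegionDetector {
    public static void main(String[] args) {
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME); // load the OpenCV native library

        // Haar cascade trained with the Viola-Jones framework (file path is an assumption).
        CascadeClassifier faceCascade =
                new CascadeClassifier("haarcascade_frontalface_default.xml");

        Mat frame = Imgcodecs.imread("driver_frame.png"); // e.g., one frame from a driver-facing camera
        Mat gray = new Mat();
        Imgproc.cvtColor(frame, gray, Imgproc.COLOR_BGR2GRAY);
        Imgproc.equalizeHist(gray, gray); // normalize lighting before detection

        MatOfRect faces = new MatOfRect();
        faceCascade.detectMultiScale(gray, faces); // run the cascaded detector

        for (Rect face : faces.toArray()) {
            Mat faceRoi = gray.submat(face);
            // A classifier (e.g., an SVM) would take features from faceRoi and
            // output an affective label such as positive / negative / neutral.
            System.out.printf("Face ROI %dx%d at (%d, %d)%n",
                    faceRoi.cols(), faceRoi.rows(), face.x, face.y);
        }
    }
}
```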
Research in Progress
Table 2. Mapping variables for observation states and sonification parameters

Observation States
• Affective States (AS): FacialExpression (s_FEX), FacialEMG (s_FEMG), EyeMovementPattern (s_EMP), HeartRate (s_HR), Respiration (s_RE), SkinConductance (s_SC), BrainWaves (s_EEG)
• Driving Behaviors (DB): LaneDeviation (s_LD), SteeringWheelAngle (s_SWA), Speed (s_SP), PedalForce (s_PF), Collision (s_CO)

Sonification Parameters (SP)
• Musical Parameters (MP): Genre (c_GE), Key (c_KEY), Tempo (c_TE)
• Human Factors (HF): Familiarity (c_FA), Preference (c_PR), Expectation (c_EX)
• System Factors (SF): Timing (c_TI), Duration (c_DU), Regularity (c_RE), Interference (c_IN)

ObservationStates = AS(s_FEX, s_FEMG, s_EMP, s_HR, s_RE, s_SC, s_EEG) × DB(s_LD, s_SWA, s_SP, s_PF, s_CO)
SonificationParameters = MP(c_GE, c_KEY, c_TE) × HF(c_FA, c_PR, c_EX) × SF(c_TI, c_DU, c_RE, c_IN)
SonificationOutputs = f(ObservationStates × SonificationParameters)

• Intermittent sonification based on driver affective states and behaviors
• Continuous sonification using multistream soundscapes
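As a minimal sketch of how the mapping SonificationOutputs = f(ObservationStates × SonificationParameters) might look in code, the hypothetical Java snippet below maps a few observation states to tempo, key, and an intermittent-cue flag. The thresholds, field subset, and class names are illustrative assumptions, not the published mapping.

```java
/** Hypothetical sketch of mapping driver observation states to sonification parameters. */
public class AffectiveSonificationMapper {

    /** A small subset of the observation states from Table 2. */
    static class ObservationStates {
        double heartRate;       // s_HR, beats per minute
        double laneDeviation;   // s_LD, meters from lane center
        double skinConductance; // s_SC, microsiemens
    }

    /** A small subset of the sonification parameters from Table 2. */
    static class SonificationParameters {
        int tempoBpm;            // c_TE
        String key;              // c_KEY
        boolean intermittentCue; // whether to trigger an intermittent sonification event
    }

    /** f: ObservationStates -> SonificationParameters (thresholds are illustrative). */
    static SonificationParameters map(ObservationStates obs) {
        SonificationParameters sp = new SonificationParameters();
        // Higher arousal (heart rate, skin conductance) -> slower, calming tempo and a major key.
        boolean highArousal = obs.heartRate > 100 || obs.skinConductance > 8.0;
        sp.tempoBpm = highArousal ? 70 : 110;
        sp.key = highArousal ? "Cmaj" : "Amin";
        // Large lane deviation -> fire an intermittent warning-style sonification.
        sp.intermittentCue = Math.abs(obs.laneDeviation) > 0.5;
        return sp;
    }

    public static void main(String[] args) {
        ObservationStates obs = new ObservationStates();
        obs.heartRate = 112;
        obs.laneDeviation = 0.7;
        obs.skinConductance = 9.2;
        SonificationParameters sp = map(obs);
        System.out.printf("tempo=%d bpm, key=%s, intermittentCue=%b%n",
                sp.tempoBpm, sp.key, sp.intermittentCue);
    }
}
```

A mapping like this could then drive a JFugue-based sonifier of the kind sketched earlier, choosing between intermittent cues and continuous soundscapes.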
Assistive Technologies & Accessible Computing
01. Navigation for Blind
02. Digital Literacy for OAs
03. SocialBot for Autism
Goal: Facilitate social and emotional interaction of children with ASD using physical and musical stimuli
Emotion Recognition Research
Research Concept Diagram
"How much they questioned the nature of art?"
"What they added to the conception of art?"
Research Aspects
• Platform-free sonification server
• Estimation of a child's affective states and overall interaction patterns with a robot
• Robotic learning of human behaviors for increasing the engagement
Research in Progress
Research: Robot Acceptance, Human-Robot Team Interaction
Thank You