Human-Robot Interaction
Elective in Artificial Intelligence
Lecture 9 – Motion Control for HRI
Luca Iocchi, DIAG, Sapienza University of Rome, Italy

Outline

Robot motion: the main feature distinguishing HRI from HCI
• Human-robot distance control
• Proxemics
• Human-aware/social navigation
• Gesture production
• Physical HRI
Robot Navigation

A well-known problem with several established solutions.
Interactive path-finding demo: https://qiao.github.io/PathFinding.js/visual/

Robot Navigation

People are not obstacles! Treating people as plain obstacles leads to:
• no yielding
• blocking people (deadlocks or slow decision making)
• collisions due to unexpected behaviors
(see the sketch below)
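To make the point concrete, here is a minimal sketch (not from the lecture) of one common remedy: keep people out of the hard-obstacle map and instead add a soft "social cost" around each detected person, so the planner prefers paths that keep a respectful distance. The planner below is plain uniform-cost search on a grid; the grid size, cost values and person positions are illustrative assumptions.

```python
# Minimal sketch: grid planning where people add a soft "social cost"
# instead of being hard obstacles. All numeric values are illustrative.
import heapq
import math

def social_cost(cell, people, radius=3.0, weight=10.0):
    """Extra traversal cost near people: high close by, fading with distance."""
    cost = 0.0
    for p in people:
        d = math.dist(cell, p)
        if d < radius:
            cost += weight * (1.0 - d / radius)
    return cost

def plan(grid, start, goal, people):
    """Uniform-cost search over a 4-connected grid; grid[r][c] == 1 is a wall."""
    rows, cols = len(grid), len(grid[0])
    frontier = [(0.0, start, [start])]
    visited = set()
    while frontier:
        cost, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        if cell in visited:
            continue
        visited.add(cell)
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                step = 1.0 + social_cost((nr, nc), people)
                heapq.heappush(frontier, (cost + step, (nr, nc), path + [(nr, nc)]))
    return None

grid = [[0] * 10 for _ in range(10)]
print(plan(grid, (0, 0), (9, 9), people=[(5, 5)]))  # path bends away from (5,5)
```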
Bodily Communication

Non-verbal (NV) signals [Argyle, M., 1988, Bodily Communication]
• Facial expressions
• Posture
• Gaze
• Bodily contact
• Gestures and body movements
• Spatial behaviour

Proxemics

Proxemics: the study of the social use and management of space.
In HRI:
• Human-robot distance is a fundamental variable to control.
• The further the robot intrudes into a person's space, the greater the discomfort.
• Proxemics concepts depend on the robot's morphology and size, on the task, on the situation or status of the interaction, etc.
Proxemics

Proxemic zones [Hall, 1966]:
• public zone: > 3.6 m
• social zone: 1.2 m – 3.6 m
• personal zone: 0.45 m – 1.2 m
• intimate zone: < 0.45 m

Proxemics

• Personal space is variable, continuous and context-dependent.
• Social Force model [Helbing and Molnar, 1995] (see the sketch below)
• Personal space can be asymmetric and modulated by motion.
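A minimal sketch (my illustration, not from the lecture) of the Helbing-Molnár social force model: each pedestrian accelerates toward a desired velocity and is pushed away from others by exponentially decaying repulsive forces. The parameter values (tau, A, B) are typical illustrative choices, not the paper's calibrated ones.

```python
# Social force model sketch [Helbing and Molnar, 1995]:
#   dv/dt = (v_desired - v)/tau + sum_j A * exp(-d_ij / B) * (unit vector away from j)
# Parameter values here are illustrative, not the paper's calibration.
import numpy as np

TAU, A, B = 0.5, 2.0, 0.3   # relaxation time [s], repulsion strength, range [m]

def social_force(pos, vel, goal, others, v0=1.3):
    """Acceleration on one pedestrian at `pos` heading toward `goal`."""
    to_goal = goal - pos
    v_des = v0 * to_goal / np.linalg.norm(to_goal)    # desired velocity
    force = (v_des - vel) / TAU                       # driving term
    for other in others:                              # pairwise repulsion
        diff = pos - other
        d = np.linalg.norm(diff)
        if d > 1e-6:
            force += A * np.exp(-d / B) * diff / d
    return force

# One Euler step: a pedestrian walking right, repelled by a bystander.
pos, vel = np.array([0.0, 0.0]), np.array([1.0, 0.0])
acc = social_force(pos, vel, goal=np.array([10.0, 0.0]),
                   others=[np.array([2.0, 0.3])])
vel = vel + 0.1 * acc
print(vel)   # velocity deflects slightly away from the bystander
```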
Proxemics

Personal space (figure)

Proxemics

Information Process Space: the space within which pedestrians take obstacles into account [Kitazawa and Fujiyama, 2010]
Proxemics

Space related to groups of people:
• O-space, the shared interaction space of an F-formation [Kendon, 2010] (a sketch follows below)

Proxemics

Space related to activities:
• Activity space [Lindner and Eschenbach, 2011]
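For illustration only, a hypothetical sketch of how a robot could estimate the o-space center of a group: a common heuristic places it at the mean of points projected a fixed "stride" ahead of each participant along their body orientation. Both the stride value and the heuristic itself are my assumptions, not taken from Kendon's work.

```python
# Hypothetical sketch: approximate the o-space center of an F-formation
# as the mean of points one "stride" ahead of each participant, along
# their body orientation. Heuristic and parameters are illustrative.
import numpy as np

def o_space_center(positions, orientations, stride=0.8):
    """positions: (N,2) array; orientations: N body headings [rad]."""
    pts = [p + stride * np.array([np.cos(th), np.sin(th)])
           for p, th in zip(positions, orientations)]
    return np.mean(pts, axis=0)

# Two people facing each other 2 m apart: the center falls between them,
# i.e., the region a polite robot should not cut through.
pos = np.array([[0.0, 0.0], [2.0, 0.0]])
ori = np.array([0.0, np.pi])          # facing each other
print(o_space_center(pos, ori))       # ~ [1.0, 0.0]
```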
Proxemics

Semantics of space: assigning a meaning to each region of the environment.

Proxemics

Semantics of the space around a person (figure)
Proxemics

Semantics of person-robot distance (a distance-to-zone sketch follows the list below)

Human-Robot Proxemics

• Empirical framework [Walters et al., 2009]
• Perceptual models [Mead and Mataric, 2014]
• Closeness models [Mumm and Mutlu, 2011]
• Qualitative spatial reasoning [Bhatt and Dylla, 2009]
• Cognitive maps
• …
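A minimal sketch of attaching semantics to measured person-robot distance: the thresholds are Hall's zone boundaries from the earlier slide, while the interaction-meaning labels are my own illustrative assumptions.

```python
# Minimal sketch: map a measured person-robot distance to Hall's proxemic
# zone plus a plausible interaction meaning. Thresholds follow Hall [1966];
# the meaning labels are illustrative assumptions.
def proxemic_zone(distance_m):
    if distance_m < 0.45:
        return "intimate", "physical interaction / too close for most tasks"
    if distance_m < 1.2:
        return "personal", "close conversation, object handover"
    if distance_m < 3.6:
        return "social", "normal spoken interaction"
    return "public", "no focused interaction; navigation only"

print(proxemic_zone(0.9))   # ('personal', 'close conversation, object handover')
```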
Human-Robot Proxemics

An empirical framework for human-robot proxemics [Walters et al., 2009]
• Approach distance computed from several proxemic factors (robot appearance, task, user preferences, etc.)
• Base distance = 57 cm
• Adjustment factors in the range -7/+13 cm
(see the sketch below)

Human-Robot Proxemics

Perceptual model of human-robot proxemics [Mead and Mataric, 2014]
How does human-robot positioning influence human-robot interaction?
A Bayesian network models the relationship between pose, speech and gesture.
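A minimal sketch of how such an additive empirical model can be used at runtime. The base distance (57 cm) and the -7/+13 cm adjustment range come from the slide; the specific factor names and values below are hypothetical placeholders, not Walters et al.'s measured coefficients.

```python
# Additive approach-distance sketch in the spirit of [Walters et al., 2009].
# Base distance and the -7/+13 cm range are from the lecture; the factor
# names and values are hypothetical placeholders.
BASE_CM = 57.0

# Hypothetical per-factor adjustments in cm (positive = stop farther away).
ADJUSTMENTS_CM = {
    "tall_robot": +8.0,
    "user_has_robot_experience": -7.0,
    "verbal_interaction_task": +5.0,
}

def approach_distance_cm(active_factors):
    delta = sum(ADJUSTMENTS_CM[f] for f in active_factors)
    delta = max(-7.0, min(13.0, delta))   # clamp to the reported range
    return BASE_CM + delta

print(approach_distance_cm({"tall_robot", "verbal_interaction_task"}))  # 70.0
```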
Human-Robot Proxemics

[Mead and Mataric, 2014] Speech output over distance (figure)

Human-Robot Proxemics

[Mead and Mataric, 2014] Speech/gesture recognition rate over distance (figure)
Closeness Models

Human-Robot Proxemics: Physical and Psychological Distancing in Human-Robot Interaction [Mumm and Mutlu, 2011]
Other factors: cultural background, ethnic group, gender, age, physical attractiveness, body orientation.

Human-Robot Closeness Models

[Mumm and Mutlu, 2011]
• Participants maintain a greater distance from the robot when it maintains eye contact with them than when it avoids their gaze.
• Participants maintain a greater distance from a disliked robot when it maintains eye contact than when it avoids gaze, while distance from a liked robot is not affected by eye contact.
• Participants disclose more to the liked robot, independently of other factors (e.g., gaze).
The human-human model is only partially supported in human-robot experiments.
Qualitative Spatial Reasoning

The Qualitative Trajectory Calculus (QTC) represents the relative motion of two points over a time interval with respect to the reference line that connects them on a 2D plane.

Qualitative Spatial Reasoning

A QTC state is a tuple (q1, q2, q3, q4):
• q1: movement of H with respect to R (towards / away / stable)
• q2: movement of R with respect to H (towards / away / stable)
• q3: movement of H with respect to the line HR (left / right / along)
• q4: movement of R with respect to the line RH (left / right / along)
(see the sketch below)
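A minimal sketch of how the four components can be computed from measured positions and velocities of H and R. The epsilon threshold and the exact sign conventions are my assumptions; consult a QTC reference (e.g., Bellotto et al., 2013) for the canonical definition.

```python
# Compute a QTC state (q1,q2,q3,q4) for human H and robot R from their
# positions and velocities. Sign conventions ('-' towards / '+' away;
# '-' left / '+' right) are assumed; verify against your QTC reference.
import numpy as np

EPS = 1e-3  # below this, motion is treated as qualitatively zero ('0')

def qsign(x):
    return "0" if abs(x) < EPS else ("-" if x < 0 else "+")

def qtc_state(pH, vH, pR, vR):
    u = (pR - pH) / np.linalg.norm(pR - pH)        # unit vector H -> R
    q1 = qsign(-np.dot(vH, u))                     # H moving towards R => '-'
    q2 = qsign(-np.dot(vR, -u))                    # R moving towards H => '-'
    # Side of the reference line: sign of the 2D cross product.
    q3 = qsign(-(u[0] * vH[1] - u[1] * vH[0]))             # H wrt line HR
    q4 = qsign(-((-u[0]) * vR[1] - (-u[1]) * vR[0]))       # R wrt line RH
    return (q1, q2, q3, q4)

# H walks straight at a standing R: only q1 is '-' (approaching).
print(qtc_state(np.array([0., 0.]), np.array([1., 0.]),
                np.array([5., 0.]), np.array([0., 0.])))  # ('-','0','0','0')
```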
Qualitative Spatial Reasoning

(figure)

Qualitative Spatial Reasoning

Reasoning with QTC: QTC-MM [Bellotto et al., 2013]
Human-Aware Navigation

Autonomous safe navigation following social norms:
• Depends on the task
• Unfocused vs. focused interaction (person identification)

Human-Aware Navigation

Unfocused interactions:
• Avoidance [many works]
• Passing people [Pacchierotti et al., 2006]
• Staying in line [Nakauchi and Simmons, 2000]
Human-Aware Navigation

Focused interactions:
• Approach person [Carton et al., 2012]
• Follow person [Gockley et al., 2007]
• Walking side-by-side [Morales Saiki et al., 2012]
• Guiding [Martin et al., 2004; Hoeller et al., 2007]

Human-Aware Navigation

Human-robot encounters in a hallway (see the sketch after this list):
1. Upon entering the person's social space, move to the right (w.r.t. the robot's reference frame) to signal that the person has been detected.
2. Keep to the right of the hallway while passing the person.
3. Return to normal navigation after the person has fully passed by.
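A minimal sketch of the hallway-passing behavior as a three-state machine. The 3.6 m threshold is Hall's social-zone boundary; the lateral offset value is an illustrative assumption.

```python
# Hallway-passing behavior as a three-state machine. The 3.6 m trigger is
# Hall's social-zone boundary; the lateral offset is illustrative.
from enum import Enum, auto

SOCIAL_ZONE_M = 3.6
LATERAL_OFFSET_M = 0.5   # assumed shift toward the right wall

class Mode(Enum):
    NORMAL = auto()
    SIGNAL_RIGHT = auto()   # step right to show the person was detected
    PASSING = auto()        # keep right until the person is behind us

def update(mode, dist_to_person, person_behind):
    """One control tick: returns the next mode and lateral offset [m]."""
    if mode is Mode.NORMAL and dist_to_person < SOCIAL_ZONE_M:
        return Mode.SIGNAL_RIGHT, LATERAL_OFFSET_M
    if mode is Mode.SIGNAL_RIGHT:
        return Mode.PASSING, LATERAL_OFFSET_M
    if mode is Mode.PASSING and person_behind:
        return Mode.NORMAL, 0.0          # resume normal navigation
    return mode, (LATERAL_OFFSET_M if mode is not Mode.NORMAL else 0.0)

mode = Mode.NORMAL
for d, behind in [(5.0, False), (3.0, False), (1.5, False), (1.0, True)]:
    mode, offset = update(mode, d, behind)
    print(d, mode.name, offset)
```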
Human-Aware Navigation

Human-robot encounters in a hallway (figure)

Human-Aware Navigation

Follow person (see the sketch below):
• Person tracking + target following
• Use verbal feedback to inform the person and keep engagement
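A minimal person-following sketch: a proportional controller that keeps a social following distance and turns toward the tracked person. The gains, speed limits and the 1.2 m distance are illustrative assumptions, not from a specific paper.

```python
# Person following: proportional control on distance and bearing.
# Gains, limits and the 1.2 m following distance are illustrative.
import math

FOLLOW_DIST_M = 1.2        # stay around the personal/social boundary
K_LIN, K_ANG = 0.8, 1.5    # proportional gains
V_MAX, W_MAX = 0.6, 1.0    # speed limits [m/s], [rad/s]

def follow_cmd(person_x, person_y):
    """Velocity command from the tracked person's position in the robot
    frame (x forward, y left). Returns (linear, angular)."""
    dist = math.hypot(person_x, person_y)
    bearing = math.atan2(person_y, person_x)
    v = max(0.0, min(V_MAX, K_LIN * (dist - FOLLOW_DIST_M)))  # no backing up
    w = max(-W_MAX, min(W_MAX, K_ANG * bearing))
    return v, w

print(follow_cmd(2.5, 0.4))   # person ahead-left: drive forward, turn left
```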
Gesture Production

[Salem et al., 2012]
• Gestures improve engagement in interaction (several user studies)
• Synchronized voice and gestures
• Synchronized gestures and facial expressions

Gesture Production

Exercises with the Pepper robot (next lecture); a minimal example follows below.
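As a taste of the exercises, a minimal sketch of voice-gesture synchronization on Pepper using the NAOqi Python SDK's ALAnimatedSpeech service, which runs an annotated animation in time with the surrounding words. The robot IP is a placeholder; the SDK must be installed and the named animation available on the robot.

```python
# Synchronized speech and gesture on Pepper via NAOqi's ALAnimatedSpeech
# service (requires the NAOqi Python SDK; replace the IP placeholder).
import qi

session = qi.Session()
session.connect("tcp://<robot-ip>:9559")

animated_speech = session.service("ALAnimatedSpeech")
# The ^start(...) annotation launches the named animation while speaking,
# so the gesture is timed with the words around it.
animated_speech.say(
    "Hello! ^start(animations/Stand/Gestures/Hey_1) Nice to meet you!"
)
```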
Facial Expression Production

KOBIAN at Waseda University (figure)

Face and Gesture Production

KOBIAN at Waseda University (figure)
Physical HRI

See Prof. A. De Luca's Robotics 2 course.
• Main focus is on safety
• Variable stiffness actuators

References

Bellotto, N., Hanheide, M., and Van de Weghe, N. Qualitative design and implementation of human-robot spatial interactions. In Proc. of the Int. Conf. on Social Robotics (ICSR), 2013.
Bhatt, M. and Dylla, F. A qualitative model of dynamic scene analysis and interpretation in ambient intelligence systems. International Journal of Robotics and Automation, 24(3), 2009.
Hall, E. T. The Hidden Dimension. Doubleday, New York, 1966.
Helbing, D. and Molnar, P. Social force model for pedestrian dynamics. Physical Review E, 51:4282, 1995.
Hoeller, F., Schulz, D., Moors, M., and Schneider, F. E. Accompanying persons with a mobile robot using motion prediction and probabilistic roadmaps. In Proc. of the Int. Conf. on Intelligent Robots and Systems (IROS), 2007.
Kendon, A. Spacing and orientation in co-present interaction. In Development of Multimodal Interfaces: Active Listening and Synchrony, LNCS 5967, 2010.
Lindner, F. and Eschenbach, C. Towards a formalization of social spaces for socially aware robots. In Proc. of the 10th Int. Conf. on Spatial Information Theory (COSIT), 2011.
Martin, C., Bohme, H. J., and Gross, H. M. Conception and realization of a multi-sensory interactive mobile office guide. In IEEE Int. Conf. on Systems, Man and Cybernetics, vol. 6, 2004.
Mead, R. and Matarić, M. J. Perceptual models of human-robot proxemics. In Proc. of the Int. Symposium on Experimental Robotics (ISER), 2014.
Mumm, J. and Mutlu, B. Human-robot proxemics: physical and psychological distancing in human-robot interaction. In Proc. of the 6th Int. Conf. on Human-Robot Interaction (HRI), ACM, 2011.
Salem, M., Kopp, S., Wachsmuth, I., Rohlfing, K., and Joublin, F. Generation and evaluation of communicative robot gesture. International Journal of Social Robotics, 4(2):201-217, 2012.
Walters, M. L., Dautenhahn, K., te Boekhorst, R., Koay, K. L., Syrdal, D. S., and Nehaniv, C. L. An empirical framework for human-robot proxemics. In Proc. of New Frontiers in Human-Robot Interaction, 2009.