
Gesture Encoding and Reproduction Method for Text-to-Gesture System for Automatic Robot Gesture Generation


Heon-hui KIM1, Yun-su HA2, Zeung-nam BIEN3, Kwang-hyun PARK4

 

(1. Art & Robotics Institute, Kwangwoon University, Seoul 139-701, Korea;2. Division of Information Technology, Korea Maritime University, Pusan 606-791, Korea; 3. School of Electrical & Computer Engineering, Ulsan National Institute of Science and Technology, Ulsan 689-798, Korea;4. School of Robotics, Kwangwoon University, Seoul 139-701, Korea)

 

Abstract-This paper deals with gesture encoding and reproduction for a text-to-gesture system, which enables humanoid robots to automatically generate appropriate gestures for a given text script. Focusing on smooth, natural motion generation and on the interpolation of multiply overlapped gestures, we propose a gesture model that reflects the unit motions of the preparation, stroke, and retraction phases. Based on this model, we propose a gesture encoding/decoding method that encodes observed gestures both symbolically and parametrically and reproduces robotic gestures from the resulting codes. We also propose a gesture scheduling scheme that naturally reproduces sequential robot gestures synchronized with speech signals. In particular, we give an algorithmic description of how unit gestures are refined when two sequential gestures overlap. Finally, we demonstrate the effectiveness of the proposed system through automatic gesture generation results in a story-telling example.
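
The full encoding scheme and refinement algorithm appear in the body of the paper; as a minimal illustrative sketch only, the Python fragment below shows one plausible reading of the abstract. The names UnitMotion, Gesture, and schedule_pair are hypothetical, as is the specific refinement strategy (dropping the first gesture's retraction and the second gesture's preparation when the two overlap, so the strokes connect directly); the paper's own algorithm may differ.

from dataclasses import dataclass
from typing import List

@dataclass
class UnitMotion:
    # One gesture phase: a label plus a timed joint-space trajectory.
    phase: str                    # "preparation", "stroke", or "retraction"
    start: float                  # onset time (s), aligned to the speech signal
    end: float                    # offset time (s)
    keyframes: List[List[float]]  # joint-angle vectors (placeholder payload)

@dataclass
class Gesture:
    # A decoded gesture code: symbolic label plus parametric unit motions.
    symbol: str                   # e.g. "wave" (hypothetical symbol set)
    units: List[UnitMotion]       # ordered: preparation, stroke, retraction

def phase(g: Gesture, name: str) -> UnitMotion:
    return next(u for u in g.units if u.phase == name)

def schedule_pair(g1: Gesture, g2: Gesture) -> List[UnitMotion]:
    # Assumed overlap refinement: if g2 would start preparing before g1
    # has finished retracting, drop g1's retraction and g2's preparation
    # so the two strokes connect directly; otherwise keep all six units.
    if phase(g2, "preparation").start < phase(g1, "retraction").end:
        return [phase(g1, "preparation"), phase(g1, "stroke"),
                phase(g2, "stroke"), phase(g2, "retraction")]
    return list(g1.units) + list(g2.units)

# Example: "point" starts preparing (t = 1.2 s) before "wave" finishes
# retracting (t = 1.4 s), so the two strokes are joined directly.
wave = Gesture("wave", [
    UnitMotion("preparation", 0.0, 0.4, [[0.0], [0.3]]),
    UnitMotion("stroke",      0.4, 1.0, [[0.3], [1.2]]),
    UnitMotion("retraction",  1.0, 1.4, [[1.2], [0.0]])])
point = Gesture("point", [
    UnitMotion("preparation", 1.2, 1.5, [[0.0], [0.5]]),
    UnitMotion("stroke",      1.5, 2.0, [[0.5], [1.0]]),
    UnitMotion("retraction",  2.0, 2.4, [[1.0], [0.0]])])
print([u.phase for u in schedule_pair(wave, point)])
# -> ['preparation', 'stroke', 'stroke', 'retraction']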

 

Key words-gesture encoding; text-to-gesture (TTG); automatic gesture generation; gesture reproduction

 

Manuscript Number: 1674-8042(2011)supp1-0013-06

 

doi: 10.3969/j.issn.1674-8042.2011.supp1.004

 


 
