
Development of a Musical-based Interaction System for the Waseda Flutist Robot: Implementation of a Real-time Vision Interface Based on the Particle Filter Algorithm

Jorge Solis, Atsuo Takanishi

 

Faculty of Science and Engineering, Waseda University, Tokyo 169-8555, Japan

 

Abstract-The aim of this paper is to create an interface for human-robot interaction through which musical performance parameters (i.e. vibrato expression) of the Waseda Flutist Robot No. 4 Refined IV (WF-4RIV) can be manipulated. This research focuses on enabling the WF-4RIV to interact with human players (musicians) in a natural way. In this paper, as a first approach, a vision processing algorithm was developed that tracks the 3D orientation and position of a musical instrument. In particular, the robot acquires image data through two cameras attached to its head. Using color histogram matching and a particle filter, the positions of the musician's hands on the instrument are tracked; analysis of this data determines the orientation and location of the instrument. These parameters are mapped to the musical expression of the WF-4RIV, more specifically to its sound vibrato and volume values. The authors present preliminary experiments to determine whether the robot can dynamically change musical parameters (i.e. vibrato etc.) while interacting with a human player. The experimental results confirm the feasibility of the interaction during performance, although further research must be carried out to take the physical constraints of the flutist robot into account.
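To illustrate the kind of pipeline the abstract describes, the following is a minimal single-camera sketch of color-histogram-weighted particle filtering followed by a mapping from the tracked position to expression commands. It is not the authors' implementation: the paper fuses two camera views to recover 3D instrument orientation, whereas this sketch tracks one hand region in 2D, and the names and constants (NUM_PARTICLES, MOTION_STD, map_to_expression, the Gaussian weighting of the Bhattacharyya distance) are illustrative assumptions.

import numpy as np
import cv2

NUM_PARTICLES = 200    # assumed particle count
MOTION_STD = 8.0       # assumed random-walk noise, in pixels
PATCH = 15             # half-size of the patch evaluated per particle

def hsv_histogram(image_bgr, box):
    """Hue-saturation histogram of a region; serves as the color model."""
    x, y, w, h = box
    roi = cv2.cvtColor(image_bgr[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([roi], [0, 1], None, [30, 32], [0, 180, 0, 256])
    return cv2.normalize(hist, hist).flatten()

class ColorParticleFilter:
    def __init__(self, frame, init_box):
        # Reference histogram taken from a manually chosen hand region.
        self.ref_hist = hsv_histogram(frame, init_box)
        x, y, w, h = init_box
        cx, cy = x + w // 2, y + h // 2
        self.particles = np.tile([cx, cy], (NUM_PARTICLES, 1)).astype(float)

    def step(self, frame):
        h, w = frame.shape[:2]
        # 1) Predict: random-walk motion model, clipped to the image.
        self.particles += np.random.normal(0, MOTION_STD, self.particles.shape)
        self.particles[:, 0] = np.clip(self.particles[:, 0], PATCH, w - PATCH - 1)
        self.particles[:, 1] = np.clip(self.particles[:, 1], PATCH, h - PATCH - 1)
        # 2) Weight: histogram similarity (Bhattacharyya distance) to the model.
        weights = np.empty(NUM_PARTICLES)
        for i, (px, py) in enumerate(self.particles.astype(int)):
            box = (px - PATCH, py - PATCH, 2 * PATCH, 2 * PATCH)
            d = cv2.compareHist(self.ref_hist, hsv_histogram(frame, box),
                                cv2.HISTCMP_BHATTACHARYYA)
            weights[i] = np.exp(-20.0 * d ** 2)
        weights /= weights.sum()
        # 3) Estimate: weighted mean of the particle cloud.
        estimate = weights @ self.particles
        # 4) Resample: multinomial resampling to counter weight degeneracy.
        idx = np.random.choice(NUM_PARTICLES, NUM_PARTICLES, p=weights)
        self.particles = self.particles[idx]
        return estimate  # (x, y) hand position in image coordinates

def map_to_expression(hand_xy, frame_shape):
    """Hypothetical mapping: horizontal position -> vibrato depth,
    vertical position -> volume, both normalized to [0, 1]."""
    h, w = frame_shape[:2]
    vibrato = hand_xy[0] / w
    volume = 1.0 - hand_xy[1] / h
    return vibrato, volume

In the actual system, the two tracked hand positions from the stereo pair would be combined to estimate the instrument's 3D pose before mapping to the robot's vibrato and volume controls; the direct position-to-parameter mapping above is only a stand-in for that step.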

 

Key words-humanoids; human-robot interaction; vision; music

 

Manuscript Number: 1674-8042(2010)02-0133-05

 

doi: 10.3969/j.issn.1674-8042.2010.02.08

 


