| Graduate Student: | 趙凱泓 (Kai-hung Chao) |
|---|---|
| Thesis Title: | 嵌入式機器人於人體姿態辨識與模仿之實現 (Realization of human postures recognition and imitation on embedded robot) |
| Advisor: | 王文俊 (Wen-june Wang) |
| Degree: | Master |
| Department: | College of Electrical Engineering and Computer Science, Department of Electrical Engineering |
| Graduation Academic Year: | 98 (ROC calendar, i.e., 2009–2010) |
| Language: | Chinese |
| Pages: | 46 |
| Keywords (Chinese): | Fuzzy controller, accelerometer, humanoid robot, image processing, human motion recognition |
| Keywords (English): | Humanoid robot, G-sensor, Fuzzy controller, Human motion imitation, Image processing |
This thesis designs a human motion recognition system that recognizes and records human motions through image processing; while recording, a humanoid robot and a 3D model imitate the motions in real time. In addition, while imitating, the humanoid robot keeps its balance autonomously by means of an accelerometer. The robot uses a camera as its eyes: through image processing it correctly identifies the special marks on the human body against a complex background, and from the relative positions of those marks it recognizes and memorizes the motion sequence. A humanoid multi-link model is built from the coordinates of the mark points, and the mark trajectories stored in the database then drive the robot to imitate the human's motions. The imitated motions include a variety of arm motions and lower-body motions. The objectives of this thesis are to recognize human motions and to imitate them in real time with both a 3D humanoid model and the robot. The experimental results show that the proposed method can indeed compute the human's posture in real time from the positions of the marks on the body, and then display the motion through the 3D humanoid model and the real robot simultaneously.
Moreover, it is quite difficult for a robot to imitate human motions in real time while keeping its balance. To prevent the robot from tipping over due to excessive tilt while imitating human motions, this thesis uses accelerometer feedback to design a fuzzy controller that keeps the robot balanced.
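The mark recognition described above can be sketched as a chroma threshold followed by a centroid computation. The color space, threshold ranges, and pixel layout below are illustrative assumptions, not the thesis's actual parameters:

```python
def is_mark_pixel(u, v, u_range=(0, 110), v_range=(160, 255)):
    # Threshold only the chroma channels (U, V); ignoring luma (Y)
    # makes the test more tolerant of lighting changes in a cluttered
    # background.  The ranges here are hypothetical.
    return u_range[0] <= u <= u_range[1] and v_range[0] <= v <= v_range[1]

def mark_centroid(pixels):
    """pixels: iterable of (x, y, Y, U, V) tuples for one frame.
    Returns the centroid (x, y) of mark-colored pixels, or None."""
    hits = [(x, y) for x, y, _Y, u, v in pixels if is_mark_pixel(u, v)]
    if not hits:
        return None
    n = len(hits)
    return (sum(x for x, _ in hits) / n, sum(y for _, y in hits) / n)
```

The centroid of each mark then serves as one joint coordinate for the multi-link model.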
The goal of this thesis is to design a human motion imitation system that imitates human motions by recognizing color marks pasted on the human body against any complex background. With the proposed system, the human motions are shown in real time on the monitor while the humanoid robot imitates them simultaneously. Through image processing, the system computes the joint angles of the human motion and applies them to a 3D human model to reproduce the motion. The imitated motions include "raise hand," "kick forward," "kick sideward," and "stand on one foot," among others. The experimental results show that human motions can be recognized by image processing and imitated by the 3D human model and a real humanoid robot at the same time. The construction of the 3D human model and the interface between the human and the robot are also presented.
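One way to obtain the joint angles mentioned above is to take, at each joint, the planar angle between the two limb segments meeting there, computed from three mark coordinates. The function below is a minimal sketch under that assumption; it is not taken from the thesis:

```python
import math

def joint_angle(p_prox, p_joint, p_dist):
    """Angle (degrees) at p_joint between segments p_joint->p_prox and
    p_joint->p_dist, where each point is an (x, y) mark coordinate."""
    ax, ay = p_prox[0] - p_joint[0], p_prox[1] - p_joint[1]
    bx, by = p_dist[0] - p_joint[0], p_dist[1] - p_joint[1]
    cos_t = (ax * bx + ay * by) / (math.hypot(ax, ay) * math.hypot(bx, by))
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))
```

For example, an elbow angle would come from the shoulder, elbow, and wrist mark centroids; the resulting angles drive both the 3D model and the robot's servos.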
Moreover, it is difficult for a humanoid robot to imitate human motions and keep its balance simultaneously. To prevent the robot from falling while it imitates human motions, a fuzzy controller using G-sensor feedback is designed to guarantee the robot's stability.
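The balance scheme can be illustrated with a small fuzzy controller: triangular membership functions fuzzify the G-sensor tilt reading, and a weighted average of singleton consequents gives a corrective ankle offset. All rule breakpoints and output values below are hypothetical, not the thesis's tuned values:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_balance(tilt_deg):
    """Map a G-sensor tilt reading (degrees) to a corrective ankle
    offset (degrees) via singleton consequents and weighted-average
    defuzzification.  Rule parameters are illustrative."""
    rules = [
        (lambda t: tri(t, -30.0, -15.0, 0.0), 10.0),   # leaning backward -> push forward
        (lambda t: tri(t, -15.0, 0.0, 15.0), 0.0),     # near upright -> no correction
        (lambda t: tri(t, 0.0, 15.0, 30.0), -10.0),    # leaning forward -> push backward
    ]
    num = den = 0.0
    for mu_fn, out in rules:
        mu = mu_fn(tilt_deg)
        num += mu * out
        den += mu
    return num / den if den else 0.0
```

Running in the control loop, this correction is superimposed on the imitation posture so the robot leans against the measured tilt.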