| Graduate Student: | 李冠億 (Kuan-Yi Lee) |
|---|---|
| Thesis Title: | 基於中文自然語言理解的機械手臂控制 (Robotic Arm Control Based on Chinese Natural Language Understanding) |
| Advisor: | 陳慶瀚 (Ching-Han Chen) |
| Committee Members: | |
| Degree: | Master |
| Department: | 資訊電機學院 - 資訊工程學系在職專班 (Executive Master of Computer Science & Information Engineering) |
| Year of Publication: | 2018 |
| Graduation Academic Year: | 106 (ROC calendar) |
| Language: | Chinese |
| Pages: | 96 |
| Chinese Keywords: | 自然語言、人機互動、語音命令 |
| English Keywords: | Natural Language, Human-Computer Interaction, Voice Command |
Under the trend led by artificial intelligence, service robots are becoming increasingly common in daily life, and human-robot interaction has become an important research topic. For Chinese speakers, interacting with machines through spoken Chinese is the most natural approach, but traditional voice-control technology maps fixed utterances to single actions: it lacks flexibility in everyday use, and the resulting systems are difficult to maintain and extend. This study designs a Chinese voice-control system based on natural language processing and applies it to the control of a six-axis (6-DOF) robot arm. The system combines middleware for automatic speech recognition, Chinese word segmentation, and syntactic parsing with the reference-tree structure proposed in this thesis to realize Chinese voice control of the arm. Experimental results confirm that voice commands from different speakers, phrased with different sentence structures but carrying the same semantics, cause the robot arm to perform the action that correctly matches the intended meaning. We also provide a framework for extending the voice-command set, which gives the system portability across applications and lets users define personalized commands.
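The pipeline described in the abstract (speech recognition, Chinese word segmentation, syntactic analysis, then mapping to an arm action) can be sketched in a few lines. The sketch below is hypothetical: a toy lexicon with greedy longest-match segmentation stands in for the real ASR and segmentation middleware, and a flat verb/direction lookup table stands in for the thesis's reference-tree structure, which is not reproduced here.

```python
# Hypothetical sketch of the voice-command pipeline described in the abstract.
# The lexicon, segmenter, and action table are toy stand-ins, not the thesis's
# actual middleware or reference tree.

LEXICON = {
    "手臂": "NOUN",   # "arm"
    "向左": "DIR",    # "to the left"
    "向右": "DIR",    # "to the right"
    "旋轉": "VERB",   # "rotate"
    "移動": "VERB",   # "move"
}

# (verb, direction) pairs mapped to arm actions.
ACTIONS = {
    ("旋轉", "向左"): "ROTATE_LEFT",
    ("旋轉", "向右"): "ROTATE_RIGHT",
    ("移動", "向左"): "MOVE_LEFT",
    ("移動", "向右"): "MOVE_RIGHT",
}

def segment(text):
    """Greedy longest-match word segmentation over the toy lexicon."""
    words, i = [], 0
    while i < len(text):
        for n in range(min(4, len(text) - i), 0, -1):
            if text[i:i + n] in LEXICON:
                words.append(text[i:i + n])
                i += n
                break
        else:
            i += 1  # skip characters outside the lexicon
    return words

def interpret(text):
    """Map an utterance to an arm action by semantic role, not word order."""
    words = segment(text)
    verb = next((w for w in words if LEXICON[w] == "VERB"), None)
    direction = next((w for w in words if LEXICON[w] == "DIR"), None)
    return ACTIONS.get((verb, direction), "UNKNOWN")
```

Because interpretation keys on semantic roles rather than word order, differently structured utterances such as 「手臂向左旋轉」 and 「向左旋轉手臂」 both resolve to `ROTATE_LEFT`, illustrating the "same semantics, different syntax" behavior the abstract describes.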