| Graduate Student: | 張雅如 Ya-Ju Chang |
|---|---|
| Thesis Title: | A Virtual Keyboard and Mouse Based on Kinect Image Processing (基於Kinect影像處理之虛擬鍵盤與滑鼠) |
| Advisor: | 王文俊 |
| Oral Defense Committee: | |
| Degree: | Master (碩士) |
| Department: | College of Electrical Engineering and Computer Science, Department of Electrical Engineering |
| Year of Publication: | 2014 |
| Academic Year of Graduation: | 102 |
| Language: | Chinese |
| Pages: | 87 |
| Keywords: | Kinect, virtual keyboard (虛擬鍵盤), virtual mouse (虛擬滑鼠) |
| Access Count: | Views: 9, Downloads: 0 |
This thesis uses a Kinect sensor and a micro-projector, together with image-processing techniques, to implement a virtual keyboard, a virtual mouse, and a scanning function for a real computer. The virtual keyboard and mouse are realized by projecting a keyboard-and-mouse interface onto a tabletop with the micro-projector; the Kinect then captures color (skin) and depth images of the user's fingers, and image-processing techniques allow the user to operate the virtual keyboard and mouse just as they would physical devices. For the scanning function, the user touches two points on the projected interface; these points are taken as the diagonal corners of a rectangular region, which the Kinect then scans as a color image. The scanned image is reduced in size and displayed on the virtual interface, where the user can click and drag it with a finger to save it, delete it, or display it on the main screen. This scanning function lets users digitize messy handwritten notes on the spot, without any additional camera equipment.
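The pipeline described above can be sketched in a few small functions. This is only an illustrative sketch: the skin-color rule below is a common RGB heuristic, and the touch tolerance, function names, and downscale factor are assumptions, not values taken from the thesis.

```python
def is_skin_rgb(r, g, b):
    """Classify a pixel as skin with a common RGB rule of thumb.
    The thesis may use a different color model or thresholds."""
    return (r > 95 and g > 40 and b > 20 and
            max(r, g, b) - min(r, g, b) > 15 and
            abs(r - g) > 15 and r > g and r > b)

def is_touch(finger_depth_mm, table_depth_mm, tolerance_mm=10):
    """Treat a fingertip as 'touching' the projected surface when its
    Kinect depth is within a small tolerance of the tabletop depth.
    The 10 mm tolerance is an illustrative assumption."""
    return abs(finger_depth_mm - table_depth_mm) <= tolerance_mm

def scan_region(p1, p2):
    """Take two touched points as opposite corners of the rectangle
    to scan; return (x, y, width, height) of that region."""
    x, y = min(p1[0], p2[0]), min(p1[1], p2[1])
    w, h = abs(p1[0] - p2[0]), abs(p1[1] - p2[1])
    return x, y, w, h

def thumbnail_size(size, factor=4):
    """Shrink the scanned image dimensions for display as a thumbnail
    on the virtual interface (factor chosen arbitrarily here)."""
    w, h = size
    return max(1, w // factor), max(1, h // factor)
```

For example, touching the projected surface at (300, 120) and (100, 400) would define a 200 x 280 scan region anchored at (100, 120), and a 640 x 480 scan would be shown as a 160 x 120 thumbnail.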