
Graduate Student: Ya-Ju Chang (張雅如)
Thesis Title: Virtual Keyboard and Mouse Based on Kinect Image Processing
Advisor: 王文俊
Committee Members:
Degree: Master
Department: Department of Electrical Engineering (資訊電機學院)
Publication Year: 2014
Graduation Academic Year: 102 (2013–2014)
Language: Chinese
Pages: 87
Keywords: Kinect, virtual keyboard, virtual mouse
    This thesis combines a Kinect, a micro-projector, and image-processing techniques to implement a virtual keyboard, a virtual mouse, and a scanning function for a real computer. For the virtual keyboard and mouse, the projector casts keyboard and mouse layouts onto the tabletop; the Kinect captures color and depth images of the user's fingers, and image processing lets the user operate the projected keyboard and mouse as if they were physical devices. For the scanning function, the user taps two points on the projected surface; the Kinect captures the color image of the rectangular region whose diagonal is defined by those two points, and a reduced copy of the scan is displayed on the virtual interface for review. By clicking and dragging a scanned thumbnail with a finger, the user can save it, delete it, or show it on the main screen. This scanning function digitizes messy handwritten notes on the spot, with no separate camera equipment required.


    This study utilizes a Kinect and a micro-projector with image-processing techniques to implement a virtual keyboard, a virtual mouse, and a scanning function for a real computer. To realize the virtual keyboard and mouse, the micro-projector projects a virtual interface onto the table, and the Kinect captures the skin color and depth of the user's finger; image-processing techniques then allow the user to operate the virtual keyboard and mouse as if they were real devices. For the scanning function, the Kinect detects two points that the user's finger marks on the virtual interface; these points serve as the diagonal corners of a rectangular area, which the Kinect then scans. The scanned image is reduced in size and displayed on the virtual interface, where the user can click and drag it with a finger to save it, delete it, or show it on the real monitor. The scanning function can thus capture anything placed on the projected surface without additional camera equipment.
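    The touch-detection idea in the abstract — combining a skin-color mask from the color image with a depth comparison against the calibrated table surface — can be sketched as follows. This is a minimal illustration, not the thesis's actual implementation: the RGB skin rule, the millimeter touch threshold, and all function names are placeholder assumptions.

    ```python
    import numpy as np

    def skin_mask(rgb):
        """One common RGB-rule skin heuristic (the thesis's exact
        color model is not specified on this page)."""
        r = rgb[..., 0].astype(int)
        g = rgb[..., 1].astype(int)
        b = rgb[..., 2].astype(int)
        return (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b)

    def is_touch(depth, background_depth, mask, touch_mm=10):
        """Assume a fingertip 'press' when any skin pixel's depth lies
        within touch_mm of the calibrated table-surface depth."""
        near_surface = np.abs(depth.astype(int) - background_depth.astype(int)) < touch_mm
        return bool(np.any(mask & near_surface))

    # Synthetic frame: a small skin-colored patch resting on the table.
    rgb = np.zeros((48, 64, 3), dtype=np.uint8)
    rgb[10:14, 20:24] = (180, 120, 90)               # skin-like patch
    table = np.full((48, 64), 800, dtype=np.uint16)  # table plane at 800 mm
    depth = table.copy()
    depth[10:14, 20:24] = 804                        # fingertip ~4 mm off the table

    print(is_touch(depth, table, skin_mask(rgb)))    # patch is close enough: True
    ```

    The same two inputs (a per-frame skin mask and a per-pixel depth difference against a stored background) also drive the background-subtraction step listed in Chapter 3.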

    Contents:

    Abstract (Chinese)
    Abstract (English)
    Acknowledgments
    Contents
    List of Figures
    List of Tables
    Chapter 1  Introduction
      1.1 Research Background and Motivation
      1.2 Literature Review
      1.3 Objectives
      1.4 Thesis Organization
    Chapter 2  System Architecture, Hardware, and Software
      2.1 System Architecture
      2.2 Hardware
        2.2.1 Computer
        2.2.2 Kinect
        2.2.3 Projector
      2.3 Software
    Chapter 3  Data Preprocessing and System Initialization
      3.1 Color Space Models and Conversion
      3.2 Foreground Extraction and Image Preprocessing
        3.2.1 Background Subtraction
        3.2.2 Morphological Processing
        3.2.3 Median Filtering
      3.3 Initialization of Projection Position and Thresholds
        3.3.1 Projection Position Calibration
        3.3.2 Assignment of Functional Regions
        3.3.3 User Parameters
      3.4 Background Update
    Chapter 4  Virtual Keyboard, Virtual Mouse, and Scanning Function
      4.1 Virtual Keyboard
        4.1.1 Key Weighting
        4.1.2 Data Compression
      4.2 Virtual Mouse
        4.2.1 Cursor Update
        4.2.2 Left- and Right-Button Functions
        4.2.3 Drag Function
      4.3 Scanning Function
        4.3.1 Capturing and Displaying the Scan Area
        4.3.2 Recording the Scanned Image and Displaying Thumbnails
        4.3.3 Saving, Removing, and Displaying Scanned Images on the Main Screen
        4.3.4 Toggling the Image Display
    Chapter 5  Experimental Results
      5.1 Experimental Environment
      5.2 Results
        5.2.1 Calibration
        5.2.2 Virtual Keyboard
        5.2.3 Virtual Mouse
        5.2.4 Scanning
        5.2.5 Image Display Toggle
        5.2.6 Background Update
      5.3 User Notes
        5.3.1 Calibration Notes
        5.3.2 Finger Detection Notes
        5.3.3 Mouse Movement Notes
        5.3.4 Keyboard Notes
        5.3.5 Scanning Notes
      5.4 Mounting Height and Keyboard Accuracy
    Chapter 6  Conclusion and Future Work
      6.1 Conclusion
      6.2 Future Work
    References
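    The scanning function (Chapter 4.3) crops the rectangle whose diagonal corners are the two points the user tapped, then shrinks the result for display as a thumbnail on the projected interface. A minimal sketch, assuming simple subsampling for the resize; all names and parameters here are illustrative, not the thesis's code:

    ```python
    import numpy as np

    def scan_region(image, p1, p2):
        """Crop the axis-aligned rectangle whose diagonal corners are
        the two tapped points, accepted in any order."""
        (x1, y1), (x2, y2) = p1, p2
        x_lo, x_hi = sorted((x1, x2))
        y_lo, y_hi = sorted((y1, y2))
        return image[y_lo:y_hi + 1, x_lo:x_hi + 1]

    def thumbnail(image, step=4):
        """Shrink the scan by subsampling every step-th pixel so it
        fits on the virtual interface (the thesis may resize differently)."""
        return image[::step, ::step]

    # Synthetic 'page' image with distinct pixel values.
    page = np.arange(120 * 160).reshape(120, 160)
    scan = scan_region(page, (30, 100), (10, 20))   # corners given out of order
    small = thumbnail(scan)
    print(scan.shape, small.shape)                  # (81, 21) (21, 6)
    ```

    Sorting each coordinate pair means the user can tap the two corners in either order, which matches the abstract's description of marking an arbitrary diagonal.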

