
Author: 邱國鈞 (Kuo-Chun Chiu)
Title: The Development of an Eye-Tracking System and Its Applications (追瞳系統之研製及其應用)
Advisor: 蘇木春 (Mu-Chun Su)
Oral defense committee:
Degree: Master
Department: College of Electrical Engineering and Computer Science, Department of Computer Science & Information Engineering
Graduation academic year: 94 (ROC calendar, i.e., 2005)
Language: Chinese
Pages: 75
Chinese keywords: human-machine interface; ALS patients; head-mounted camera; eye-tracking system; assistive device
Foreign keywords: eye gaze, eye-tracking system, head-mount camera
An eye-tracking system detects the gaze direction of the human eye. Because gaze direction clearly reveals where a user's attention lies, eye-tracking systems are widely used in human-computer interfaces, in the analysis and diagnosis of eye diseases, and in research on human cognition. Eye gaze tracking has therefore become one of the most attractive research topics in recent years. There are many different ways to track eye movements, such as infrared oculography, which uses light reflected from the eye, or electrooculography (EOG), which uses physiological signals. Each method has its advantages and drawbacks: some commercialized systems are very expensive; some require attaching extra electrodes to the skin; others demand that the user's head remain completely still during use; and still others require a complicated calibration procedure to be repeated again and again, which makes them highly inconvenient in practice.

This thesis presents a low-cost eye-tracking system that lets people with physical disabilities operate a computer with their gaze. The system uses two inexpensive webcams. The first, with two infrared LEDs mounted beside it, is attached to a headset worn on the user's head; the second is placed below the computer monitor. The first camera captures images of the eye, from which the user's gaze direction is computed: the developed tracking algorithm efficiently locates the pupil center in each frame and tracks the pupil's movement, and a simple but effective mapping method then converts the pupil position into a gaze direction. The second camera compensates for the offset caused by head movement. With this eye-tracking system and the accompanying application software, people with disabilities such as ALS (amyotrophic lateral sclerosis) patients can call for assistance from others, type Chinese and English text, browse the Internet, and control home appliances. Experimental results show that our system tracks the pupil quickly and accurately.
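The pupil-detection pipeline the abstract describes (threshold the dark pupil region, label connected components, take the centroid of the largest blob) can be sketched as below. This is a minimal illustration under assumptions, not the thesis's implementation: the function name and the fixed threshold value are illustrative, and the thesis additionally smooths the image and tunes the threshold adaptively.

```python
from collections import deque

def find_pupil_center(gray, threshold=60):
    """Estimate the pupil center in a grayscale eye image.

    gray: 2-D list of intensity values (0-255). The pupil appears as
    the darkest large blob, so we threshold, run 4-connected component
    labeling via BFS, keep the largest component, and return its
    centroid as (row, col). Returns None if no pixel is dark enough.
    """
    rows, cols = len(gray), len(gray[0])
    mask = [[gray[r][c] < threshold for c in range(cols)] for r in range(rows)]
    seen = [[False] * cols for _ in range(rows)]
    best = []                                # pixels of the largest blob so far
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                blob, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:                 # BFS over 4-connected dark pixels
                    y, x = queue.popleft()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if len(blob) > len(best):
                    best = blob
    if not best:
        return None
    cy = sum(p[0] for p in best) / len(best)  # centroid row
    cx = sum(p[1] for p in best) / len(best)  # centroid column
    return cy, cx
```

Keeping only the largest component is what makes the thresholding step robust to small dark specks such as eyelashes or shadows.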


Eye gaze tracking systems estimate the gaze direction of a human user. Since eye gaze direction expresses the interest of a user, the applications of eye gaze systems range widely, from human-computer interaction and eye disease diagnosis to the study of human cognition. Eye gaze tracking has therefore been a very attractive research topic in recent years. There are several different ways to track eye movements, such as methods based on the reflection of light or on the electrooculographic potential (EOG). Each approach has its own advantages and disadvantages. For example, some systems are commercially available, but they are usually very expensive; some require attaching electrodes; others require the user to keep his or her head almost completely still, or a complex calibration process must be repeated again and again.
In this thesis, a low-cost eye gaze tracking system is proposed. With it, people with severe disabilities can access computers via eye gaze. The system consists of two low-cost web cameras. The first, surrounded by two infrared LEDs, is attached to a headset worn on the user's head; the second sits under the computer monitor. The images acquired by the first camera are used to compute eye gaze directions: an efficient gaze tracking algorithm locates the center of the pupil in each image and then tracks the pupil's movements, and a simple but effective eye gaze mapping algorithm converts the pupil position into a gaze direction. The images acquired by the second camera are used to compensate for head movements. Via the eye gaze tracking system and its application software, people with severe disabilities, such as those with severe cerebral palsy or amyotrophic lateral sclerosis (ALS), can ask for help, type text, browse the Internet, and control home appliances. Experimental results demonstrate the performance of the proposed system.
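The gaze mapping step can be sketched as linear interpolation between calibration extremes: the user fixates known screen positions once, and subsequent pupil positions are interpolated to screen coordinates. This is a simplified stand-in for the thesis's linear grid-point mapping (which calibrates over a grid of points); the function name and the four-extreme calibration format are illustrative assumptions.

```python
def gaze_to_screen(pupil, cal, screen_w, screen_h):
    """Map a pupil-center position (x, y) to a screen coordinate.

    cal = (left, right, top, bottom): the pupil x-coordinates recorded
    while the user fixated the left/right screen edges, and the pupil
    y-coordinates recorded at the top/bottom edges. Horizontal and
    vertical axes are assumed independent for this sketch.
    """
    left, right, top, bottom = cal
    u = (pupil[0] - left) / (right - left)   # 0..1 across the screen
    v = (pupil[1] - top) / (bottom - top)    # 0..1 down the screen
    # Clamp so measurement noise outside the calibration range
    # still lands on the screen.
    u = min(max(u, 0.0), 1.0)
    v = min(max(v, 0.0), 1.0)
    return u * screen_w, v * screen_h
```

For example, a pupil position halfway between the calibration extremes maps to the center of the screen. A finer calibration grid, as in the thesis, corrects for the nonlinearity of real eye movement that this single linear segment ignores.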

Abstract (Chinese)
Abstract (English)
Acknowledgements
Contents
List of Figures
List of Tables
Chapter 1  Introduction
    1.1  Motivation
    1.2  Objectives
    1.3  Thesis organization
Chapter 2  Survey of Eye-Tracking Approaches
    2.1  Search coil method
    2.2  Infrared video systems
    2.3  Electrooculography (EOG)
    2.4  Infrared oculography
    2.5  Optical pupil-position tracking systems
    2.6  Purkinje image tracking
Chapter 3  Methods and Procedures
    3.1  Pupil center detection
        3.1.1  Smoothing filter
        3.1.2  Thresholding
        3.1.3  Connected-component labeling
        3.1.4  Computing the pupil center
    3.2  Tuning system parameters
        3.2.1  Adjusting the processing frame size
        3.2.2  Adjusting the threshold value
        3.2.3  Recording the average pupil size
        3.2.4  Blink detection
    3.3  Pupil-center coordinate transformation
        3.3.1  Equation-based transformation
        3.3.2  Cross-ratio method
        3.3.3  Linear grid-point mapping
    3.4  Motion detection and correction
        3.4.1  Motion detection
        3.4.2  Motion correction
Chapter 4  The Eye-Tracking System's User Interface
    4.1  Hardware environment
    4.2  System operation flow
    4.3  Main system functions
        4.3.1  Chinese speech database
        4.3.2  Chinese/English text input system
        4.3.3  Audio/video playback system
        4.3.4  Remote-assistance request system
        4.3.5  Message sending system
        4.3.6  Web browsing system
        4.3.7  Home-appliance control system
Chapter 5  Experimental Results
    5.1  Pupil detection experiments
    5.2  Comparison of mapping methods
    5.3  Typing test
    5.4  Typing-time test
    5.5  Horizontal movement correction test
Chapter 6  Conclusions and Future Work
    6.1  Conclusions
    6.2  Future work
References

