
Author: Bo-Ren Zheng (鄭博仁)
Thesis title: Implementation of an Eye-Tracking System Using Kernelized Correlation Filter (以核方法化的相關濾波器之物件追蹤方法實作眼動儀系統)
Advisor: Yung-Hui Li (栗永徽)
Committee members:
Degree: Master
Department: Department of Computer Science & Information Engineering, College of Electrical Engineering and Computer Science
Year of publication: 2016
Graduation academic year: 105 (ROC calendar)
Language: Chinese
Pages: 66
Chinese keywords: 眼動儀、相關濾波器、核方法
Keywords: eye tracking, correlation filter, kernel method
Chinese abstract (translated): In recent years, the eye tracker has become a device commonly used in fields such as psychological analysis, disease diagnosis, and advertisement-placement analysis. In this study we built a wearable eye-tracking system. A Microsoft HD-6000 webcam was modified into an infrared camera that can clearly capture the boundary between the iris and the pupil, and a custom mount attaches the camera to a glasses frame so that it records infrared images of the eye. The most important function of an eye-tracking system is to detect the pupil position correctly. Our method uses a kernelized correlation filter (KCF) object-tracking algorithm to track the pupil: the tracker finds the approximate pupil center, a circle-fitting method then recovers the exact pupil center and radius, and a projective transformation maps the pupil position to the corresponding location on the screen. In our experiments, the average error against manually annotated pupil centers is only 2.02 pixels and the radius error is about 1 pixel. Processing one image takes only 0.0295 seconds, equivalent to about 33.9 frames per second, faster than the 30 FPS that a typical camera delivers, making this a fast and accurate eye-tracking system.
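The tracker named in the abstract is the kernelized correlation filter (KCF) of Henriques et al. As a sketch rather than the thesis's own code, here is a minimal one-dimensional NumPy version of the train/detect cycle with a Gaussian kernel; the signals, sigma, and lambda values are illustrative, and the real system works on 2-D eye-image patches with HOG features:

```python
import numpy as np

def gaussian_correlation(x, z, sigma=0.2):
    # Gaussian kernel evaluated between x and every cyclic shift of z,
    # computed in the Fourier domain (O(n log n) instead of O(n^2)).
    n = x.size
    xz = np.fft.ifft(np.conj(np.fft.fft(x)) * np.fft.fft(z)).real
    d = x @ x + z @ z - 2.0 * xz            # squared distances to all shifts
    return np.exp(-np.maximum(d, 0) / (sigma ** 2 * n))

def kcf_train(x, y, sigma=0.2, lam=1e-4):
    # Kernel ridge regression in the dual: alpha_hat = y_hat / (k_hat + lambda).
    k = gaussian_correlation(x, x, sigma)
    return np.fft.fft(y) / (np.fft.fft(k) + lam)

def kcf_detect(alpha_hat, x, z, sigma=0.2):
    # Response over all cyclic shifts of the new patch z;
    # the argmax of the response gives the estimated translation.
    k = gaussian_correlation(x, z, sigma)
    return np.fft.ifft(np.fft.fft(k) * alpha_hat).real
```

Training regresses all cyclic shifts of the template onto a target peaked at zero shift; at detection time the response peaks at the shift that best aligns the new patch with the learned appearance.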


English abstract: In recent years, eye tracking has come into use in areas such as psychology, human-computer interfaces, and e-learning. In this study we built a wearable eye-tracking system. We hand-made an IR camera by modifying a commercially available webcam (Microsoft HD-6000) and mounted it on a customized glasses frame. This device is a wearable eye tracker able to record a clear video of eye movement while the user wears the glasses. The most important feature of an effective eye tracker is to locate and track the pupil correctly in real time. In this research we used a Kernelized Correlation Filter (KCF) to implement pupil tracking. By combining KCF with a self-developed circle-fitting algorithm, we can detect and track the pupil location accurately. In our experiments, comparing manual and automatic detection, the average error of the pupil center is 2.02 pixels and the average error of the pupil radius is 1.1 pixels. In terms of execution speed, it takes only 0.0295 seconds to process one image, equivalent to 33.9 FPS (frames per second). Therefore our eye-tracking system is fast enough to fulfill the real-time requirement and is ready to be used in many practical situations.
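The circle-fitting algorithm mentioned above is self-developed and its details are not given in the abstract. As a generic stand-in for the refinement step it performs, a standard algebraic least-squares circle fit (the Kåsa method) recovers a center and radius from candidate pupil-boundary points:

```python
import numpy as np

def fit_circle(points):
    # Algebraic least-squares circle fit (Kasa method): the circle equation
    # x^2 + y^2 = 2*a*x + 2*b*y + c is linear in (a, b, c), with
    # center (a, b) and radius sqrt(c + a^2 + b^2).
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    rhs = x ** 2 + y ** 2
    a, b, c = np.linalg.lstsq(A, rhs, rcond=None)[0]
    return (a, b), np.sqrt(c + a ** 2 + b ** 2)
```

Three or more non-collinear boundary points suffice; with noisy edge points the linear least-squares solution averages out the detection jitter.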

Chinese Abstract
English Abstract
Table of Contents
List of Figures
List of Tables
Chapter 1: Introduction
  1-1 Research Background
  1-2 Motivation
  1-3 Thesis Organization
Chapter 2: Eye Trackers
  2-1 Overview and Algorithms
  2-2 Comparison of Eye-Tracking Systems
  2-3 Hardware
    2-3-1 Removing the IR-Cut Filter
    2-3-2 Building the IR Light Source
Chapter 3: Core Algorithms of the Eye-Tracking System
  3-1 Algorithm Overview
    3-1-1 Kernel Type
      1. Gaussian Kernel
      2. Polynomial Kernel
    3-1-2 Feature Type
      1. Histogram of Oriented Gradients
  3-2 Kernelized Correlation Filters
    3-2-1 Linear Regression
    3-2-2 Circulant Shifts and Circulant Matrices
    3-2-3 Kernel Trick
    3-2-4 Detection
    3-2-5 Kernel Correlation
  3-3 Pupil Size Detection
  3-4 Circle Fitting
  3-5 Program Flow
Chapter 4: System Description
  4-1 Calibration
  4-2 Screen Projection
    4-2-1 Affine Transform
    4-2-2 Quadratic Regression
  4-3 System Implementation and Interface Design
Chapter 5: Experimental Results
  5-1 Database
  5-2 Pupil Localization Analysis
    5-2-1 Ground-Truth Error
    5-2-2 K-means Integration of Radial Difference (KIRD)
    5-2-3 Analysis of Pupil-Localization Results
  5-3 Screen Projection
Chapter 6: Conclusion and Future Work
  6-1 Conclusion
  6-2 Future Work
Index
References
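Chapter 4 maps the tracked pupil position to screen coordinates with a transform fitted during calibration (Sections 4-1 and 4-2-1 cover calibration and the affine transform). A minimal sketch under the assumption of a plain least-squares affine fit, with hypothetical calibration data:

```python
import numpy as np

def fit_affine(pupil_pts, screen_pts):
    # Fit screen = [px, py, 1] @ A in the least-squares sense from
    # calibration correspondences (3 or more non-collinear points); A is 3x2.
    P = np.hstack([np.asarray(pupil_pts, dtype=float),
                   np.ones((len(pupil_pts), 1))])
    S = np.asarray(screen_pts, dtype=float)
    return np.linalg.lstsq(P, S, rcond=None)[0]

def map_gaze(A, pupil_pt):
    # Project one tracked pupil position to screen coordinates.
    return np.array([pupil_pt[0], pupil_pt[1], 1.0]) @ A
```

With more than three calibration targets the fit is over-determined, which dampens the effect of noisy pupil detections during calibration.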

