

Author: Guei-Long Cheng (鄭貴隆)
Title: The Application of Sensor-Based Switches in a Communication Aid for Disabled People (以感測器為基礎之開關於身障者之溝通輔具的應用)
Advisor: Mu-Chun Su (蘇木春)
Committee Members:
Degree: Master
Department: Department of Computer Science & Information Engineering, College of Electrical Engineering & Computer Science
Year of Publication: 2015
Graduation Academic Year: 103 (ROC calendar)
Language: Chinese
Number of Pages: 69
Keywords: ALS patients (漸凍人), eye-tracking system, assistive system, infrared sensor, color sensor
    Patients with amyotrophic lateral sclerosis (ALS) experience progressive degeneration of their motor abilities. Beyond the severe impact on their quality of life, their communication with the outside world is obstructed to varying degrees. This situation not only increases the financial burden on patients and their families; the work of long-term care also exhausts family members and medical staff. Assistive systems are therefore needed to raise patients' autonomy in daily life, so that they need not rely on others for everything.
    Several eye-tracking systems are already on the market, each with its own strengths and weaknesses. Fundamentally, these systems track the eyeball through images, estimate the position on the screen at which the eye is gazing, and then, via a coordinate mapping, move the mouse cursor and issue control commands. However, because such systems are operated by sustained gaze, staring at a screen for long periods increases eye fatigue and may even cause injury. Moreover, ALS patients must be turned over periodically to prevent pressure sores; after turning, the relative position between the eyes and the tracker may have changed, and the system must then be recalibrated. Such an unfriendly design discourages patients from adopting it. In addition, eye-tracking systems are usually priced well beyond what most people can afford.
    For the above reasons, and to accommodate differences in the progression of motor degeneration while exploiting the body parts ALS patients can still move (e.g., the mouth, eyes, and eyebrows), this thesis proposes a sensor-based assistive system. It uses three kinds of sensors, an infrared sensor, a color sensor, and a camera, as selection switches, combined with a communication system running on the Android platform; after a simple setup, a patient can operate the communication system. Through these sensors, the patient selects items from a rotational (scanning) menu by opening and closing the mouth, opening and closing the eyes, or raising the eyebrows, thereby expressing his or her needs. The effectiveness of each sensor switch proposed in this thesis has been verified through designed experiments.
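The rotational (scanning) menu described above can be illustrated with a short sketch: a highlight cycles through the items at a fixed interval, and a single switch activation (for example, a mouth-open event detected by the infrared sensor) selects the currently highlighted item. This is a minimal illustration under assumed names and timing values (`scan_menu`, `dwell`, the sample items), not the thesis's actual implementation.

```python
import time

def scan_menu(items, switch_pressed, dwell=1.0, timeout=30.0):
    """Cycle the highlight through items; return the item that is
    highlighted at the moment the switch fires, or None on timeout."""
    start = time.monotonic()
    index = 0
    while time.monotonic() - start < timeout:
        # In a real system this would update the on-screen highlight
        # instead of doing nothing; here the highlight is just `index`.
        if switch_pressed():
            return items[index]
        time.sleep(dwell)                 # dwell time per item
        index = (index + 1) % len(items)  # wrap around: "rotational" menu
    return None

if __name__ == "__main__":
    # Simulated single switch: fires on the third poll.
    polls = {"n": 0}
    def fake_switch():
        polls["n"] += 1
        return polls["n"] == 3
    choice = scan_menu(["water", "turn over", "TV", "call nurse"],
                       fake_switch, dwell=0.01)
    print(choice)  # the item highlighted when the switch fired
```

The dwell time is the key accessibility parameter: a longer dwell makes selection slower but more forgiving for patients with slow or unreliable muscle control.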


    Patients with amyotrophic lateral sclerosis (ALS) experience progressive motor degeneration. In addition to reduced quality of life, they have increasing difficulty communicating with others. This situation places a heavy financial burden on patients and their families, and long-term care also exhausts family members and medical personnel. An assistive system is needed to improve patients' self-care ability so that they do not have to depend on others for everything.
    Several eye-tracking systems are now on the market, each with its own advantages and disadvantages. Basically, these systems track the eyeball through images, estimate where on the screen the eyes are gazing, and then control the mouse and issue commands through the resulting coordinates. However, because eye-tracking systems are operated by sustained gaze, long sessions cause eye fatigue and may even cause damage. In addition, ALS patients must be turned over regularly to avoid pressure sores; after turning, the relative position between the eyes and the system may have changed and must be recalibrated, and patients may be unwilling to use such an unfriendly design. Moreover, eye-tracking systems are usually very expensive, beyond what average users can afford.
    In view of the reasons above, and to accommodate differences in the patients' remaining motor capability by exploiting the body parts ALS patients can still move (for example, the mouth, eyes, and eyebrows), this thesis proposes a sensor-based system that uses an infrared sensor, a color sensor, and a camera as selection switches. The system is paired with a communication application for the Android platform and, after a simple setup, lets patients operate it. Patients select items from a rotational (scanning) menu by opening and closing the mouth or eyes, or by moving the eyebrows, to express their needs. The effectiveness of every sensor switch in this thesis has been verified by experiments.
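Before a sensor can act as a selection switch, its raw analog reading (e.g., the infrared reflection value for mouth open vs. closed) must be converted into a clean on/off event. A common way to do this is thresholding with hysteresis, sketched below; the two-threshold scheme prevents rapid re-triggering when the reading hovers near a single threshold. The class name and threshold values here are illustrative assumptions, and a real system would calibrate them per patient.

```python
class HysteresisSwitch:
    """Turn a stream of raw sensor samples into discrete switch events
    using two thresholds (hysteresis) to suppress chattering."""

    def __init__(self, on_threshold=600, off_threshold=400):
        assert on_threshold > off_threshold
        self.on_threshold = on_threshold
        self.off_threshold = off_threshold
        self.active = False

    def update(self, reading):
        """Feed one sample; return True exactly once per activation,
        on the rising edge (e.g., the moment the mouth opens)."""
        if not self.active and reading >= self.on_threshold:
            self.active = True
            return True   # rising edge: one selection event
        if self.active and reading <= self.off_threshold:
            self.active = False  # must drop below off_threshold to re-arm
        return False
```

For example, feeding the samples 100, 650, 640, 300, 700 produces selection events only at 650 and 700: the reading at 640 stays above the off threshold, so it does not re-trigger.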

    Abstract (Chinese)
    Abstract (English)
    Acknowledgments
    Table of Contents
    List of Figures
    List of Tables
    Chapter 1: Introduction
        1.1 Research Motivation
        1.2 Research Objectives
        1.3 Thesis Organization
    Chapter 2: Related Work
        2.1 Eye-Tracking Systems
            2.1.1 Search Coil Method
            2.1.2 Infrared Video Systems
            2.1.3 Electrooculography
            2.1.4 Infrared Oculography
            2.1.5 Optical Pupil-Tracking Systems
            2.1.6 Purkinje Image Tracking
        2.2 Image-Based Systems
            2.2.1 Blink Selection
            2.2.2 Lip-Movement Selection
    Chapter 3: Sensors
        3.1 Infrared Sensor
        3.2 Color Sensor
        3.3 PIXY Camera Module
        3.4 Sensor Mounts
            3.4.1 Pan-Tilt Head
            3.4.2 Sensor Mounting Bracket
            3.4.3 Bracket Base
            3.4.4 Bracket Joints
    Chapter 4: Communication Aid Interface
        4.1 System Operation Flow
        4.2 Rotational (Scanning) Menu
        4.3 System Functions
            4.3.1 Chinese Speech Database
            4.3.2 Text Input
            4.3.3 Home Appliance Control
            4.3.4 Message Sending
    Chapter 5: Experimental Design and Results
        5.1 Infrared Sensor
        5.2 Color Sensor
        5.3 PIXY Camera Module
    Chapter 6: Conclusions and Future Work
        6.1 Conclusions
        6.2 Future Work
    References

