
Author: 陳奕昕 (Yi-Xin Chen)
Thesis title: 具微表情之機器人在情境學習下對餐旅教育的影響 (The Impacts of a Robot with Micro-Expressions under Situated Learning on Hospitality Education)
Advisor: 陳國棟 (Gwo-dong Chen)
Committee members:
Degree: Master
Department: Department of Computer Science & Information Engineering, College of Electrical Engineering and Computer Science
Year of publication: 2021
Academic year of graduation: 109 (ROC calendar)
Language: Chinese
Pages: 88
Keywords (Chinese): 情境學習、數位學習劇場、情緒互動、微表情、社交機器人、人機互動
Keywords (English): situated learning, digital learning theater, emotional interaction, micro-expression, social robot, human-robot interaction
    Situated learning has become a common approach in hospitality education. Hospitality training must combine theory and practice, so that students not only understand the theory but can also apply it by serving customers in a real workplace. Prior research indicates that a service worker's capacity for emotional interaction is often the key to successful service. For example, most dissatisfied customers neither complain nor voice their displeasure; they simply leave quietly. While masking their dissatisfaction, however, customers may leak micro-expressions, so learning to read a customer's micro-expressions and respond to them immediately is essential for service staff. Yet performing micro-expressions is very difficult for students who are not trained actors, which makes this skill hard to practice through situated learning. This study therefore proposes a robot with a micro-expression display system that connects to a virtual situated-learning platform, so that students can interact with the robot in virtual scenarios and learn both to observe micro-expressions and to respond appropriately in context. The effective sample consisted of 60 undergraduate hospitality majors. After the experiment, the group that learned with the micro-expression robot under situated learning showed significant improvement in both micro-expression recognition and situational-response ability, compared with the control group.
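The thesis does not publish its robot code, but the timing property the abstract relies on (a genuine emotion "leaking" through a masking expression for only a fraction of a second, which is why untrained student actors cannot perform it) can be sketched as a simple frame schedule. Everything below, including the function name and the default durations, is an illustrative assumption, not the authors' implementation:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    t_ms: int          # time since the triggering event, in milliseconds
    expression: str    # expression shown on the robot's face at that time

def micro_expression_timeline(mask: str, leak: str,
                              onset_ms: int = 100,
                              duration_ms: int = 300,
                              total_ms: int = 1000,
                              step_ms: int = 100) -> list[Frame]:
    """Build a frame-by-frame display schedule: hold the masking
    expression, flash the leaked emotion briefly, then return to
    the mask. Micro-expressions last well under half a second,
    so duration_ms stays small relative to total_ms."""
    frames = []
    for t in range(0, total_ms, step_ms):
        showing = leak if onset_ms <= t < onset_ms + duration_ms else mask
        frames.append(Frame(t, showing))
    return frames

# Example: a customer masks dissatisfaction with a polite smile,
# but anger leaks for 300 ms starting 100 ms after the trigger.
timeline = micro_expression_timeline("smile", "anger")
for f in timeline:
    print(f.t_ms, f.expression)
```

On real hardware the schedule would drive the face display instead of printing, but the point of the sketch is the constraint itself: the leaked expression occupies only a few frames, so a learner must watch continuously to catch it.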

    Table of Contents

    Abstract (Chinese); Abstract (English); Acknowledgements; Table of Contents; List of Figures; List of Tables
    1. Introduction
       1-1. Research background
       1-2. Research motivation
       1-3. Research objectives
    2. Related Work
       2-1. Applications of e-learning and image recognition in education
       2-2. Social robots and their applications in learning
       2-3. Emotional interaction and human-robot interaction models
       2-4. Digital learning theater
    3. System Design and Implementation
       3-1. Design rationale
       3-2. System architecture and design
            3-2-1. Overall system architecture
            3-2-2. Development environment
       3-3. Implementation
            3-3-1. Script-editing app
            3-3-2. The robot's micro-expressions
            3-3-3. Digital learning theater
            3-3-4. The robot's micro-expression emotional-interaction program
       3-4. Online and classroom teaching modes
    4. Experimental Design
       4-1. Hypotheses
       4-2. Participants
       4-3. Instructional design
            4-3-1. Teaching content
            4-3-2. Script design
       4-4. Procedure
       4-5. Research instruments
    5. Results and Discussion
       5-1. Test-score analysis
            5-1-1. Pretest scores
            5-1-2. Situational-response ability
            5-1-3. Discussion of learning outcomes
       5-2. Questionnaire analysis
            5-2-1. User perceptions of the micro-expression robot
            5-2-2. User perceptions of micro-expression learning
            5-2-3. The robot's effect on the situational-interaction experience
            5-2-4. Discussion of questionnaire results
    6. Conclusions and Future Work
       6-1. Conclusions
       6-2. Future work
    7. Limitations
    8. References
    Appendix 1. Teaching scripts
    Appendix 2. Pretest items
    Appendix 3. Posttest items
    Appendix 4. Questionnaire
    Appendix 5. Chinese-English glossary
    Appendix 6. English figures and tables

