| Graduate Student: | 廖健宏 Jian-Hong Liao |
|---|---|
| Thesis Title: | 基於學生程式編輯紀錄應用儀表板衡量程式編寫成效 Applying Dashboard Diagnostics for Measuring Students' Coding Performance Based on Coding Logs |
| Advisor: | 楊鎮華 Stephen J.H. Yang |
| Committee Members: | |
| Degree: | Master |
| Department: | 資訊工程學系 Department of Computer Science & Information Engineering, College of Electrical Engineering and Computer Science |
| Year of Publication: | 2019 |
| Academic Year of Graduation: | 107 |
| Language: | Chinese |
| Pages: | 43 |
| Keywords (Chinese): | Python、程式設計、儀表板、資料視覺化 |
| Keywords (English): | Python, Programming, Dashboard, Data visualization |
Data visualization is usually regarded as the last step of knowledge discovery. In the framework of learning analytics, using a dashboard to present data visualization results can help teachers interpret students' learning behavior in the classroom and decide on appropriate intervention opportunities. However, to apply a dashboard in a programming course, we must first overcome the problem of visualizing a large number of behavioral features, simplify the dashboard's content, and give teachers an effective way to interpret it. This study therefore provides a learning-behavior dashboard for a programming course and, through the interpreted results, suggests appropriate intervention opportunities to teachers. We validated our dashboard in a Python programming course with 171 first-year university students and summarized the types of difficulties students often encounter when writing programs.
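The abstract describes condensing a large number of behavioral features from students' coding logs before they can be shown on a dashboard. As a minimal sketch of that aggregation step (the thesis does not publish its log schema, so the event names and record fields below are illustrative assumptions, not the system's actual format):

```python
from collections import defaultdict

# Hypothetical coding-log records; field and event names are assumptions.
logs = [
    {"student": "s01", "event": "edit"},
    {"student": "s01", "event": "run"},
    {"student": "s01", "event": "error"},
    {"student": "s02", "event": "edit"},
    {"student": "s02", "event": "run"},
]

def summarize(logs):
    """Aggregate raw log events into per-student behavioral features."""
    counts = defaultdict(lambda: {"edit": 0, "run": 0, "error": 0})
    for rec in logs:
        counts[rec["student"]][rec["event"]] += 1
    features = {}
    for student, c in counts.items():
        runs = c["run"]
        features[student] = {
            "edits": c["edit"],
            "runs": runs,
            # Errors per run, a simple signal a dashboard might plot.
            "error_rate": c["error"] / runs if runs else 0.0,
        }
    return features

features = summarize(logs)
```

A dashboard would then chart `features` (for example, error rate per student over time) instead of the raw event stream, which is what keeps the visualization manageable for a class of 171 students.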