
Author: Yi-Hsiu Ko (柯懿修)
Thesis Title: A Comparative Study of the Deep Learning Oriented Decision Tree for Prediction on Incomplete Data (類深度決策樹對不完整資料預測之比較與研究)
Advisor: Chih-Fong Tsai (蔡志豐)
Degree: Master
Department: Department of Information Management, College of Management
Year of Publication: 2019
Graduation Academic Year: 107 (2018-19)
Language: Chinese
Pages: 103
Keywords: Data Mining, Data Pre-processing, Missing Values, Machine Learning, Deep Learning
Abstract (Chinese, translated):
    As networks have advanced, the volume of data being generated keeps growing, so using data effectively has become very important, and this has driven data mining techniques to mature. However, missing values are a persistent and hard-to-avoid problem in real data. Researchers have therefore conducted many studies on filling in missing values with statistical and machine learning methods, hoping to reduce the impact of missing values on prediction, but comparatively little work has addressed direct-handling approaches.
    This thesis therefore proposes a direct-handling decision-tree method based on the sliding-window and bounding-box concepts from deep learning: the Deep Learning Oriented Decision Tree. The dataset is partitioned according to different window sizes, multiple decision trees are built, and the final prediction is obtained by voting. The study comprises two experiments: Experiment 1 compares the Deep Learning Oriented Decision Tree against a single decision tree, and Experiment 2 compares it against other missing-value handling methods. Each experiment is further divided into sub-experiments (A) and (B), which examine whether the test set itself contains missing values. The results show that, on datasets with 19 or more dimensions, directly handling incomplete data with the Deep Learning Oriented Decision Tree yields the best classification accuracy. This contribution should help future researchers handle missing-value problems more appropriately and efficiently, and build better-performing prediction models.
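The statistical and machine-learning imputation methods that the abstract contrasts with direct handling can be illustrated with two common baselines: mean imputation and k-nearest-neighbor imputation. This is a minimal sketch using scikit-learn's imputers on a toy matrix, not the thesis's actual experimental setup:

```python
import numpy as np
from sklearn.impute import SimpleImputer, KNNImputer

# Toy data with two missing entries (np.nan marks a missing value).
X = np.array([[1.0, 2.0],
              [np.nan, 4.0],
              [3.0, np.nan],
              [5.0, 6.0]])

# Statistical baseline: replace each NaN with its column mean.
mean_filled = SimpleImputer(strategy="mean").fit_transform(X)

# Machine-learning baseline: replace each NaN with the mean of the
# k nearest neighbors, measured on the observed features.
knn_filled = KNNImputer(n_neighbors=2).fit_transform(X)
```

Column means here are 3.0 and 4.0, so mean imputation fills the two gaps with those values; the kNN result depends on the neighbor distances over the observed entries.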


Abstract (English):
    The advancement of networks has made the amount of data produced grow rapidly. Using these data effectively has become very important, which has driven data mining technology to become more sophisticated. However, missing values have always been difficult to avoid in collected data. Scholars have therefore applied statistical and machine learning methods in many studies on the imputation of missing values, hoping to reduce the impact of missing values on predictions, but few studies focus on the alternative of directly handling datasets with missing values.
    This thesis therefore proposes a novel approach based on the sliding-window and bounding-box concepts in deep learning, namely the "Deep Learning Oriented Decision Tree". In this approach, the dataset is divided into several subsets according to different window sizes, each subset is used to build a decision tree, resulting in a decision tree ensemble, and the final prediction is produced by majority voting. There are two experimental studies in this thesis: Study 1 compares the Deep Learning Oriented Decision Tree with a single decision tree, and Study 2 compares it with other missing-value imputation methods. Both studies also consider test data containing missing values. According to the experimental results, the proposed approach achieves the best classification accuracy on datasets with 19 or more dimensions. Such a contribution should help future researchers deal with missing-value problems more appropriately and efficiently, and produce better-performing prediction models.
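The windowing-and-voting mechanism described above can be sketched in a few lines. This is an illustrative reconstruction, not the thesis's actual implementation: the class name `DLDTEnsemble`, the `window`/`stride` parameters, and the use of scikit-learn trees are all assumptions. Each window of consecutive features trains one tree, and prediction is a per-sample majority vote over all trees:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

class DLDTEnsemble:
    """Sliding-window decision-tree ensemble (illustrative sketch)."""

    def __init__(self, window=3, stride=1):
        self.window = window
        self.stride = stride
        self.models = []          # list of (feature_indices, fitted_tree)

    def fit(self, X, y):
        n_features = X.shape[1]
        self.models = []
        # Slide a window of `window` features across the feature axis.
        for start in range(0, n_features - self.window + 1, self.stride):
            idx = np.arange(start, start + self.window)
            tree = DecisionTreeClassifier(random_state=0)
            tree.fit(X[:, idx], y)   # each tree sees only its window
            self.models.append((idx, tree))
        return self

    def predict(self, X):
        # One predicted label per tree, then a per-sample majority vote.
        votes = np.stack([tree.predict(X[:, idx])
                          for idx, tree in self.models])
        return np.array([np.bincount(col.astype(int)).argmax()
                         for col in votes.T])

# Tiny synthetic demo: 6 features, window 3, stride 1 -> 4 trees.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 6))
y = (X[:, 0] + X[:, 3] > 0).astype(int)
model = DLDTEnsemble(window=3, stride=1).fit(X, y)
```

For the direct-handling case the thesis targets, the base learner would need to accept incomplete records; recent scikit-learn versions, for instance, let `DecisionTreeClassifier` split on data containing NaN, which would slot into the same structure.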

Contents:
    Abstract (Chinese)
    Abstract (English)
    Table of Contents
    List of Figures
    List of Tables
    1. Introduction
        1-1 Research Background
        1-2 Research Motivation
        1-3 Research Objectives
        1-4 Thesis Organization
    2. Literature Review
        2-1 Introduction to Missing Values
            2-1-1 Missing Completely at Random
            2-1-2 Missing at Random
            2-1-3 Missing Not at Random
        2-2 Missing Value Handling
            2-2-1 Deletion Methods
            2-2-2 Direct Handling Methods
            2-2-3 Imputation Methods
        2-3 Introduction to Deep Learning
    3. Research Methodology
        3-1 Experimental Framework
            3-1-1 Experimental Setup
        3-2 Experiment 1(A)
            3-2-1 Deep Learning Oriented Decision Tree
            3-2-2 Sliding Stride of the Deep Learning Oriented Decision Tree
        3-3 Experiment 1(B)
        3-4 Experiment 2(A)
        3-5 Experiment 2(B)
    4. Results
        4-1 Classification Accuracy
        4-2 Experimental Results
            4-2-1 Experiment 1(A) Results
            4-2-2 Experiment 1(B) Results
            4-2-3 Experiment 2(A) Results
            4-2-4 Experiment 2(B) Results
        4-3 Summary of Experiments
    5. Conclusion and Future Work
        5-1 Conclusion and Contributions
        5-2 Future Work
    References
    Appendix 1
