| Graduate Student: | 郭彥鋒 Yan-Fong Kuo |
|---|---|
| Thesis Title: | 不同結構學習演算法之類神經分類器之比較 (Comparisons of Neural Network Classifiers Based on Learning Algorithms with Different Structures) |
| Advisor: | 鍾鴻源 Hung-Yuan Chung |
| Oral Examination Committee: | |
| Degree: | Master |
| Department: | 資訊電機學院 - Department of Electrical Engineering |
| Academic Year of Graduation: | 98 (ROC calendar; 2009) |
| Language: | Chinese |
| Pages: | 65 |
| Chinese Keywords: | 倒傳遞類神經網路、動態模糊類神經網路、刪減技巧 |
| Keywords: | back propagation neural network, dynamic fuzzy neural network, pruning technique |
This thesis investigates and evaluates two neural network classifiers, a static neural network and a dynamic neural network, which differ fundamentally in both structure and learning algorithm. The static neural network has a fixed structure: its number of neurons is set manually by rules of thumb. The dynamic fuzzy neural network, by contrast, adjusts its structure dynamically, with the number of neurons derived from a series of learning rules. For the back propagation network, the Levenberg-Marquardt method is adopted in the learning algorithm to improve convergence speed. The discussion of the dynamic fuzzy neural network is divided into two parts, structure learning and parameter learning; a pruning technique is applied to the structure to make it more compact and easier to implement. Finally, in the experiments, classification is performed on data sets from the UCI repository to evaluate the accuracy of both classifiers.
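The Levenberg-Marquardt update mentioned above can be sketched as follows. This is a minimal illustration on a hypothetical toy curve-fitting problem (fitting y = a·exp(b·x)), not the thesis's actual network training code; the function name, the damping schedule, and the stopping budget are all assumptions made for the example. The same damped normal-equation step, Δw = (JᵀJ + μI)⁻¹ Jᵀe, is what speeds up least-squares training of network weights:

```python
import math

# Hypothetical toy problem: fit y = a*exp(b*x) by Levenberg-Marquardt.
# The damped step (J^T J + mu*I) d = J^T e blends Gauss-Newton (small mu)
# with gradient descent (large mu), which is the source of the fast,
# stable convergence the thesis exploits.
def levenberg_marquardt(xs, ys, a, b, mu=1e-3, iters=200):
    for _ in range(iters):
        # Residuals e_i = y_i - f(x_i) and Jacobian rows [df/da, df/db]
        e = [y - a * math.exp(b * x) for x, y in zip(xs, ys)]
        J = [[math.exp(b * x), a * x * math.exp(b * x)] for x in xs]
        n = len(xs)
        # Normal-equation terms J^T J (2x2) and J^T e (2x1)
        JtJ = [[sum(J[i][r] * J[i][c] for i in range(n)) for c in range(2)]
               for r in range(2)]
        Jte = [sum(J[i][r] * e[i] for i in range(n)) for r in range(2)]
        # Damped system (JtJ + mu*I) d = Jte, solved by Cramer's rule
        A = [[JtJ[0][0] + mu, JtJ[0][1]],
             [JtJ[1][0], JtJ[1][1] + mu]]
        det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
        da = (Jte[0] * A[1][1] - A[0][1] * Jte[1]) / det
        db = (A[0][0] * Jte[1] - Jte[0] * A[1][0]) / det
        new_a, new_b = a + da, b + db
        sse = sum(ei ** 2 for ei in e)
        new_sse = sum((y - new_a * math.exp(new_b * x)) ** 2
                      for x, y in zip(xs, ys))
        if new_sse < sse:        # accept the step, move toward Gauss-Newton
            a, b, mu = new_a, new_b, mu / 10
        else:                    # reject the step, move toward gradient descent
            mu *= 10
    return a, b

xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [2.0 * math.exp(0.7 * x) for x in xs]   # noise-free data with a=2, b=0.7
a, b = levenberg_marquardt(xs, ys, a=1.0, b=0.0)
print(round(a, 3), round(b, 3))
```

The accept/reject rule on the damping factor μ is what lets the method fall back to small gradient-descent-like steps far from the optimum and switch to fast Gauss-Newton steps near it.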
This thesis aims to investigate and evaluate neural network classifiers, focusing on the back propagation neural network and the dynamic fuzzy neural network. We further analyze and improve both classifiers to ensure high classification accuracy. For the back propagation neural network, we focus on the learning algorithm and adopt the Levenberg-Marquardt method to improve performance. The discussion of the dynamic fuzzy neural network is divided into two parts: structure learning and parameter learning. Optimal parameter learning is the main work of this study, and a pruning technique is applied to the dynamic fuzzy neural network, simplifying its structure and facilitating its implementation. Finally, from the experimental results, classification is performed on data sets from the UCI repository to evaluate the accuracy of both the back propagation neural network and dynamic fuzzy neural network classifiers.
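The pruning step can be illustrated with a generic magnitude-based rule; this is only a sketch of the general idea, since the thesis's actual criterion prunes elements of the dynamic fuzzy network (e.g., fuzzy rules) rather than the hypothetical raw weight matrix shown here:

```python
# Generic magnitude-based pruning sketch (an assumption for illustration;
# the thesis's exact pruning criterion for the dynamic fuzzy network differs).
def prune_weights(weights, threshold):
    """Zero out connections whose magnitude falls below the threshold,
    shrinking the effective network structure."""
    return [[w if abs(w) >= threshold else 0.0 for w in row]
            for row in weights]

# Hypothetical 2x3 weight matrix: small-magnitude connections are removed.
W = [[0.8, -0.02, 0.5],
     [0.01, -0.9, 0.03]]
print(prune_weights(W, 0.1))
# → [[0.8, 0.0, 0.5], [0.0, -0.9, 0.0]]
```

Removing low-contribution connections or rules in this way is what yields the more compact, easier-to-implement structure described in the abstract.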