| Author: | 陳忠謙 (Chung-Chien Chen) |
|---|---|
| Thesis title: | A study of some numerical solution algorithms for least-squares problems |
| Advisor: | 黃楓南 (Feng-Nan Hwang) |
| Degree: | Master |
| Department: | Department of Mathematics, College of Science |
| Year of publication: | 2021 |
| Graduating academic year: | 109 |
| Language: | Chinese |
| Number of pages: | 42 |
| Keywords (Chinese): | Newton's method, classification problems, parameter identification, steepest descent method |
| Keywords (English): | least-squares problem, Newton, Gauss-Newton, steepest descent, NGMRES |
Data fitting is a part of scientific computing that helps us determine what properties the investigated system has. These systems may be engineering systems, or the conceptual systems used increasingly often in machine learning. Least squares is a classic and widely known technique for fitting a model to data.
In this thesis, we study several algorithms for solving least-squares problems: the steepest descent method, limited-memory Broyden–Fletcher–Goldfarb–Shanno (LBFGS), the Gauss–Newton method (GN), the Levenberg–Marquardt method (LM), and the nonlinear generalized minimal residual method (N-GMRES) with steepest descent preconditioning. For binary classification in machine learning, we compare two models, a linear and a nonlinear least-squares classifier, by their training accuracy.
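As a minimal illustration of one of the methods named above (this is a sketch assuming NumPy, not the thesis's own code), a Gauss–Newton iteration repeatedly linearizes the residual and solves the resulting linear least-squares subproblem; here it fits a hypothetical exponential model y = a·exp(b·t) to noise-free data:

```python
import numpy as np

def gauss_newton(residual, jacobian, x0, iters=50):
    """Gauss-Newton: at each step solve the linearized least-squares
    subproblem min ||J(x) p + r(x)||_2 for the update p.
    A fixed iteration count is used here for simplicity."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        r = residual(x)
        J = jacobian(x)
        p, *_ = np.linalg.lstsq(J, -r, rcond=None)  # GN step
        x = x + p
    return x

# Toy data-fitting problem: y = a * exp(b * t), true (a, b) = (2, 0.5)
t = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(0.5 * t)

def residual(x):
    a, b = x
    return a * np.exp(b * t) - y

def jacobian(x):
    a, b = x
    e = np.exp(b * t)
    # Columns: d r_i / d a and d r_i / d b
    return np.column_stack([e, a * t * e])

x = gauss_newton(residual, jacobian, x0=[1.0, 0.0])
```

Because this toy problem has zero residual at the solution, the iteration converges rapidly near the minimizer; in practice (and in the thesis's Levenberg–Marquardt variant) a damping term or line search guards the step.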
Based on the numerical results obtained from solving test problems, identifying the parameters of a spring-mass system, and performing binary classification on the Iris flower and MNIST data sets, we give a concrete discussion of how the least-squares problems are solved. The nonlinear least-squares classifier model achieves the better performance on the classification problems.
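The linear least-squares classifier mentioned above can be sketched as follows (a hypothetical example assuming NumPy and synthetic two-cluster data, not the Iris or MNIST sets used in the thesis): fit weights w minimizing ||Aw − y||₂ with labels y ∈ {−1, +1}, then classify by the sign of the linear model:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic binary classification data: two well-separated Gaussian blobs
X_pos = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(50, 2))
X_neg = rng.normal(loc=[-2.0, -2.0], scale=0.5, size=(50, 2))
X = np.vstack([X_pos, X_neg])
y = np.concatenate([np.ones(50), -np.ones(50)])  # labels in {-1, +1}

# Append a constant column for the bias term, then solve min ||A w - y||_2
A = np.column_stack([X, np.ones(len(X))])
w, *_ = np.linalg.lstsq(A, y, rcond=None)

pred = np.sign(A @ w)          # classify by the sign of the linear model
accuracy = np.mean(pred == y)  # training accuracy, as compared in the thesis
```

The nonlinear variant studied in the thesis replaces the linear model Aw with a nonlinear function of the parameters, turning the fit into a nonlinear least-squares problem solved by the iterative methods listed above.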