
Student: Tz-Yin Shiau (蕭子胤)
Thesis title: A Study on Distributed Consensus Support Vector Machine (分散共識支持向量機之研究)
Advisor: Suh-Yuh Yang (楊肅煜)
Oral defense committee:
Degree: Master
Department: Department of Mathematics, College of Science
Year of publication: 2024
Graduation academic year: 112 (ROC calendar)
Language: English
Number of pages: 46
Chinese keywords: 支持向量機、核方法、分散共識問題、交錯方向乘子法、隱私保護
English keywords: support vector machine, kernel method, distributed consensus problem, alternating direction method of multipliers, privacy preserving

    The support vector machine (SVM) is an effective binary classifier with excellent
    performance across various applications. However, computing limitations and data
    privacy concerns can hinder SVM’s performance when handling large datasets or
    data from distributed sources. In this thesis, we study the distributed consensus
    SVM, which offers two primary advantages: it maintains data privacy by allowing
    each worker to derive a more generalized hyperplane without data sharing, and it
    enables the decomposition of large-scale problems into manageable sub-problems
    for enhanced processing speed through distributed computing. Nonetheless, the
    distributed consensus SVM faces challenges due to the non-differentiability of its
    objective function. We adopt the smoothing SVM approach to address this issue,
    incorporating a smoothing function that renders the objective differentiable. This
    leads to the so-called distributed consensus smoothing SVM, which uses the 1-norm
    for its penalty term, balancing efficiency and accuracy. Finally, we validate the
    performance of this algorithm through several numerical experiments.
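    The two ingredients the abstract describes — a smoothing function that makes the
    hinge loss differentiable, and consensus ADMM updates in which workers exchange
    only weight vectors, never data — can be illustrated with a minimal sketch. This
    is not the thesis's code: the function names, step sizes, and toy data are
    illustrative assumptions, the Chen–Mangasarian smoothing of the plus function is
    used for the 1-norm soft margin loss, and the local w-update is approximated by
    a few gradient steps.

    ```python
    import numpy as np

    # Hypothetical sketch (not the thesis implementation): a distributed
    # consensus smooth SVM trained with consensus ADMM. Each "worker" holds
    # a private data shard and shares only its local weight vector.

    def smooth_plus(s, a=5.0):
        """Chen-Mangasarian smoothing of max(s, 0): s + (1/a)*log(1 + e^(-a*s))."""
        return s + np.log1p(np.exp(-a * s)) / a

    def smooth_plus_grad(s, a=5.0):
        """Derivative of the smoothed plus function: the sigmoid of a*s."""
        return 1.0 / (1.0 + np.exp(-a * s))

    def local_update(X, y, w, z, u, rho, C=1.0, a=5.0, lr=0.05, steps=20):
        """Approximate ADMM w-update: gradient descent on the worker's
        smoothed 1-norm hinge loss plus the quadratic consensus penalty."""
        for _ in range(steps):
            margin = 1.0 - y * (X @ w)          # input to the smoothed hinge
            grad = -C * X.T @ (y * smooth_plus_grad(margin, a)) \
                   + rho * (w - z + u)
            w = w - lr * grad
        return w

    def consensus_svm(shards, dim, rho=1.0, rounds=30):
        """Consensus ADMM: local w-updates, averaging for z, dual ascent for u."""
        n = len(shards)
        ws = [np.zeros(dim) for _ in range(n)]
        us = [np.zeros(dim) for _ in range(n)]
        z = np.zeros(dim)
        for _ in range(rounds):
            ws = [local_update(X, y, w, z, u, rho)
                  for (X, y), w, u in zip(shards, ws, us)]
            z = np.mean([w + u for w, u in zip(ws, us)], axis=0)  # consensus
            us = [u + w - z for w, u in zip(ws, us)]              # dual update
        return z

    # Toy demo: two Gaussian clusters split across three workers.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(2.0, 0.5, (30, 2)), rng.normal(-2.0, 0.5, (30, 2))])
    y = np.hstack([np.ones(30), -np.ones(30)])
    idx = rng.permutation(60)
    shards = [(X[idx[i::3]], y[idx[i::3]]) for i in range(3)]
    z = consensus_svm(shards, dim=2)
    accuracy = np.mean(np.sign(X @ z) == y)
    ```

    For readability the sketch omits the bias term and the explicit regularization
    term a full formulation would carry; the quadratic ADMM penalty alone couples
    the workers to the consensus variable z.
    
    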

    1 Introduction
    2 Support Vector Machine
      2.1 Hard Margin Linear SVM
      2.2 Soft Margin Linear SVM
      2.3 Smoothing SVM
      2.4 Kernel Method
      2.5 Kernel SVM
    3 Distributed Consensus Optimization Problem
      3.1 Alternating Direction Method of Multipliers
      3.2 Distributed Consensus Optimization Problem
    4 Distributed Consensus Support Vector Machine
      4.1 Distributed Consensus Linear SVM
      4.2 Distributed Consensus Kernel SVM
      4.3 Distributed Consensus Reduced Kernel SVM
    5 Numerical Experiments
      5.1 Distributed Consensus Linear SVM
      5.2 Distributed Consensus Kernel SVM
      5.3 Distributed Consensus Reduced Kernel SVM
    6 Conclusion
    References

