
Graduate Student: Ku-Te Lin (林固德)
Thesis Title: A TV-L2 Variational Model for Optical Flow Estimation under Inhomogeneous Illumination (針對非均勻光照的 TV-L2 變分光流估計模型)
Advisor: Suh-Yuh Yang (楊肅煜)
Committee Members:
Degree: Master
Department: Department of Mathematics, College of Science
Year of Publication: 2025
Academic Year of Graduation: 113
Language: English
Number of Pages: 38
Chinese Keywords: 光流, 總變差, 非均勻光照, 亮度恆定假設
Foreign Keywords: optical flow, total variation, inhomogeneous illumination, brightness constancy assumption
Hits: 52; Downloads: 0

Abstract: Optical flow estimation is a fundamental problem in computer vision, aiming to quantify the motion of objects between consecutive frames in an image sequence. Optical flow is typically represented as a vector field, where each vector captures the displacement of a pixel from one frame to the next. Traditional approaches often rely on the brightness constancy assumption, which posits that the intensity of a pixel remains unchanged over time as it moves. However, this assumption frequently fails under inhomogeneous illumination conditions, where lighting variations can significantly degrade estimation accuracy. In this thesis, we present a novel TV-L2 variational model designed to address the challenges of optical flow estimation under inhomogeneous illumination. Our model integrates total variation (TV) regularization to preserve motion boundaries and an L2 data fidelity term to handle illumination variations. We propose an iterative split Bregman scheme to effectively solve the associated optimization problem, ensuring robustness and computational efficiency. Through extensive numerical experiments, we demonstrate the effectiveness of our proposed model, highlighting its accuracy in handling complex illumination scenarios compared to traditional methods.
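As context for the abstract, the generic template of a TV-regularized optical flow energy with a linearized brightness-constancy data term can be written as follows. This is only the standard form of such models; the symbols (λ for the regularization weight, Ω for the image domain) are notational assumptions, and the thesis's actual functional, which additionally accounts for illumination changes, is defined in Chapter 3.

```latex
% Brightness constancy: a pixel keeps its intensity as it moves,
%   I(x + u, y + v, t + 1) = I(x, y, t),
% which linearizes (first-order Taylor expansion) to
%   I_x u + I_y v + I_t = 0.
% Generic TV-regularized energy over the image domain \Omega, where
% \lambda > 0 balances data fidelity against smoothness of the flow
% while total variation preserves motion boundaries:
\begin{equation*}
E(u, v) = \int_\Omega \left( I_x u + I_y v + I_t \right)^2 d\mathbf{x}
        + \lambda \int_\Omega \left( |\nabla u| + |\nabla v| \right) d\mathbf{x}.
\end{equation*}
```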

Table of Contents:
1 Introduction
2 Optical flow methods
2.1 Brightness constancy assumption
2.2 The Lucas-Kanade method and the Horn-Schunck method
2.3 Relaxation of the brightness constancy assumption
2.4 The Gennert-Negahdaripour method
3 The proposed optical flow method
3.1 Total variation regularization
3.2 TV-L2 optical flow method
3.3 Implementation details
4 Numerical experiments
4.1 Discussion of the choice of parameters
4.2 Comparison with different mask types
4.3 Results for other sequences
5 Conclusion
References
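Sections 2.1 and 2.2 of the outline cover the brightness constancy assumption and the classic Horn-Schunck method. As background only, here is a minimal NumPy sketch of the standard Horn-Schunck iteration; it is not the thesis's proposed TV-L2 method, and the averaging kernel, test grid, and parameter values are illustrative assumptions.

```python
import numpy as np

def _neighbor_avg(f):
    """Weighted 8-neighbor average used in the Horn-Schunck update
    (1/6 on edge neighbors, 1/12 on corner neighbors; weights sum to 1)."""
    p = np.pad(f, 1, mode="edge")
    return ((p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]) / 6.0
            + (p[:-2, :-2] + p[:-2, 2:] + p[2:, :-2] + p[2:, 2:]) / 12.0)

def horn_schunck(I1, I2, alpha=0.5, n_iters=100):
    """Classic Horn-Schunck optical flow between two grayscale frames.

    Minimizes the squared linearized brightness-constancy residual
    (Ix*u + Iy*v + It)^2 plus alpha^2 times a quadratic smoothness
    penalty on the flow field (u, v), via fixed-point iteration.
    """
    I1 = np.asarray(I1, dtype=np.float64)
    I2 = np.asarray(I2, dtype=np.float64)
    Iy, Ix = np.gradient(I1)          # spatial derivatives (rows, cols)
    It = I2 - I1                      # temporal derivative
    u = np.zeros_like(I1)
    v = np.zeros_like(I1)
    denom = alpha**2 + Ix**2 + Iy**2
    for _ in range(n_iters):
        u_bar, v_bar = _neighbor_avg(u), _neighbor_avg(v)
        resid = (Ix * u_bar + Iy * v_bar + It) / denom
        u = u_bar - Ix * resid
        v = v_bar - Iy * resid
    return u, v

# A linear intensity ramp shifted one pixel to the right:
# the recovered flow should be u ≈ 1, v ≈ 0 everywhere.
X = np.tile(np.arange(32, dtype=np.float64), (32, 1))
u, v = horn_schunck(X, X - 1.0)
print(round(float(u.mean()), 3), round(float(np.abs(v).max()), 3))  # prints: 1.0 0.0
```

The quadratic smoothness penalty is what the thesis's TV regularization replaces: the L2 penalty blurs motion boundaries, whereas total variation preserves them.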

References:
[1] S. Baker, D. Scharstein, J. P. Lewis, S. Roth, M. J. Black, and R. Szeliski, A database and evaluation methodology for optical flow, International Journal of Computer Vision, 92 (2011), pp. 1-31.
[2] D. Sun, S. Roth, and M. J. Black, Secrets of optical flow estimation and their principles, In: 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, (2010), pp. 2432-2439.
[3] N. Sharmin and R. Brad, Optimal filter estimation for Lucas-Kanade optical flow, Sensors, 12 (2012), pp. 12694-12709.
[4] E. Meinhardt-Llopis and J. Sánchez, Horn-Schunck optical flow with a multi-scale strategy, Image Processing On Line, 3 (2013), pp. 151-172.
[5] T. Brox, A. Bruhn, N. Papenberg, and J. Weickert, High accuracy optical flow estimation based on a theory for warping, In: Computer Vision - ECCV 2004: 8th European Conference on Computer Vision, Prague, Czech Republic, May 11-14, 2004, Proceedings, Part IV, (2004), pp. 25-36.
[6] D. Fleet and Y. Weiss, Optical flow estimation, In: Handbook of Mathematical Models in Computer Vision, (2006), pp. 237-257.
[7] B. D. Lucas and T. Kanade, An iterative image registration technique with an application to stereo vision, In: IJCAI'81: 7th International Joint Conference on Artificial Intelligence, 2 (1981), pp. 674-679.
[8] A. Bruhn, J. Weickert, and C. Schnörr, Lucas/Kanade meets Horn/Schunck: Combining local and global optic flow methods, International Journal of Computer Vision, 61 (2005), pp. 211-231.
[9] B. K. P. Horn and B. G. Schunck, Determining optical flow, Artificial Intelligence, 17 (1981), pp. 185-203.
[10] H. W. Haussecker and D. J. Fleet, Computing optical flow with physical models of brightness variation, IEEE Transactions on Pattern Analysis and Machine Intelligence, 23 (2001), pp. 661-673.
[11] A. Wedel, T. Pock, C. Zach, H. Bischof, and D. Cremers, An improved algorithm for TV-L1 optical flow, In: Statistical and Geometrical Approaches to Visual Motion Analysis: International Dagstuhl Seminar, Dagstuhl Castle, Germany, July 13-18, 2008, Revised Papers, (2009), pp. 23-45.
[12] C. H. Teng, S. H. Lai, Y. S. Chen, and W. H. Hsu, Accurate optical flow computation under non-uniform brightness variations, Computer Vision and Image Understanding, 97 (2005), pp. 315-346.
[13] S. Cai, Y. Huang, B. Ye, and C. Xu, Dynamic illumination optical flow computing for sensing multiple mobile robots from a drone, IEEE Transactions on Systems, Man, and Cybernetics: Systems, 48 (2017), pp. 1370-1382.
[14] S. Ali, C. Daul, E. Galbrun, and W. Blondel, Illumination invariant optical flow using neighborhood descriptors, Computer Vision and Image Understanding, 145 (2016), pp. 95-110.
[15] J. S. Pérez, N. M. López, and A. S. de la Nuez, Robust optical flow estimation, Image Processing On Line, 3 (2013), pp. 252-270.
[16] J. Sánchez, N. M. Monzón López, and A. J. Salgado de la Nuez, Parallel implementation of a robust optical flow technique, CTIM Technical Report, 1 (2012).
[17] D. Sun, X. Yang, M. Y. Liu, and J. Kautz, PWC-Net: CNNs for optical flow using pyramid, warping, and cost volume, In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, (2018), pp. 8934-8943.
[18] L. I. Rudin, S. Osher, and E. Fatemi, Nonlinear total variation based noise removal algorithms, Physica D: Nonlinear Phenomena, 60 (1992), pp. 259-268.
[19] A. Chambolle, V. Caselles, D. Cremers, M. Novaga, and T. Pock, An introduction to total variation for image analysis, hal-00437581, 9 (2010).
[20] T. Goldstein and S. Osher, The split Bregman method for L1-regularized problems, SIAM Journal on Imaging Sciences, 2 (2009), pp. 323-343.
[21] P. Getreuer, Rudin-Osher-Fatemi total variation denoising using split Bregman, Image Processing On Line, 2 (2012), pp. 74-95.
[22] M. A. Gennert and S. Negahdaripour, Relaxing the brightness constancy assumption in computing optical flow, A.I. Memo No. 975, (1987).
