| Graduate Student: | 王士瑋 Shih-Wei Wang |
|---|---|
| Thesis Title: | 使用機械手臂輔助椎莖螺釘植入之手術導引系統 (A Robot-Assisted Surgical Navigation System for Pedicle Screw Insertion) |
| Advisor: | 曾清秀 Ching-Shiow Tseng |
| Oral Defense Committee: | |
| Degree: | Master |
| Department: | College of Health Science and Technology - Graduate Institute of Biomedical Engineering |
| Year of Publication: | 2016 |
| Graduation Academic Year: | 105 |
| Language: | Chinese |
| Number of Pages: | 68 |
| Keywords (Chinese): | C-arm image, surgical navigation, spine surgery, efficient perspective-n-point camera pose estimation (EPnP), robot arm |
| Keywords (English): | C-arm image, surgical navigation, spine surgery, EPnP, robot |
Pedicle screw insertion is a high-risk, technically demanding procedure. During surgery, the surgeon must repeatedly take C-arm X-ray images to confirm that the instrument trajectory will not injure the patient's central nervous system, but the large number of X-ray exposures also puts the patient and the operating-room staff at risk of high radiation doses. A C-arm image-assisted navigation system for spine surgery can help solve the drill-positioning problem. By combining it with the high accuracy and stability of a robot arm, this work develops a robot-assisted navigation system for spine surgery that provides both rapid positioning and a supported drill guide, which can reduce the number of intraoperative C-arm exposures, improve surgical safety, and speed up the surgical workflow.
This study builds on our laboratory's existing C-arm image-assisted navigation system for pedicle screw insertion. The efficient perspective-n-point camera pose estimation (EPnP) method replaces the positioning function of the optical tracker: by recognizing the feature points of the X-board in the images and computing their image centers, the coordinate transformations between the X-board and the anteroposterior (AP) and lateral (LA) C-arm images are obtained. The entry and end points of the pedicle drill path are planned on the AP and LA images, the spatial direction of the drill path is computed with a dual-view (bi-plane) localization technique, and the path is transformed into the robot arm's coordinate frame. The robot then aligns its drill guide with the planned path, and the surgeon drills the pedicle along the guide.
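The dual-view (bi-plane) localization step above recovers a 3D point from its picks in the AP and LA images once each view's projection is known. A minimal sketch using linear (DLT) triangulation in NumPy; the projection matrices and point below are hypothetical placeholders, not values from the thesis:

```python
import numpy as np

def triangulate(P_ap, P_la, uv_ap, uv_la):
    """Linear (DLT) triangulation of one 3D point from two C-arm views.

    P_ap, P_la : 3x4 projection matrices of the AP and LA views
                 (obtainable once each view's pose relative to the
                 X-board frame is known, e.g. from EPnP).
    uv_ap, uv_la : (u, v) pixel picks of the same anatomical point.
    """
    # Each view contributes two linear constraints on the homogeneous point.
    A = np.vstack([
        uv_ap[0] * P_ap[2] - P_ap[0],
        uv_ap[1] * P_ap[2] - P_ap[1],
        uv_la[0] * P_la[2] - P_la[0],
        uv_la[1] * P_la[2] - P_la[1],
    ])
    # Homogeneous least-squares solution: smallest right singular vector of A.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Illustrative check with synthetic geometry (hypothetical values):
K = np.array([[1000.0, 0, 256], [0, 1000.0, 256], [0, 0, 1]])  # intrinsics
P_ap = K @ np.hstack([np.eye(3), [[0.0], [0.0], [500.0]]])      # AP view
R_la = np.array([[0.0, 0, 1], [0, 1, 0], [-1, 0, 0]])           # 90-degree LA view
P_la = K @ np.hstack([R_la, [[0.0], [0.0], [500.0]]])
X_true = np.array([10.0, -5.0, 30.0])

def project(P, X):
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

X_est = triangulate(P_ap, P_la, project(P_ap, X_true), project(P_la, X_true))
print(np.round(X_est, 3))  # recovers the synthetic point [10., -5., 30.]
```

Applying this to the entry and end points picked on the AP and LA images yields the two 3D points whose connecting line is the planned drill path.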
In the experiments, the tips of two optical guide instruments served as the true entry and end points of the guided path, in order to verify the positioning error of the navigation system. Nine trials showed a position error of 0.92 ± 0.31 mm at the entry point, 1.16 ± 0.29 mm at the end point, and a direction error of 1.25 ± 0.19°.
Pedicle screw insertion is a high-risk operation. During the operation, the surgeon has to take many X-ray images to check whether the pedicle screw is on the correct path. An image-assisted surgical navigation system for spine surgery can provide positioning assistance for drilling, while a robot provides high positioning precision and stable support for pedicle screw insertion. The navigation system needs only two images to locate the target, which reduces the number of X-ray exposures and shortens the operation time.
The robotic navigation system developed in this study is based on the C-arm image-assisted surgical navigation system for spine surgery developed in our laboratory. The efficient perspective-n-point camera pose estimation (EPnP) method is applied in place of the optical tracker to estimate the 3D pose of the X-board from its feature points. The transformation matrices among the AP/LA image frames and the X-board frame are determined by the EPnP method and trigonometry. The drill path, defined by the entry and end points selected on both the AP and LA images, is then determined by the bi-plane method. The direction of the drill path is transformed into the robot frame, and the robot automatically moves the drill guide to be coaxial with the planned drill path. The surgeon can then safely drill the pedicle along the drill guide.
In the experiments, two positioning tools of the optical tracker were used to pinpoint the entry and end points of the drill path, so that the line connecting their tips represents the true direction of the drill path. Nine trials show that the positioning errors of the robot at the entry and end points are 0.92 ± 0.31 mm and 1.16 ± 0.29 mm respectively, and the direction error is 1.25 ± 0.19°.
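The reported accuracy figures are the mean ± standard deviation of per-trial point distances, plus the angle between the planned and measured path directions. A sketch of that computation; the demo data are synthetic, not the thesis measurements:

```python
import numpy as np

def path_errors(planned_entry, planned_end, measured_entry, measured_end):
    """Per-trial position errors (mm) and direction errors (degrees).

    All arrays are (N, 3): N trials of 3D points in a common frame.
    """
    e_entry = np.linalg.norm(measured_entry - planned_entry, axis=1)
    e_end = np.linalg.norm(measured_end - planned_end, axis=1)
    d_plan = planned_end - planned_entry
    d_meas = measured_end - measured_entry
    cosang = np.sum(d_plan * d_meas, axis=1) / (
        np.linalg.norm(d_plan, axis=1) * np.linalg.norm(d_meas, axis=1))
    ang = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
    return e_entry, e_end, ang

# Synthetic demo for nine trials (fabricated numbers, not the thesis data):
rng = np.random.default_rng(0)
pe = rng.uniform(-20, 20, (9, 3))            # planned entry points
pd_ = pe + np.array([0, 0, 40.0])            # planned end points, 40 mm deep
me = pe + rng.normal(0, 0.5, (9, 3))         # measured entry points
md = pd_ + rng.normal(0, 0.5, (9, 3))        # measured end points
e1, e2, a = path_errors(pe, pd_, me, md)
print(f"entry {e1.mean():.2f}±{e1.std():.2f} mm, "
      f"end {e2.mean():.2f}±{e2.std():.2f} mm, "
      f"dir {a.mean():.2f}±{a.std():.2f} deg")
```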
[1] J. C. Eyke, J. E. Ricciardi, W. Roesch, et al., “Computer-assisted virtual fluoroscopy”, University of Pennsylvania Orthopaedic Journal, Vol. 15, pp. 53–59, 2002.
[2] K. D. Kim, J. P. Johnson, J. D. Babbitz, “Image-guided thoracic pedicle screw placement: a technical study in cadavers and preliminary clinical experience”, Neurosurgical Focus, Vol. 10, pp. 1-5, 2001.
[3] I. D. Gelalis, N. K. Paschos, E. E. Pakos, et al., “Accuracy of pedicle screw placement: a systematic review of prospective in vivo studies comparing free hand, fluoroscopy guidance and navigation techniques”, European Spine Journal, Vol. 21, pp. 247–255, 2011.
[4] Y. Liu, W. Tian, B. Liu, et al., “Comparison of the clinical accuracy of cervical (C2–C7) pedicle screw insertion assisted by fluoroscopy, computed tomography- based navigation, and intraoperative three-dimensional C-arm navigation”, Chinese Medical Journal, Vol. 123, pp. 2995-2998, 2010.
[5] B. J. Shin, A. R. James, I. U. Njoku, et al., “Pedicle screw navigation: a systematic review and meta-analysis of perforation risk for computer-navigated versus freehand insertion”, Journal of Neurosurgery: Spine, Vol. 17, pp. 113-122, 2012.
[6] T. T. Kim, D. Drazin, F. Shweikeh, et al., “Clinical and radiographic outcomes of minimally invasive percutaneous pedicle screw placement with intraoperative CT (O-arm) image guidance navigation”, Neurosurgical Focus, Vol. 36, pp. E1, 2014.
[7] P. Barsa, P. Suchomel, “Portable CT scanner-based navigation in lumbar pedicle screw insertion”, European Spine Journal, Vol. 22, pp. 1446–1450, 2013.
[8] H. J. Marcus, T. P. Cundy, D. Nandi, et al., “Robot-assisted and fluoroscopy-guided pedicle screw placement: a systematic review”, European Spine Journal, Vol. 23, pp. 291-297, 2014.
[9] X. Hu and I. H. Lieberman, “Robotic-Assisted Spine Surgery”, Minimally Invasive Spine Surgery, Springer Science, New York, pp. 61-66, doi:10.1007/978-1-4614-5674-2_7, 2014.
[10] O. N. Dreval, I. P. Rynkov, K. A. Kasparova, et al., “Result of using spine assist mazor in surgical treatment of spine disorders”, Problem Of Neurosurgery Named After N. N. Burdenko, Vol. 3, pp. 13-18. 2014.
[11] Mazor Robotics. http://www.mazorrobotics.com/
[12] N. Lonjon, E. Chan-Seng, V. Costalat, et al., “Robot-assisted spine surgery: feasibility study through a prospective case-matched analysis”, European Spine Journal, 2015.
[13] http://www.medteceurope.com/
[14] R. Fahrig, M. Moreau, D. W. Holdsworth, “Three-dimensional computed tomographic reconstruction using a C-arm mounted XRII: correction of image intensifier distortion”, Medical Physics, Vol. 24, pp. 1097-1106, 1997.
[15] 吳吉春, “Surgical Navigation and Positioning Based on C-arm Images” (基於C-arm影像的手術導引定位), Master’s thesis, Department of Mechanical Engineering, National Central University, 2012.
[16] D. F. DeMenthon, L. S. Davis, “Model-based object pose in 25 lines of code”, International Journal of Computer Vision, Vol. 15, pp. 123-141, 1995.
[17] D. Oberkampf, D. F. DeMenthon, L. S. Davis, “Iterative pose estimation using coplanar feature points”, Computer Vision and Image Understanding, Vol. 63, pp. 495-511, 1996.
[18] A. Ansar, K. Daniilidis, “Linear pose estimation from points or lines”, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 25, pp. 578-589, 2003.
[19] H. Hatze, “High-precision three-dimensional photogrammetric calibration and object space reconstruction using a modified DLT-approach”, Journal of Biomechanics, Vol. 21, pp. 533-538, 1988.
[20] V. Lepetit, F. Moreno-Noguer, P. Fua, “EPnP: An accurate O(n) solution to the PnP problem”, International Journal of Computer Vision, Vol. 81, pp. 155-166, 2009.
[21] D. Grest, T. Petersen, V. Krüger, “A comparison of iterative 2D-3D pose estimation methods for real-time applications”, Image Analysis, Vol. 5575, pp. 706-715, 2009.
[22] http://www.medtronic.com/
[23] https://www.brainlab.com/
[24] 陳冠君, “A C-arm Image-Assisted Navigation System for Spine Surgery Integrating EPnP and Guiding Instruments” (整合EPnP及導引器械之C-arm影像輔助脊椎手術用導引系統), Master’s thesis, Department of Mechanical Engineering, National Central University, 2015.
[25] 徐偉恩, “A C-arm Image-Assisted Surgical Navigation System” (C-arm影像輔助手術導引系統), Master’s thesis, Department of Mechanical Engineering, National Central University, 2013.
[26] W. J. Wolfe, D. Mathis, C. W. Sklair, et al., “The perspective view of three points”, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 13, pp. 66-73, 1991.