Abstract
Objective To evaluate the feasibility of predicting lung cancer target position through online optical monitoring of body surface motion.
Methods CT images acquired in different ways from the stereotactic body radiotherapy (SBRT) plans of 16 lung cancer cases were selected for experimental simulation. The planning CT and the original target position served as the reference, while the 10 respiratory phases of four-dimensional CT (4DCT) and each cone beam CT (CBCT) served as floating images, on which the floating target positions were delineated. Binocular visual surface imaging was used to obtain body-surface point cloud data of the reference and floating images, and point cloud feature information was extracted for comparison. Based on the random forest algorithm, the differences in feature information were fitted to the corresponding differences in target position, yielding an online prediction model of target position.
Results The model predicted the target position with a high success rate. The explained variance and root mean squared error (RMSE) in the left-right, superior-inferior, and anterior-posterior directions were 99.76%, 99.25%, and 99.58%, and 0.0447 mm, 0.0837 mm, and 0.0616 mm, respectively.
Conclusion The online monitoring of lung SBRT target position proposed in this study is feasible, and can serve as a reference for online monitoring and verification of target position and for dose evaluation in clinical radiotherapy.
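The abstract's core step (fitting surface-feature differences to target-position differences with a random forest, then scoring with explained variance and RMSE) can be sketched as below. This is a minimal illustration on synthetic data, not the authors' code: the feature construction, dimensions, and model settings are assumptions, and only the overall fit-and-evaluate pattern mirrors the paper's workflow.

```python
# Hypothetical sketch of the paper's workflow: regress tumour displacement
# (one direction, in mm) on differences between floating and reference
# surface point-cloud features. All data below are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import explained_variance_score, mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Each row: assumed feature-difference vector (e.g. centroid shift,
# curvature statistics) between a floating surface and the reference.
X = rng.normal(size=(400, 8))
# Target: displacement in one direction, here a noisy linear stand-in.
y = X @ rng.normal(size=8) + 0.05 * rng.normal(size=400)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_tr, y_tr)
pred = model.predict(X_te)

ev = explained_variance_score(y_te, pred)     # cf. the paper's ~99% values
rmse = mean_squared_error(y_te, pred) ** 0.5  # cf. the sub-0.1 mm RMSEs
print(f"explained variance: {ev:.4f}, RMSE: {rmse:.4f} mm")
```

In the study, one such model would be fitted per direction (left-right, superior-inferior, anterior-posterior), each scored with the same two metrics.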
Corresponding Authors:
Deng Yongjin, Email: dengyj27@mail.sysu.edu.cn
Cite this article:
Huang Taiming, Guan Qi, Zhong Jiajian, et al. Feasibility study of predicting lung tumor target movement based on body surface motion monitoring[J]. Chinese Journal of Radiation Oncology, 2023, 32(2): 138-144.