Abstract

Objective Tumors and surrounding tissues show low contrast in CBCT images; this study therefore proposes an automatic segmentation method for central lung cancer in CBCT images.

Methods A total of 221 patients with central lung cancer were recruited, of whom 176 underwent CT localization and 45 underwent enhanced CT localization. The enhanced CT images were set to the lung window and the mediastinal window, and elastic registration with the first CBCT verification images was performed to obtain paired data sets. The paired data sets were loaded into the cycleGAN network for style transfer, so that the CBCT images could be transformed into "enhanced CT" images under the lung window and the mediastinal window. Finally, the transformed images were loaded into the UNET-attention network for deep learning of the GTV. The segmentation results were evaluated by the Dice similarity coefficient (DSC), the Hausdorff distance (HD), and the area under the receiver operating characteristic curve (AUC).

Results The contrast between tumors and surrounding tissues was significantly improved after style transfer. For the cycleGAN+UNET-attention network, the DSC was 0.78±0.05, the HD was 9.22±3.42, and the AUC was 0.864.

Conclusion The cycleGAN+UNET-attention network can effectively segment central lung cancer in CBCT images.
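As a minimal illustration (not the authors' code) of how the reported evaluation metrics can be computed, the following Python sketch calculates the Dice similarity coefficient and the Hausdorff distance for a predicted GTV mask against a manually delineated reference mask. The function names and the toy masks are illustrative assumptions; here the Hausdorff distance is taken between the foreground point sets of the two binary masks and is reported in pixels.

# Minimal sketch, assuming binary 2D masks stored as NumPy arrays.
import numpy as np
from scipy.spatial.distance import directed_hausdorff


def dice_coefficient(pred: np.ndarray, ref: np.ndarray) -> float:
    """Dice similarity coefficient (DSC) between two binary masks."""
    pred, ref = pred.astype(bool), ref.astype(bool)
    intersection = np.logical_and(pred, ref).sum()
    denom = pred.sum() + ref.sum()
    return 2.0 * intersection / denom if denom > 0 else 1.0


def hausdorff_distance(pred: np.ndarray, ref: np.ndarray) -> float:
    """Symmetric Hausdorff distance (in pixels) between the foreground point sets."""
    pred_pts = np.argwhere(pred.astype(bool))   # (N, 2) pixel coordinates
    ref_pts = np.argwhere(ref.astype(bool))
    d_forward, _, _ = directed_hausdorff(pred_pts, ref_pts)
    d_backward, _, _ = directed_hausdorff(ref_pts, pred_pts)
    return max(d_forward, d_backward)


if __name__ == "__main__":
    # Toy example: two overlapping square "tumor" masks on a 64 x 64 slice.
    pred = np.zeros((64, 64), dtype=np.uint8)
    ref = np.zeros((64, 64), dtype=np.uint8)
    pred[20:40, 20:40] = 1
    ref[22:42, 22:42] = 1
    print(f"DSC = {dice_coefficient(pred, ref):.3f}")
    print(f"HD  = {hausdorff_distance(pred, ref):.2f} px")

In practice such metrics would be computed per patient on the network's output masks and then averaged, which is consistent with the mean±SD values reported in the Results.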
Chen Jie, Wang Keqiang, Jian Jianbo, et al. Research on automatic segmentation of tumor target of lung cancer in CBCT images by multimodal style transfer technology based on deep learning[J]. Chinese Journal of Radiation Oncology, 2022, 31(1): 43-48.