2019
DOI: 10.1007/s10278-018-00173-0

Automatic Nasopharyngeal Carcinoma Segmentation Using Fully Convolutional Networks with Auxiliary Paths on Dual-Modality PET-CT Images

Abstract: Nasopharyngeal carcinoma (NPC) is prevalent in certain areas, such as South China, Southeast Asia, and the Middle East. Radiation therapy is the most efficient means to treat this malignant tumor. Positron emission tomography-computed tomography (PET-CT) is a suitable imaging technique to assess this disease. However, the large amount of data produced by numerous patients causes traditional manual delineation of tumor contour, a basic step for radiotherapy, to become time-consuming and labor-intensive. Thus, t…

Citations: Cited by 53 publications (28 citation statements) | References: 25 publications
“…Zhao et al. presented a method that used a 2D FCNN with dual-modality PET-CT images of 30 patients to achieve automated segmentation of NPC. Deep supervision through added auxiliary paths could explicitly direct the training of the lower layers and learn more representative features, improving the performance of the model (Zhao et al., 2019). Ma et al. investigated an enhanced CNN method for automated NPC segmentation using CT and MRI.…”
Section: Introduction (mentioning)
confidence: 99%
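
As an illustration of the deep-supervision idea described in the statement above, the following is a minimal sketch in PyTorch, not the authors' code: a tiny two-stage FCN with one auxiliary path whose loss is added to the main loss so that gradients reach the lower layers directly. The layer widths, the two-channel PET-CT input, and the 0.4 auxiliary weight are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyFCNWithAuxPath(nn.Module):
    """Toy FCN with one auxiliary segmentation head (illustrative only)."""
    def __init__(self, in_channels=2, num_classes=2):
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv2d(in_channels, 16, 3, padding=1),
                                  nn.ReLU(), nn.MaxPool2d(2))
        self.enc2 = nn.Sequential(nn.Conv2d(16, 32, 3, padding=1),
                                  nn.ReLU(), nn.MaxPool2d(2))
        # Auxiliary path taps the earlier (lower) features so its loss
        # supervises enc1 directly during training.
        self.aux_head = nn.Conv2d(16, num_classes, 1)
        self.main_head = nn.Conv2d(32, num_classes, 1)

    def forward(self, x):
        f1 = self.enc1(x)   # 1/2-resolution features
        f2 = self.enc2(f1)  # 1/4-resolution features
        size = x.shape[2:]
        main = F.interpolate(self.main_head(f2), size=size,
                             mode="bilinear", align_corners=False)
        aux = F.interpolate(self.aux_head(f1), size=size,
                            mode="bilinear", align_corners=False)
        return main, aux

def deeply_supervised_loss(main, aux, target, aux_weight=0.4):
    # Main-path loss plus a down-weighted auxiliary-path loss;
    # the 0.4 weight is an assumption, not a value from the paper.
    return F.cross_entropy(main, target) + aux_weight * F.cross_entropy(aux, target)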
“…It shows theoretically that combining multiple views can obtain abundant information for latent intact space learning. Multiview discriminant analysis with view consistency (MvDA-VC) [16]: it seeks a single discriminant common space for multiple views in a non-pairwise manner by jointly learning multiple view-specific linear transforms; MvDA-VC has achieved good performance on the problem of object recognition from multiple views. Zhao et al. [12]: it uses fully convolutional networks with an auxiliary path to achieve automatic segmentation of NPC on dual-modality PET-CT images; the proposed method improves NPC segmentation by guiding the training of the lower layers through auxiliary paths. Li et al. [13]: it proposes a modified version of the U-Net that performs well on NPC segmentation by modifying the downsampling and upsampling layers to have a similar learning ability and to predict at the same spatial resolution as the source image.…”
Section: Experiments and Results (mentioning)
confidence: 99%
“…[11] reported an automatic NPC segmentation method based on the convolutional neural network (CNN) architecture with dynamic contrast-enhanced MRI. [12] used fully convolutional networks with auxiliary paths to achieve automatic segmentation of NPC on PET-CT images. [13] used a modified U-Net model to automatically segment NPC on CT images from 502 patients.…”
Section: Introduction (mentioning)
confidence: 99%
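
A common way to present co-registered dual-modality PET-CT slices to a 2D network, as in the methods cited above, is to stack the two modalities as input channels. The sketch below is only an assumed preprocessing step; the CT clipping window and SUV scaling are illustrative and not taken from any of the cited papers.

import numpy as np

def make_dual_modality_input(ct_slice: np.ndarray, pet_slice: np.ndarray) -> np.ndarray:
    """Stack a CT slice (HU) and a co-registered PET slice (SUV) into a (2, H, W) array."""
    ct = np.clip(ct_slice, -100.0, 300.0)                 # assumed soft-tissue window
    ct = (ct - ct.min()) / (ct.max() - ct.min() + 1e-8)   # scale to [0, 1]
    pet = pet_slice / (pet_slice.max() + 1e-8)            # scale SUV to [0, 1]
    return np.stack([ct, pet], axis=0).astype(np.float32)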
“…Before planned radiotherapy, it is necessary to precisely quantify the target structures by segmentation, which, in the case of nasopharyngeal carcinomas, is often a particularly difficult and time-consuming activity because of the anatomic location. Zhao et al. showed, for a small group of 30 patients, that automatic segmentation of such tumors on 18F-FDG PET/CT data was, in principle, possible using the U-Net architecture (mean Dice score of 87.47%) (44). Other groups applied similar approaches to head and neck cancer (45) and lung cancer (46,47).…”
Section: Discussion (mentioning)
confidence: 99%
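
For reference, the Dice score quoted above is the standard overlap measure between a predicted and a manual segmentation mask. A minimal computation, assuming binary masks, looks like this:

import numpy as np

def dice_score(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice = 2|A ∩ B| / (|A| + |B|) for binary masks."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    return 2.0 * intersection / denom if denom > 0 else 1.0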