2008
DOI: 10.1007/978-3-540-85990-1_52
Atlas-Based Auto-segmentation of Head and Neck CT Images

Abstract: Treatment planning for high precision radiotherapy of head and neck (H&N) cancer patients requires accurate delineation of many structures and lymph node regions. Manual contouring is tedious and suffers from large inter- and intra-rater variability. To reduce manual labor, we have developed a fully automated, atlas-based method for H&N CT image segmentation that employs a novel hierarchical atlas registration approach. This registration strategy makes use of object shape information in the atlas to h…

Cited by 148 publications (183 citation statements)
References 11 publications
“…In atlas-based methods a pre-computed segmentation or prior information in a template space is propagated towards the image to be segmented via spatial normalization (registration). These methods have been widely used in brain MRI ([26], [27]), head and neck CT scans ([28], [29], [30]), cardiac aortic CT [31], pulmonary lobes from CT [32], and prostate MR ([33], [34]). In atlas-based methods image registration is a key element, as label propagation relies on the registration of one or more templates to a target image.…”
Section: Introduction
confidence: 99%
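The label-propagation idea quoted above — register one or more atlas templates to the target, then transfer their labels — is usually completed with a fusion step when several atlases are used. Below is a minimal sketch of per-voxel majority-vote fusion in pure NumPy; the function name is hypothetical, and the registration step itself is assumed to have already produced the propagated label maps:

```python
import numpy as np

def majority_vote_fusion(propagated_labels):
    """Fuse several atlas label maps (already registered to the target)
    by per-voxel majority vote. Ties resolve to the lowest label value."""
    stacked = np.stack(propagated_labels)            # (n_atlases, *image_shape)
    n_labels = int(stacked.max()) + 1
    # Count, for each label value, how many atlases vote for it at each voxel.
    votes = np.stack([(stacked == lab).sum(axis=0) for lab in range(n_labels)])
    return votes.argmax(axis=0)                      # fused label map

# Three toy 2x2 "propagated" segmentations that disagree on two voxels.
a = np.array([[0, 1], [1, 1]])
b = np.array([[0, 1], [0, 1]])
c = np.array([[0, 0], [1, 1]])
fused = majority_vote_fusion([a, b, c])              # -> [[0, 1], [1, 1]]
```

More elaborate fusion schemes (e.g. weighting each atlas by local registration quality) follow the same pattern, replacing the flat vote count with per-atlas weights.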
“…These variabilities in target volume delineation can be a major primary source of inaccuracy of dose delivery and treatment errors [16]. Consequently, efforts have been made to identify processes in the target delineation process amenable to improvement, such as multimodality image incorporation [8,[17][18][19][20][21][22], instructional modification [23][24][25], visual atlas usage [11-15, 26, 27], window-level adjustment [28], autosegmentation [29,30], and software-assisted contouring [25]. While specialized data entry mechanism for spatial data is common in other arenas (e.g., video games [31] and virtual simulation workstations [32,33]), there have been comparatively few efforts to modify ROI definition at the hardware level in radiotherapy.…”
Section: Discussion
confidence: 99%
“…The corresponding results derived in adjacent slices could be collected for adjustment, so the performance can be improved. For the purpose of modeling, the dominant points can be applied to build the model mentioned in [14] and [15], and the polygonal approximation can also be deduced. Points 1 and 2 can be used for segmenting the vertebral body, and points 3 and 4 can be used for determining the facet corners.…”
Section: Discussion
confidence: 99%
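The polygonal approximation mentioned in the quote is commonly computed with the Ramer–Douglas–Peucker algorithm, which keeps only contour points that deviate from a chord by more than a tolerance `eps`. The sketch below is illustrative (it is not necessarily the cited papers' exact method):

```python
import math

def rdp(points, eps):
    """Ramer-Douglas-Peucker polygonal approximation of an open polyline."""
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    seg_len = math.hypot(x2 - x1, y2 - y1)

    def dist(p):
        # Perpendicular distance from p to the chord (endpoints coincident:
        # fall back to point-to-point distance).
        if seg_len == 0:
            return math.hypot(p[0] - x1, p[1] - y1)
        return abs((y2 - y1) * p[0] - (x2 - x1) * p[1] + x2 * y1 - x1 * y2) / seg_len

    idx = max(range(1, len(points) - 1), key=lambda i: dist(points[i]))
    if dist(points[idx]) > eps:
        # Farthest point exceeds tolerance: keep it and recurse on both halves.
        left = rdp(points[:idx + 1], eps)
        right = rdp(points[idx:], eps)
        return left[:-1] + right
    return [points[0], points[-1]]

# Toy contour: roughly flat, then a sharp bend, then roughly linear.
contour = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
poly = rdp(contour, eps=1.0)   # keeps the endpoints plus the bend points
```

The retained vertices (the "dominant points") can then serve as landmarks, such as the vertebral-body and facet-corner points the quote describes.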
“…The air path, in relatively low brightness, is adjacent to the cervical spine. To extract the spine in CT images, several segmentation methods have been proposed, including model-based segmentation, adaptive thresholding, multi-scale Canny edge detection and active contour algorithms [1,[10][11][12][13][14][15]. Among these methods, gradient-based methods are considered to achieve better accuracy than gray-level thresholding because the magnitude and direction of the gradient can be used to accurately locate the edges.…”
Section: Vertebrae Extraction
confidence: 99%
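The gradient-magnitude-and-direction argument in the quote can be illustrated with a minimal NumPy sketch: central-difference gradients followed by a magnitude threshold. This is a simplification of Canny-style detection (no smoothing or non-maximum suppression), not the cited implementation:

```python
import numpy as np

def gradient_edges(image, threshold):
    """Edge strength via central-difference gradients: returns a binary edge
    mask, the gradient magnitude, and the gradient direction (radians)."""
    img = image.astype(float)
    gy, gx = np.gradient(img)           # np.gradient returns per-axis derivatives
    magnitude = np.hypot(gx, gy)        # edge strength
    direction = np.arctan2(gy, gx)      # edge normal orientation
    return magnitude > threshold, magnitude, direction

# Toy "CT slice": a bright vertebra-like block on a dark background.
img = np.zeros((8, 8))
img[2:6, 2:6] = 100.0
edges, mag, _ = gradient_edges(img, threshold=25.0)
```

Unlike a global gray-level threshold, the mask fires only where intensity changes sharply (the block boundary), while the flat interior and background stay unmarked — which is the accuracy advantage the quote attributes to gradient-based methods.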