2018
DOI: 10.18466/cbayarfbe.384729

Fully Automated and Adaptive Intensity Normalization Using Statistical Features for Brain MR Images

Abstract: Accuracy of the results obtained by automated processing of brain magnetic resonance images has vital importance for diagnosis and for evaluation of a progressive disease during treatment. However, automated processing methods such as segmentation, registration and comparison of these images are challenging, because intensity values do not depend only on the underlying tissue type; they can also change due to scanner-related artifacts and noise, which usually occur in magnetic resonance images. In addition to i…
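The abstract breaks off before the method is described, so the following is only a rough, generic illustration of statistical-feature-based intensity normalization for a brain MR volume, not the paper's own algorithm: it z-scores the volume using the mean and standard deviation of foreground voxels. The function name, the foreground threshold, and the choice of z-scoring are assumptions made for this sketch.

```python
import numpy as np

def normalize_intensities(volume, foreground_threshold=0.0):
    """Generic z-score intensity normalization driven by simple statistics.

    Illustration only (not the cited paper's specific method): the mean and
    standard deviation are estimated over foreground voxels, selected here by
    a plain intensity threshold, and the whole volume is rescaled so that
    tissue intensities become comparable across scans and scanners.
    """
    foreground = volume[volume > foreground_threshold]
    mean, std = foreground.mean(), foreground.std()
    return (volume - mean) / (std + 1e-8)

# Two acquisitions of the same anatomy with different scanner gain/offset
# end up on a common intensity scale after normalization.
rng = np.random.default_rng(0)
scan = rng.uniform(100.0, 500.0, size=(8, 8, 8))
rescanned = scan * 1.7 + 40.0  # simulated scanner-dependent intensity change
print(np.allclose(normalize_intensities(scan),
                  normalize_intensities(rescanned), atol=1e-6))  # True
```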

Citations: Cited by 30 publications (10 citation statements)
References: 41 publications
“…Compared to recent methods, the star priority-based melanoma segmentation method has further improved the performance of the segmentation model on the data set with fewer samples. An advantage of the proposed approach is that a separate intensity normalization stage (Goceri, 2018), which usually increases the computational complexity, is not needed.…”
Section: Results (mentioning)
Confidence: 99%
“…In the proposed approach, although the input images are noisy and contain inhomogeneous intensity values, there is no extra denoising or intensity normalization step such as in refs. [34–36], which may also increase computational costs.

$$f_{relu}(x) = relu(x + a) + b$$
…”
Section: Methods (mentioning)
Confidence: 99%
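The shifted ReLU quoted above is simple to reproduce. The sketch below is a minimal NumPy rendering of $f_{relu}(x) = relu(x + a) + b$; the offsets a and b are left as plain parameters, since the excerpt does not say whether the cited work fixes or learns them, and the example values are assumptions.

```python
import numpy as np

def frelu(x, a=0.0, b=0.0):
    """Shifted ReLU from the quoted equation: frelu(x) = relu(x + a) + b.

    `a` shifts where the activation starts firing, `b` shifts the output
    baseline away from zero. The default and example values are illustrative
    assumptions, not taken from the cited paper.
    """
    return np.maximum(x + a, 0.0) + b

# With a = 1.0 the unit activates one unit earlier on the input axis;
# with b = -0.5 its output floor moves from 0 down to -0.5.
x = np.linspace(-3.0, 3.0, 7)
print(frelu(x, a=1.0, b=-0.5))
```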
“…Capsule neural network 85.69% [54]
Deep-learning Model 70-80% [55][56][57] (2021, 2018, 2019)
Deep Learning 85% [58]
Deep Learning 89% [59]
CNN 80.2% [60]
5-layer CNN 95% [61]
14-layer CNN 97.78% [62]
Distributed Learning System (DLS) 99.77% [63]
Deep Residual Network 88.7% [64]
Local binary pattern (LBP) Resnet-50 and DenseNet-121…”
Section: Related Work (mentioning)
Confidence: 99%