2005
DOI: 10.1016/j.ejrad.2004.12.015

Significance analysis of qualitative mammographic features, using linear classifiers, neural networks and support vector machines

Cited by 43 publications (34 citation statements)
References 13 publications
“…1. In (a), morphological shape types of breast cancer are depicted (adapted from [3]). In (b), a tumor with a stellated boundary (light gray), perpendicular to the viewing plane (e.g.…”
Section: Methods (mentioning, confidence: 99%)
“…Most of the time, a second stage is necessary to reduce these features, because they are too numerous. In the third stage of the algorithm, the numerical descriptors are fed to classification algorithms, which are application-independent, such as Support Vector Machines [4,5,6], neural networks [2,6,7,8], k-nearest neighbors [9], etc. The classification algorithms then decide, based on their inputs, the class of the image.…”
Section: Classification Evaluation (mentioning, confidence: 99%)
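A minimal sketch of the three-stage pipeline described in the statement above, assuming scikit-learn and synthetic data in place of real mammographic descriptors; the array sizes, the k=10 feature count, and the classifier settings are illustrative assumptions, not values from the cited papers:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))      # stage 1: numerical descriptors per image (synthetic)
y = rng.integers(0, 2, size=200)    # benign (0) vs malignant (1) labels (synthetic)

classifiers = {
    "SVM": SVC(kernel="rbf"),
    "neural network": MLPClassifier(max_iter=1000),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
}

for name, clf in classifiers.items():
    # stage 2: reduce the (too numerous) descriptors; stage 3: classify
    model = make_pipeline(StandardScaler(), SelectKBest(f_classif, k=10), clf)
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.2f}")
```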
“…2 Another classification technique that is widely used for the diagnosis of breast tumors is the support vector machine, which is known to be especially suited to classification data with a non-linear decision boundary. [3][4][5] One of its shortcomings is the black-box nature of the model, whereas Bayesian networks explicitly represent statistical dependences and independences between features. In this study we compare both classification methods and use two techniques, namely dimension reduction by principal component analysis (PCA) and normality transformation, to further improve the accuracy rate of the classifiers.…”
Section: Introduction (mentioning, confidence: 99%)
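A minimal sketch of the comparison outlined above, again assuming scikit-learn and synthetic data: PowerTransformer (Yeo-Johnson) stands in for the normality transformation and, since scikit-learn provides no Bayesian-network classifier, GaussianNB is used here as a simple Bayesian stand-in; neither choice is taken from the cited study.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PowerTransformer
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(1)
X = rng.lognormal(size=(200, 30))   # skewed features (synthetic), hence the normality transform
y = rng.integers(0, 2, size=200)    # binary tumor labels (synthetic)

for name, clf in {"SVM": SVC(kernel="rbf"), "Gaussian NB (Bayesian stand-in)": GaussianNB()}.items():
    # PowerTransformer pushes features toward normality,
    # PCA then reduces dimensionality before classification.
    model = make_pipeline(PowerTransformer(), PCA(n_components=10), clf)
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.2f}")
```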