2012 IEEE International Ultrasonics Symposium
DOI: 10.1109/ultsym.2012.0423
Combined Naïve Bayes and logistic regression for quantitative breast sonography

Abstract: Sonography is commonly used as an adjunct to mammography for early detection of breast cancer. We are developing methods to classify solid breast masses in sonograms as malignant or benign. The goal of this study was to combine two independent probabilistic classifiers to improve computer-aided diagnosis of breast masses. Naïve Bayes and logistic regression were used for supervised classification of masses from extracted morphological sonographic features, in combination with mammographic BI-RADS (categories 1…

Cited by 10 publications (10 citation statements) | References 17 publications (26 reference statements)
“…These nine features were motivated by the conventional visual features used in the clinic to assess lesions on ultrasound, described by Stavros et al. [26]. They include the angular variation at the margin (AVM), angular variation of the interior (AVI), brightness difference (BD), margin sharpness (MS), tortuosity, depth-to-width ratio (DWR), axis ratio (AR), radius variation (RV), and ellipse-normalized skeleton (ENS) [9, 14, 23]. The formulas for deriving the features are given in the literature [9, 10] and in Appendix A, Table A1.…”
Section: Methods
confidence: 99%
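A minimal Python sketch of how a few of the listed morphological features might be computed from a segmented lesion boundary. The exact formulas are given in the cited literature; the definitions below (depth-to-width ratio, radius variation, and an axis ratio taken from the principal-component ellipse) are illustrative assumptions, not the published formulas.

import numpy as np

def depth_to_width_ratio(contour):
    # contour: (N, 2) array of (x, y) boundary points, with y the depth direction.
    width = contour[:, 0].max() - contour[:, 0].min()
    depth = contour[:, 1].max() - contour[:, 1].min()
    return depth / width

def radius_variation(contour):
    # Normalized spread of radial distances from the centroid to the boundary.
    centroid = contour.mean(axis=0)
    radii = np.linalg.norm(contour - centroid, axis=1)
    return radii.std() / radii.mean()

def axis_ratio(contour):
    # Minor/major axis ratio of the boundary's principal-component ellipse.
    centered = contour - contour.mean(axis=0)
    eigvals = np.linalg.eigvalsh(np.cov(centered.T))
    return np.sqrt(eigvals.min() / eigvals.max())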
“…Although canonical AdaBoost boosts weak homogeneous learners, the motivation for boosting already-strong heterogeneous classifiers is that if one classifier already has very high performance, the optimal classification strategy is likely close to that already-strong strategy, so it is boosted with a learner that provides a second opinion. Previous studies found that naïve Bayes and logistic regression both performed well on the breast mass classification problem, and that they could be boosted even though their predictions were close case by case [10, 20, 21, 23]. Though stacking is usually used instead of boosting with heterogeneous learners, the close case-by-case agreement between our constituent learners makes them poor candidates for stacking.…”
Section: Methods
confidence: 99%
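A minimal sketch (Python, scikit-learn) of combining two heterogeneous probabilistic classifiers of the kind discussed above. Averaging the two posterior estimates is only one simple combination rule, assumed here for illustration; it is not the specific boosting or weighting scheme used in the paper.

import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression

def combined_probability(X_train, y_train, X_test):
    # Fit the two constituent learners on the same morphological features.
    nb = GaussianNB().fit(X_train, y_train)
    lr = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    # Combine the two posterior estimates for the malignant class;
    # a simple average is assumed here.
    p_nb = nb.predict_proba(X_test)[:, 1]
    p_lr = lr.predict_proba(X_test)[:, 1]
    return 0.5 * (p_nb + p_lr)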
“…ML methods have also been used in many areas of medical science including breast cancer detection, cardiovascular risk prediction, stroke prediction, Parkinson's disease prediction, and medical imaging, to name a few [23,24,25,26,27,28,29]. For example, researchers in [29] use naive Bayes and logistic regression models for early detection of breast cancer. In the paper, the authors claim that computer-based quantitative methods improve diagnosis on breast ultrasound and have the potential to reduce the number of biopsies.…”
Section: Literature Review
confidence: 99%
“…The grayscale and morphological features were: brightness difference at the margin, margin sharpness, angular variation in brightness at the margin, depth-to-width ratio, axis ratio, tortuosity, radius variation, and elliptically normalized skeleton [7, 21]. The features, F, were used with logistic regression to determine the probability of malignancy, P(M|F). A round-robin (leave-one-out) approach was used to assess the discriminating capability of the extracted lesion features: the classifier was trained on N-1 of the N samples in the data to predict the remaining sample, and the process was repeated until each sample had been the test case.…”
Section: Feature Extraction and Classification
confidence: 99%
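A minimal sketch of the round-robin (leave-one-out) evaluation described above, using scikit-learn logistic regression to estimate the probability of malignancy P(M|F) for each held-out case. Variable names and the choice of scikit-learn are assumptions for illustration.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut

def loo_malignancy_probabilities(F, y):
    # F: (N, d) matrix of lesion features; y: (N,) labels (1 = malignant, 0 = benign).
    probs = np.empty(len(y), dtype=float)
    for train_idx, test_idx in LeaveOneOut().split(F):
        # Train on the other N-1 cases and predict the single held-out case.
        clf = LogisticRegression(max_iter=1000).fit(F[train_idx], y[train_idx])
        probs[test_idx] = clf.predict_proba(F[test_idx])[:, 1]
    return probs  # estimated P(M|F) for each case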