2018
DOI: 10.1021/acs.analchem.7b04399

Consensus Classification Using Non-Optimized Classifiers

Abstract: Classifying samples into categories is a common problem in analytical chemistry and other fields. Classification is usually based on a single method, but numerous classifiers are available, some complex, such as neural networks, and others simple, such as k nearest neighbors. Regardless, most classification schemes require optimization of one or more tuning parameters for best classification accuracy, sensitivity, and specificity. A process not requiring exact selection of tuning parameter values…

Cited by 23 publications (58 citation statements)
References 49 publications
“…In this study, a recent chemometric method for the classification of samples was applied. 15,25,26 Instead of using an algorithm that tests one by one, this method allows the combination of information from 17 classifiers with no requirements to select a threshold, eigenvector or number of neighbors. In addition, it is possible to easily fuse the data from different instruments.…”
Section: Results
confidence: 99%
“…The hair samples were cosmetically modified at the laboratory. A recent chemometric method developed by Brownfield et al 15 was applied to classify these samples without further preprocessing, allowing easy fusion of the data from LIBS and WDXRF. The fusion of instrumental data has been shown to provide complementarity and improvements in analytical information for many purposes.…”
Section: Introduction
confidence: 99%
“…In view of the shortcomings of the empirical risk minimization strategy and the structural risk minimization strategy, several other methods have been proposed successively to improve overfitting, but they still cannot solve this problem well [11–14]. Brownfield et al [15] compared seven different nonparametric classifiers: radial basis function neural network, multilayer perceptron neural network, support vector machine, classification and regression tree, chi-square automatic interaction detection, the quick, unbiased and efficient statistical tree algorithm, and random forest. The results showed that random forest has the highest accuracy, sensitivity and specificity.…”
Section: Related Work
confidence: 99%
“…More recently, a fusion strategy for non-optimized classifiers was proposed, i.e., considering a window of tuning parameter values for each classifier in the fusion process [22].…”
Section: Introduction
confidence: 99%
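The windowed-tuning-parameter idea in the statement above can be sketched in a few lines. The example below is a hypothetical illustration, not the cited authors' implementation: a toy one-dimensional k-nearest-neighbors classifier is run over a whole window of k values, and the per-k predictions are fused by a simple vote sum, so no single k ever has to be optimized.

```python
# Hypothetical sketch: consensus over a window of tuning parameter values
# (here: k in k-nearest-neighbors) instead of one optimized k.
# All data below are illustrative, not taken from the paper.

from collections import Counter

def knn_predict(train, labels, x, k):
    """Classify x by majority label among its k nearest training points."""
    order = sorted(range(len(train)), key=lambda i: abs(train[i] - x))
    votes = Counter(labels[i] for i in order[:k])
    return votes.most_common(1)[0][0]

def consensus_predict(train, labels, x, k_window):
    """Fuse the predictions across the whole window of k values by vote sum."""
    votes = Counter(knn_predict(train, labels, x, k) for k in k_window)
    return votes.most_common(1)[0][0]

train = [0.1, 0.2, 0.3, 0.9, 1.0, 1.1]
labels = ["A", "A", "A", "B", "B", "B"]
print(consensus_predict(train, labels, 0.25, k_window=range(1, 6)))  # A
```

Because every k in the window contributes one vote, a poor choice at any single k is outvoted by the rest of the window, which is the practical appeal of skipping per-classifier optimization.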
“…For a given data block X, its corresponding output is the matrix AM_X (Equation 3). When more than one X data block is available (as in the benchmark case presented in this work, where X = Vis, NIR, NMR), the resulting AM_X matrices can be combined using, again, a sum rule ([22], Equation 4). The result is the Fused Adjacency Matrix AM_Fus, depicted in black in Figure 1. In this work, the values in AM_Fus vary between zero and 42, as a result of summing a total of 42 AMs which have ones on their diagonal.…”
confidence: 99%
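The sum-rule fusion of adjacency matrices described above can be illustrated with a short sketch. This is a hypothetical example under assumed data, not code from the cited work: each classifier setting (or data block) yields a sample-by-sample adjacency matrix with ones where two samples are assigned the same class (and ones on the diagonal), and the matrices are simply added element-wise.

```python
# Hypothetical sketch of adjacency-matrix fusion by a sum rule.
# The class assignments below are illustrative, not from the paper.

def adjacency_matrix(assignments):
    """AM[i][j] = 1 if samples i and j received the same class label
    (so the diagonal is all ones), else 0."""
    n = len(assignments)
    return [[1 if assignments[i] == assignments[j] else 0 for j in range(n)]
            for i in range(n)]

def fuse(matrices):
    """Sum rule: add the adjacency matrices element-wise."""
    n = len(matrices[0])
    return [[sum(m[i][j] for m in matrices) for j in range(n)]
            for i in range(n)]

# Pretend three classifier settings assigned classes to five samples.
all_assignments = [
    ["A", "A", "B", "B", "A"],
    ["A", "A", "B", "B", "B"],
    ["A", "A", "B", "A", "A"],
]

am_fus = fuse([adjacency_matrix(a) for a in all_assignments])
# Diagonal entries equal the number of matrices summed (here 3), mirroring
# how the fused matrix in the quoted work ranges from zero to 42 when
# 42 AMs with ones on their diagonal are summed.
print(am_fus[0][0], am_fus[0][1], am_fus[0][2])  # 3 3 0
```

High off-diagonal values then flag pairs of samples that most classifier settings agree belong to the same class, which is what makes the fused matrix useful for consensus classification.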