2014
DOI: 10.1007/978-3-662-44654-6_2
Breaking Ties of Plurality Voting in Ensembles of Distributed Neural Network Classifiers Using Soft Max Accumulations

Abstract: Part 2: Learning-Ensemble Learning. An ensemble of distributed neural network classifiers is formed when several different individual neural networks are trained on their local training data. These classifiers can provide either a single class-label prediction, or real-valued class outputs, normalized via the softmax, that represent posterior probabilities and give confidence levels. To form the ensemble decision, the individual classifier decisions can be combined via the we…
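The combination rule described in the abstract can be sketched as follows: each classifier casts a plurality vote from its softmax output, and a tie between top-voted classes is broken by the largest accumulated softmax confidence. This is a minimal illustration, not the paper's implementation; the function name and array shapes are assumptions.

```python
import numpy as np
from collections import Counter

def ensemble_predict(softmax_outputs):
    """Combine per-classifier softmax vectors by plurality vote,
    breaking ties with the maximum softmax (confidence) sum.

    softmax_outputs: shape (n_classifiers, n_classes); each row is
    one classifier's softmax distribution over the class labels.
    """
    outputs = np.asarray(softmax_outputs)
    votes = outputs.argmax(axis=1)          # each classifier's hard label
    counts = Counter(votes)
    top = max(counts.values())
    tied = [c for c, n in counts.items() if n == top]
    if len(tied) == 1:
        return int(tied[0])                 # clear plurality winner
    # Tie: accumulate the softmax confidences of the tied classes
    # across all classifiers and pick the class with the largest sum.
    sums = outputs[:, tied].sum(axis=0)
    return int(tied[int(sums.argmax())])
```

For example, with two classifiers voting `[0.9, 0.1]` and `[0.2, 0.8]`, classes 0 and 1 tie at one vote each; the confidence sums are 1.1 versus 0.9, so class 0 wins. Summing only over the tied classes uses the confidence information to resolve the tie without letting it override the vote count itself.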

Cited by 6 publications (6 citation statements). References 19 publications (18 reference statements).
“…Nevertheless, we decided to include this comparison to demonstrate the difficulty of encountering ties in majority voting for patch-based CNNs and to investigate existing strategies to overcome it. It is interesting to observe (in table 3) that for patches extracted from the non-scaled original Bark-101 (where there is a higher number of ties), the best tie-breaking strategy is the maximum confidence sum, as affirmed in [22], where the authors tested it on simpler datasets (having a maximum of 26 classes in the Letter dataset) taken from the UCI repository [14]. To summarise, we present a few important insights.…”
Section: Patches From Upscaled Images
confidence: 90%
“…multiple classes can have the highest count of votes. In our study, we examine a few tie-breaking strategies from the existing literature on majority voting [22][23][26][34][35].…”
Section: Majority Voting
confidence: 99%