Proceedings of the 2022 5th International Conference on Artificial Intelligence and Pattern Recognition
DOI: 10.1145/3573942.3573954
Effective Training-Time Stacking for Ensembling of Deep Neural Networks

Abstract: Ensembles are an important tool for improving the performance of machine learning models. In natural language processing, ensembling is particularly effective because many large pretrained models are openly available. However, existing approaches mostly rely on simple averaging of predictions, assigning equal weight to every model and ignoring differences in model quality and conformity. We propose to estimate weights for ensembles of NLP models using not only knowledge of their …
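The abstract is truncated before it describes the weight-estimation procedure, so the following is only a minimal sketch of the general contrast it draws: equal-weight averaging of member predictions versus a quality-aware weighted average. The models, predictions, and validation accuracies below are illustrative assumptions, not the paper's method.

```python
# Sketch: equal-weight vs. weighted averaging of ensemble predictions.
# All numbers here are made up for illustration.
import numpy as np

# Class-probability predictions from three hypothetical NLP models
# on the same batch of 4 examples (shape: n_models x n_examples x n_classes).
preds = np.array([
    [[0.7, 0.3], [0.4, 0.6], [0.9, 0.1], [0.2, 0.8]],  # model A
    [[0.6, 0.4], [0.5, 0.5], [0.8, 0.2], [0.3, 0.7]],  # model B
    [[0.9, 0.1], [0.1, 0.9], [0.6, 0.4], [0.4, 0.6]],  # model C
])

# Baseline criticized in the abstract: simple average, equal weights.
equal_avg = preds.mean(axis=0)

# Quality-aware variant: weight each model, e.g. by validation accuracy
# (assumed values), and renormalize so the result stays a distribution.
val_accuracy = np.array([0.82, 0.78, 0.90])
weights = val_accuracy / val_accuracy.sum()
weighted_avg = np.einsum("m,mnc->nc", weights, preds)

print(equal_avg.argmax(axis=1))     # ensemble labels, equal weights
print(weighted_avg.argmax(axis=1))  # ensemble labels, weighted
```

Normalizing the weights keeps the combined output a valid probability distribution while letting stronger models contribute more to the ensemble.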

Cited by 1 publication (1 citation statement; citing publication year: 2024). References: 34 publications.
“…In the present study, we adopted a stack-based ensemble technique, substantially increasing the Pearson correlation coefficient between the ML-estimated PV and the PDM-defined PV from 0.61 [38] to our value of 0.93. Despite its optimal performance, this ensemble technique requires extended training time and high computational resources [39]. To resolve this problem, we adopted two strategies.…”
Section: Discussion
Citation type: mentioning (confidence: 99%)
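For context on the training-cost point raised in the citation statement above, here is a minimal stacking sketch using scikit-learn's StackingRegressor. This is an assumed stand-in, not the cited paper's implementation: it simply shows where the extra cost comes from, since with cv=5 each base model is refit once per fold to produce out-of-fold predictions for the meta-learner, then once more on the full data.

```python
# Sketch: stack-based ensembling and its training cost.
# With cv=5, each base model is trained 5 + 1 = 6 times.
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.linear_model import Ridge
from sklearn.neural_network import MLPRegressor

X, y = make_regression(n_samples=500, n_features=20, random_state=0)

stack = StackingRegressor(
    estimators=[
        ("rf", RandomForestRegressor(n_estimators=100, random_state=0)),
        ("mlp", MLPRegressor(hidden_layer_sizes=(64,), max_iter=500,
                             random_state=0)),
    ],
    final_estimator=Ridge(),  # meta-learner combining base predictions
    cv=5,                     # out-of-fold predictions for the meta-learner
)
stack.fit(X, y)
print(stack.score(X, y))  # R^2 of the stacked ensemble
```

The repeated refitting of every base model is what the citing authors mean by "extended training time and high computational resources", and it is the overhead that training-time stacking approaches, such as the one in the paper above, aim to reduce.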