2020
DOI: 10.1007/s10618-020-00704-w

Active learning for hierarchical multi-label classification

Abstract: Due to technological advances, a massive amount of data is produced daily, presenting challenges for application areas where data needs to be labelled by a domain specialist or by expensive procedures, in order to be useful for supervised machine learning purposes. In order to select which data points will provide more information when labelled, one can make use of active learning methods. Active learning (AL) is a subfield of machine learning which addresses methods to build models with fewer, but more repres…

Help me understand this report

Search citation statements

Order By: Relevance

Paper Sections

Select...
3
2

Citation Types

0
13
0

Year Published

2020
2020
2023
2023

Publication Types

Select...
6
2
1

Relationship

0
9

Authors

Journals

citations
Cited by 26 publications
(13 citation statements)
references
References 61 publications
0
13
0
Order By: Relevance
“…Uncertainty sampling methods are computationally efficient. They have shown good empirical performance, even though they do not measure the future predictive informativeness of the candidate instance on the large amounts of unlabelled data [30]. Using the evolutionary algorithm USPEX and machine-learning interatomic potentials actively learning on the fly, Podryabinkin et al [31] proposed a method for crystal structure prediction.…”
Section: Related Work
mentioning confidence: 99%
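To make the uncertainty sampling idea in the statement above concrete, here is a minimal Python sketch (not from the cited paper): it scores unlabelled samples by the Shannon entropy of their predicted class distribution and returns the most uncertain ones. The `predict_proba` call follows the scikit-learn convention; the function and variable names are illustrative assumptions.

```python
import numpy as np

def entropy_uncertainty(probs: np.ndarray) -> np.ndarray:
    # probs: (n_samples, n_classes) rows of class probabilities summing to 1.
    # Higher Shannon entropy means the model is less certain about a sample.
    eps = 1e-12  # guard against log(0)
    return -np.sum(probs * np.log(probs + eps), axis=1)

def select_most_uncertain(model, X_pool, k=10):
    # `model` is any classifier exposing predict_proba (scikit-learn style);
    # `X_pool` is the unlabelled pool. Both are assumptions of this sketch.
    scores = entropy_uncertainty(model.predict_proba(X_pool))
    return np.argsort(scores)[-k:]  # indices of the k most uncertain samples
```

This per-instance scoring is what makes uncertainty sampling cheap: it needs one forward pass over the pool and no estimate of how a new label would change the model, which is exactly the trade-off the quoted statement describes.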
“…On the other hand, we have techniques that use a selection criterion, such as an uncertainty measure, to choose a group of unlabelled cases. An AL procedure typically includes the following steps [30, 42, 43] (sketched in code after this excerpt):
1. Choose examples from the unlabelled pool;
2. An annotator labels the chosen unlabelled instances;
3. The chosen examples are appended to the labelled set and deleted from the unlabelled pool;
4. The classifier is trained using the labelled set;
5. The performance of the classifier is estimated;
6. Go to step 1 if the stopping condition is not met.…”
Section: 2 Active Learning Modelling
mentioning confidence: 99%
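The six steps quoted above map directly onto a short pool-based loop. Below is a minimal sketch, not the cited paper's implementation: it uses scikit-learn's LogisticRegression as a stand-in classifier, a least-confidence selection criterion, and an oracle array `y_pool_oracle` standing in for the human annotator; all of these names and choices are assumptions made for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

def active_learning_loop(X_labelled, y_labelled, X_pool, y_pool_oracle,
                         X_test, y_test, batch_size=10, max_rounds=20,
                         target_accuracy=0.95):
    # y_pool_oracle plays the annotator of step 2; in a real deployment
    # these labels would not be available up front.
    model = LogisticRegression(max_iter=1000)
    model.fit(X_labelled, y_labelled)  # initial model from the seed set
    for _ in range(max_rounds):
        if len(X_pool) == 0:
            break
        # Step 1: choose examples from the unlabelled pool
        # (least-confidence criterion; any selection criterion works here).
        probs = model.predict_proba(X_pool)
        chosen = np.argsort(1.0 - probs.max(axis=1))[-batch_size:]
        # Step 2: the annotator labels the chosen instances (oracle here).
        new_labels = y_pool_oracle[chosen]
        # Step 3: append them to the labelled set, delete them from the pool.
        X_labelled = np.vstack([X_labelled, X_pool[chosen]])
        y_labelled = np.concatenate([y_labelled, new_labels])
        keep = np.setdiff1d(np.arange(len(X_pool)), chosen)
        X_pool, y_pool_oracle = X_pool[keep], y_pool_oracle[keep]
        # Step 4: retrain the classifier on the enlarged labelled set.
        model.fit(X_labelled, y_labelled)
        # Step 5: estimate performance on held-out data.
        accuracy = accuracy_score(y_test, model.predict(X_test))
        # Step 6: back to step 1 unless the stopping condition is met.
        if accuracy >= target_accuracy:
            break
    return model
```

Keeping the stopping check last mirrors step 6 of the quoted procedure: the loop always retrains and re-evaluates after each batch of annotations, so the estimate in step 5 always reflects the newest labelled set.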
“…Most work on active learning from sequential data focuses on simple shallow classifiers (e.g., linear separators) and neglects the issues of learning in the wild (Hoi et al 2021). Active learning with partial labels has been studied in the context of Bayesian networks (Tong and Koller 2000), max-margin predictors (Roth and Small 2006; Small and Roth 2010), and other models (Sun, Laddha, and Batra 2015; Mo, Scott, and Downey 2016; Liu and Ferrari 2017; Khodabandeh et al 2017; Hu et al 2018; Behpour, Liu, and Ziebart 2019; Ning et al 2019; Nakano, Cerri, and Vens 2020). Closest to our work, Platanios, Kapoor, and Horvitz (2017) also develop selection heuristics for picking informative example/sub-label pairs based on entropy reduction.…”
Section: Related Work
mentioning confidence: 86%
“…Zhang et al propose HiMeCat [23], an embedding-based generative framework with a joint representation learning module to categorize documents into a given label hierarchy under weak supervision. The work in [26] proposes an active learning approach for the HMC problem. Aly et al propose a capsule-network-based method [27] for the HMC problem.…”
Section: Related Work
mentioning confidence: 99%