2021 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn52387.2021.9534284
One Line To Rule Them All: Generating LO-Shot Soft-Label Prototypes

Abstract: Increasingly large datasets are rapidly driving up the computational costs of machine learning. Prototype generation methods aim to create a small set of synthetic observations that accurately represent a training dataset but greatly reduce the computational cost of learning from it. Assigning soft labels to prototypes can allow increasingly small sets of prototypes to accurately represent the original training dataset. Although foundational work on 'less than one'-shot learning has proven the theoretical plau…

Cited by 2 publications (2 citation statements)
References 24 publications
“…Previous research in both human and machine learning has treated one-shot learning, where the participant must learn a new concept from a single example, as the limit on sample-efficiency in supervised learning settings (Tiedemann et al., 2022; Fei-Fei et al., 2006a). Recent research in machine learning has shown that it is theoretically possible to learn more novel concepts than the number of presented examples, so-called less-than-one-shot (LO-shot) learning (Sucholutsky & Schonlau, 2021a; Sucholutsky et al., 2021), by associating examples with "soft labels" that describe their closeness to each concept, as opposed to the traditionally used "hard labels" that associate each example with a single concept. LO-shot learning has recently been replicated in humans in a study showing that participants presented with two examples of novel stimuli, paired with soft labels relating them to three categories, could infer the structure of those three categories (Malaviya et al., 2022).…”
Section: Introduction
confidence: 99%
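The LO-shot idea described in the citation statement above can be sketched in a few lines: two prototype points carrying soft labels over three classes induce three class regions, so a distance-weighted vote learns more concepts than it was given examples. This is a minimal illustrative sketch in the spirit of the paper's soft-label prototype kNN, not the authors' implementation; the prototype positions and soft-label values are invented for illustration.

```python
import numpy as np

# Two prototype points on a line, each with a soft label over THREE classes.
# This illustrates the LO-shot idea: 2 examples can define 3 class regions.
prototypes = np.array([[0.0], [1.0]])
soft_labels = np.array([
    [0.6, 0.4, 0.0],   # prototype at x=0 leans toward class 0
    [0.0, 0.4, 0.6],   # prototype at x=1 leans toward class 2
])

def classify(x):
    """Distance-weighted soft-label vote (a simplified SLaPkNN-style rule)."""
    d = np.linalg.norm(prototypes - np.asarray(x), axis=1)
    w = 1.0 / (d + 1e-9)          # closer prototypes count more
    scores = w @ soft_labels      # aggregate the soft labels
    return int(np.argmax(scores))

print(classify([-0.5]), classify([0.5]), classify([1.5]))  # → 0 1 2
```

Note that the query at x=0.5 is assigned class 1 even though neither prototype is primarily labeled with it: the middle class emerges from the overlap of the two soft labels, which is precisely the "more classes than examples" effect.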
“…There is some limited empirical evidence that machines (Sucholutsky & Schonlau, 2021b; Sucholutsky et al., 2021) and humans (Malaviya et al., 2022) can perform LO-shot learning by leveraging soft labels that encode the relationship between each example and every known class. While people can somewhat understand and produce soft labels, and can even use them to communicate beliefs to AI systems (Collins, Barker, et al., 2023), they often find these labels unintuitive and difficult to interpret even in simple visual settings.…”
Section: Introduction
confidence: 99%