2017
DOI: 10.1007/s12559-017-9507-z
Learning from Few Samples with Memory Network

Abstract: Background. Neural networks (NNs) have achieved great success in pattern recognition and machine learning. However, that success usually relies on a sufficiently large number of training samples; when fed a limited data set, an NN's performance may degrade significantly. Methods. This paper proposes a novel NN structure called a Memory Network, inspired by the cognitive mechanism of human beings, who can learn effectively even from limited data. Tak…
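The abstract is truncated here, so the paper's actual architecture is not visible in this excerpt. Purely as an illustration of the memory-based idea it describes (classifying from few samples by comparing against stored exemplars rather than fitting many weights), a minimal external-memory classifier might look like the sketch below. The names `MemoryClassifier`, `write`, and `predict` are illustrative assumptions, not the paper's API.

```python
import numpy as np

class MemoryClassifier:
    """Minimal external-memory classifier: stores (embedding, label)
    pairs and predicts by cosine similarity to the stored slots."""

    def __init__(self):
        self.keys = []    # stored feature vectors (unit-normalized)
        self.labels = []  # corresponding class labels

    def write(self, x, y):
        # Normalize so that a dot product equals cosine similarity.
        v = np.asarray(x, dtype=float)
        self.keys.append(v / np.linalg.norm(v))
        self.labels.append(y)

    def predict(self, x):
        v = np.asarray(x, dtype=float)
        v = v / np.linalg.norm(v)
        sims = np.stack(self.keys) @ v
        return self.labels[int(np.argmax(sims))]

# One-shot usage: a single stored example per class suffices.
mem = MemoryClassifier()
mem.write([1.0, 0.0], "cat")
mem.write([0.0, 1.0], "dog")
print(mem.predict([0.9, 0.2]))  # -> cat
```

Because prediction only requires one comparison per stored slot, such a memory can absorb a new class from a single example, which is the appeal of memory-based designs in low-data regimes.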

Cited by 16 publications (4 citation statements)
References 20 publications
“…For this reason, we plan to explore the combination of advanced feature extraction techniques [52] and the MTL strategy. Furthermore, obtaining a joint recognition with a small amount of labeled data is also valuable, which might be achieved by integrating MTL with the approach of learning from few examples [53].…”
Section: Discussion
confidence: 99%
“…Overall, both models have their strengths and weaknesses, and the choice depends on the specific research question being addressed [26, 41–44].…”
Section: Spiking Neural Network (SNN) Models of Hippocampus Neura…
confidence: 99%
“…Next, to obtain the “generalized Rayleigh quotient” of semi-supervised LDA, the new between-class scatter matrix is defined in Eq. (8) and Eq. (9), where $i$ denotes the element subscript of vector $h_j$.…”
Section: Semi-supervised LDA
confidence: 99%
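The bodies of equations (8) and (9) did not survive extraction, so only the standard supervised form can be shown here for context. In classical LDA the projection $\mathbf{w}$ maximizes the generalized Rayleigh quotient of the between-class scatter $S_b$ and within-class scatter $S_w$, which reduces to a generalized eigenvalue problem; the cited work's semi-supervised redefinition of $S_b$ is an extension of this form, not reproduced here.

```latex
J(\mathbf{w}) \;=\; \frac{\mathbf{w}^{\top} S_b\, \mathbf{w}}{\mathbf{w}^{\top} S_w\, \mathbf{w}},
\qquad
S_b\, \mathbf{w} \;=\; \lambda\, S_w\, \mathbf{w}.
```

The maximizing $\mathbf{w}$ is the eigenvector of $S_w^{-1} S_b$ with the largest eigenvalue $\lambda$.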
“…Nevertheless, sample labeling for RS images is time-consuming, and the robustness of the classifier degrades when the labeled samples are insufficient [7]. Unlabeled samples, however, can be obtained far more easily, and their rich characteristic information helps improve classification performance [8]. Semi-supervised learning can automatically exploit unlabeled samples to improve learning performance built on the labeled samples.…”
Section: Introduction
confidence: 99%
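The quoted passage describes semi-supervised learning in general terms; one of the simplest instances of "automatically exploiting unlabeled samples" is self-training, sketched below with a nearest-centroid base classifier. This is a generic illustration under that assumption, not the method of the cited work.

```python
import numpy as np

def self_train(X_lab, y_lab, X_unlab, rounds=3):
    """Self-training: fit centroids on labeled data, pseudo-label the
    unlabeled points by nearest centroid, then refit on the union."""
    X_lab = np.asarray(X_lab, dtype=float)
    y_lab = np.asarray(y_lab)
    U = np.asarray(X_unlab, dtype=float)
    classes = np.unique(y_lab)
    X, y = X_lab, y_lab
    for _ in range(rounds):
        centroids = np.stack([X[y == c].mean(axis=0) for c in classes])
        d = np.linalg.norm(U[:, None, :] - centroids[None, :, :], axis=2)
        pseudo = classes[np.argmin(d, axis=1)]
        # Refit on labeled data plus current pseudo-labels.
        X = np.concatenate([X_lab, U])
        y = np.concatenate([y_lab, pseudo])
    centroids = np.stack([X[y == c].mean(axis=0) for c in classes])

    def predict(Z):
        Z = np.asarray(Z, dtype=float)
        d = np.linalg.norm(Z[:, None, :] - centroids[None, :, :], axis=2)
        return classes[np.argmin(d, axis=1)]

    return predict

# Two labeled points, four unlabeled: the unlabeled cluster pulls each
# centroid toward the true class mean.
predict = self_train([[0.0], [1.0]], [0, 1], [[0.1], [0.2], [0.9], [0.8]])
print(predict([[0.15], [0.85]]))  # -> [0 1]
```

The unlabeled points shift the centroids from 0.0 and 1.0 to roughly 0.1 and 0.9, which is exactly the "rich characteristic information" the passage refers to.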