We present a novel framework for transfer learning between few-shot source and target domains that uses synthetic attributes in addition to convolutional neural networks pre-trained on larger image corpora. These corpora contain no labeled instances of the target domains, though they may contain instances of their superclasses. Using probabilistic inference over predicted classes and inferred attributes, we develop a meta-learning ensemble method that builds upon that of [10]. This paper introduces BCAT (Between-Class Attribute Transfer), a framework that adapts inter-class attribute transfer, originally designed for zero-shot learning (ZSL), and fuses it with transfer learning and probabilistic priors, thereby extending and improving upon existing deep meta-learning models for few-shot learning (FSL). We show how probabilistic learning architectures can be adapted to use state-of-the-field deep learning components within this framework. We apply our technique to four baseline convnet-based FSL ensembles and boost accuracy by up to 6.24% for 1-shot learning and up to 4.11% for 5-shot learning on the mini-ImageNet dataset, the best of which is competitive with the current state of the field; using the same technique, we improve accuracy by up to 7.83% for 1-shot learning and up to 3.67% for 5-shot learning on the tiered-ImageNet dataset.
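The abstract's central mechanism, probabilistic inference over predicted classes and inferred attributes, can be illustrated with a minimal sketch. The function below is a hypothetical illustration, not the paper's actual method: it fuses a pretrained convnet's softmax class probabilities with attribute-based likelihoods via a Bayesian product, assuming attributes are conditionally independent given the class. All names (`fuse`, `class_attr`) and the independence assumption are ours, introduced only for illustration.

```python
import numpy as np

def fuse(class_probs, attr_probs, class_attr):
    """Combine class predictions with attribute evidence (illustrative only).

    class_probs: (C,) softmax over C classes from a pretrained convnet.
    attr_probs:  (A,) predicted probability that each attribute is present.
    class_attr:  (C, A) binary matrix; class_attr[c, a] = 1 if class c
                 is defined to have attribute a.
    """
    # Likelihood of the predicted attribute vector under each class,
    # assuming attributes are conditionally independent given the class:
    # p(a | c) = prod_a [attr_probs[a] if class has a else 1 - attr_probs[a]]
    lik = np.prod(np.where(class_attr == 1, attr_probs, 1.0 - attr_probs), axis=1)
    # Posterior proportional to prior class prediction times attribute likelihood.
    post = class_probs * (lik + 1e-8)
    return post / post.sum()

# Two classes with opposite attribute signatures; the attribute predictor
# strongly favors attribute 0, so the posterior shifts toward class 0.
posterior = fuse(np.array([0.5, 0.5]),
                 np.array([0.9, 0.1]),
                 np.array([[1, 0], [0, 1]]))
```

In this toy run the classifier alone is undecided (0.5/0.5), but the attribute evidence breaks the tie, which is the general flavor of combining discriminative predictions with attribute priors.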