For deep learning, size is power. Massive neural nets trained on broad data for a spectrum of tasks are at the forefront of artificial intelligence. These foundation models, or 'Jacks of All Trades' (JATs), are gaining importance as drivers of deep learning advancements when fine-tuned for downstream tasks. However, environments with tight resource constraints, changing objectives and intentions, or varied task requirements could limit the real-world utility of a single JAT. Hence, in tandem with current trends towards building increasingly large JATs, this paper conducts an initial exploration into concepts underlying the creation of diverse sets of compact machine learning models. We formulate the Set of Sets, composed of many smaller and specialized models, to simultaneously satisfy many task settings and environmental conditions. A means of arriving at such a set tractably, in a single pass of a neuroevolutionary multitasking algorithm, is presented for the first time, bringing us closer to models that are collectively 'Masters of All Trades'.
These are the results of an online replication of Saffran, Newport, & Aslin (1996), Word segmentation: The role of distributional cues, Journal of Memory and Language, 35, 606-621. This replication follows two online replications and an in-lab replication of the same experiment (Hartshorne, 2017, Replication of Saffran, Newport, & Aslin (1996), Word segmentation: The role of distributional cues, Exp. 1, PsyArXiv, doi:10.17605/OSF.IO/E5C64).
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context of the citation and indicate whether the citing article provides supporting or contrasting evidence. scite is used by students and researchers around the world and is funded in part by the National Science Foundation and by the National Institute on Drug Abuse of the National Institutes of Health.