A feedforward neural network ensemble trained through metaheuristic algorithms has been proposed by researchers to produce a group of optimal neural networks. This method, however, has proven to be very time-consuming during the optimization process. To overcome this limitation, we propose a metaheuristic-based learning algorithm for building an ensemble system with a shorter training time. In our proposed method, a master-slave metaheuristic algorithm is employed in the optimization process to produce a group of heterogeneous feedforward neural networks: the global search operations are executed on the master, and the objective-evaluation tasks are distributed to the slaves (workers). To reduce evaluation costs, the entire training dataset is randomly divided into several equal-sized disjoint subsets. Each subset is randomly paired with one of the remaining subsets and distributed to a worker for objective evaluation. Following the optimization process, representative candidate solutions (individuals) from the entire population are selected to serve as the base components of the ensemble system. The performance of the proposed method has been compared with that of other state-of-the-art techniques on 31 benchmark regression datasets taken from public repositories. The experimental results show that the proposed method not only reduces the computational time but also achieves significantly better prediction accuracy. Moreover, the proposed method achieved promising results on two real-world applications: predicting the release year of a song from a subset of the Million Song Dataset and predicting buzz on Twitter.
INDEX TERMS Neural network with random weights, feedforward neural network, ensemble learning, metaheuristic optimization, hybrid learning, encoding scheme, master-slave model, parallel computing.
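The data-partitioning step described in the abstract (equal disjoint subsets, each randomly paired with another subset before dispatch to a worker) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation; the function name `partition_and_pair` and the fixed-seed RNG are assumptions for the example.

```python
import random

def partition_and_pair(n_samples, n_workers, seed=0):
    """Split sample indices into equal disjoint subsets, then randomly
    pair each subset with a different one. Each pair forms the
    evaluation data that would be sent to one worker."""
    rng = random.Random(seed)
    idx = list(range(n_samples))
    rng.shuffle(idx)                      # random assignment to subsets
    size = n_samples // n_workers
    subsets = [idx[i * size:(i + 1) * size] for i in range(n_workers)]
    pairs = []
    for i, s in enumerate(subsets):
        # pair subset i with a randomly chosen *different* subset
        j = rng.choice([k for k in range(n_workers) if k != i])
        pairs.append(s + subsets[j])
    return pairs

# Example: 100 samples, 4 workers -> 4 evaluation sets of 50 indices each
work = partition_and_pair(100, 4)
```

Because each worker evaluates only two of the subsets rather than the full training set, the per-candidate evaluation cost drops roughly in proportion to the number of workers, which is the source of the reported speed-up.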
Multi-label learning with emerging new labels is a practical problem that occurs in data streams and has become an important new research issue in machine learning. However, existing models for this problem require long training times, and research on it remains limited. To address these issues, this paper presents an incremental kernel extreme learning machine for multi-label learning with emerging new labels, consisting of two parts: a novelty detector and a multi-label classifier. The detector, whose threshold parameters require no user tuning, was developed to identify instances carrying new labels. A new incremental multi-label classifier and its improved version were developed to predict a label set for each instance; they can add output units incrementally and update themselves on unlabeled instances. Comprehensive evaluations of the proposed method were carried out on multi-label classification problems with emerging new labels against competing algorithms, revealing the promising performance of the proposed method.
INDEX TERMS Multi-label classification, multi-label learning with emerging new labels, extreme learning machine, class incremental learning, novelty detection, data stream classification.
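A detector of the kind the abstract describes can be illustrated with a kernel-similarity novelty check whose threshold is derived from the training data itself rather than set by the user. This is a hedged sketch of the general idea, not the paper's actual detector; the class name `NoveltyDetector`, the RBF kernel, and the leave-one-out threshold rule are all assumptions made for the example.

```python
import math

def rbf(x, y, gamma=0.5):
    """Gaussian (RBF) kernel similarity between two feature vectors."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

class NoveltyDetector:
    """Flags an instance as potentially carrying a new label when its
    maximum kernel similarity to seen data falls below a data-driven
    threshold (no user-set threshold). Illustrative sketch only."""

    def __init__(self, gamma=0.5):
        self.gamma = gamma
        self.memory = []
        self.threshold = None

    def fit(self, X):
        self.memory = list(X)
        # Data-driven threshold: the smallest leave-one-out maximum
        # similarity among the training instances themselves.
        loo_scores = []
        for i, x in enumerate(X):
            sims = [rbf(x, y, self.gamma) for j, y in enumerate(X) if j != i]
            loo_scores.append(max(sims))
        self.threshold = min(loo_scores)
        return self

    def is_novel(self, x):
        """True if x is less similar to every seen instance than any
        seen instance is to its own nearest neighbour."""
        return max(rbf(x, y, self.gamma) for y in self.memory) < self.threshold
```

An instance flagged as novel would then trigger the classifier to add a new output unit for the emerging label, while non-novel instances are routed to the existing incremental classifier.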