Existing text sentiment analysis methods mostly rely on large amounts of linguistic knowledge and sentiment resources. This paper proposes a multi-channel convolution and bidirectional GRU capsule network with multi-head attention (AT-MC-BiGRU-Capsule), which replaces scalar neurons with vector neurons and uses capsules to characterize the sentiment of text. In addition, traditional methods cannot extract the multi-level features of a text sequence well. Multi-head attention encodes the dependencies between words and captures sentiment words in the text; a Convolutional Neural Network (CNN) and a Bidirectional Gated Recurrent Unit network (Bi-GRU) extract local features and global semantic features of the text, respectively; and a global average pooling layer is introduced to obtain a more comprehensive multi-level feature representation of the text sequence. Experiments on three English datasets and one Chinese dataset drawn from general sentiment classification corpora achieve better results than other baseline models.
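To make the attention component concrete, here is a minimal NumPy sketch of scaled dot-product multi-head self-attention, one building block the abstract names. The dimensions, the identity (omitted) projection matrices, and the function names are illustrative assumptions for the sketch, not the paper's implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(X, num_heads):
    """Self-attention over a (seq_len, d_model) sequence.

    Learned Q/K/V projection weights are omitted (identity projections)
    to keep the sketch short; a real model would include them.
    """
    seq_len, d_model = X.shape
    d_head = d_model // num_heads
    heads = []
    for h in range(num_heads):
        # Each head attends over its own slice of the model dimension.
        Q = K = V = X[:, h * d_head:(h + 1) * d_head]
        scores = Q @ K.T / np.sqrt(d_head)      # word-to-word dependencies
        heads.append(softmax(scores) @ V)       # weighted mix of values
    return np.concatenate(heads, axis=-1)

X = np.random.default_rng(0).normal(size=(5, 8))  # 5 tokens, d_model = 8
out = multi_head_attention(X, num_heads=2)
print(out.shape)  # (5, 8)
```

In the full model described by the abstract, such attention outputs would be fed alongside the CNN and Bi-GRU channels into global average pooling and a capsule layer.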
Recent years have seen great success in the use of neural seq2seq models on the text-to-SQL task. However, little work has examined how these models generalize to realistic unseen data, which naturally raises a question: does this impressive performance signify a perfect generalization model, or are there still limitations? In this paper, we first diagnose the bottleneck of the text-to-SQL task by providing a new testbed, in which we observe that existing models generalize poorly to rarely-seen data. This analysis motivates a simple but effective auxiliary task, which serves both as a supportive model and as a regularization term for the generation task, improving the models' generalization. Experimentally, we evaluate our models on WikiSQL, a large text-to-SQL dataset. Compared to a strong coarse-to-fine baseline, our models improve accuracy by more than 3% absolute on the whole dataset. More interestingly, on a zero-shot test subset of WikiSQL, our models achieve a 5% absolute accuracy gain over the baseline, clearly demonstrating their superior generalizability.
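A zero-shot testbed of the kind the abstract describes can be built by partitioning test examples according to whether their table was seen during training. The following is a minimal sketch under assumed field names (`table_id`, `question`); the actual WikiSQL schema and the paper's split procedure may differ.

```python
def zero_shot_split(test_examples, train_table_ids):
    """Partition test examples by whether their table appeared in training."""
    seen, unseen = [], []
    for ex in test_examples:
        (seen if ex["table_id"] in train_table_ids else unseen).append(ex)
    return seen, unseen

# Toy illustration with hypothetical table IDs and questions.
train_tables = {"t1", "t2"}
test = [
    {"table_id": "t1", "question": "How many rows match?"},
    {"table_id": "t9", "question": "Which year had the most entries?"},
]
seen, unseen = zero_shot_split(test, train_tables)
print(len(seen), len(unseen))  # 1 1
```

Reporting accuracy separately on the `unseen` partition isolates exactly the rarely-seen-data generalization the paper measures.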
A long-standing goal of Human-Robot Collaboration (HRC) in manufacturing systems is to increase collaborative working efficiency. In line with the Industry 4.0 trend of building smart manufacturing systems, the collaborative robot in an HRC system deserves a better design: one that is more self-organized and attains superhuman proficiency through self-learning. Inspired by the impressive machine learning algorithms developed by Google DeepMind, such as AlphaGo Zero, this paper formats the human-robot collaborative assembly process as a chessboard, where the selection of moves on the chessboard is used as an analogy for the decision-making of both human and robot in the HRC assembly process. To obtain the optimal working-sequence policy that maximizes working efficiency, agents in the system are trained with a self-play algorithm based on reinforcement learning, without guidance or domain knowledge beyond the game rules. A convolutional neural network (CNN) is also trained to predict the distribution of priorities over move selections and whether a working sequence is one that maximizes HRC efficiency. The assembly of a height-adjustable standing desk is used to demonstrate the proposed HRC assembly algorithm and its efficiency in real-time task planning. INDEX TERMS Human-robot collaboration, real-time task planning, reinforcement learning, convolutional neural network.
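The core idea of learning an optimal working sequence by reinforcement can be illustrated with a toy stand-in for the assembly game. The sketch below uses tabular Q-learning (a much simpler method than the paper's self-play plus CNN setup) on three hypothetical desk-assembly tasks; the task names, durations, and the rule that doing "legs" before "top" saves one time unit are all assumptions made up for this example.

```python
import random

# Toy assembly game: 3 tasks with illustrative durations.
TASKS = ("frame", "legs", "top")
DURATION = {"frame": 3, "legs": 2, "top": 2}

def sequence_time(seq):
    """Total time of an assembly sequence (lower is more efficient)."""
    t = sum(DURATION[s] for s in seq)
    if seq.index("legs") < seq.index("top"):
        t -= 1  # human and robot can overlap these two steps
    return t

# Tabular Q-learning over states = tuple of tasks completed so far.
random.seed(0)
Q = {}
alpha, eps = 0.5, 0.2
for _ in range(2000):
    done = ()
    while len(done) < len(TASKS):
        remaining = [t for t in TASKS if t not in done]
        if random.random() < eps:                 # epsilon-greedy exploration
            a = random.choice(remaining)
        else:
            a = max(remaining, key=lambda t: Q.get((done, t), 0.0))
        nxt = done + (a,)
        if len(nxt) == len(TASKS):
            target = -sequence_time(nxt)          # reward = negative total time
        else:
            rest = [t for t in TASKS if t not in nxt]
            target = max(Q.get((nxt, t), 0.0) for t in rest)
        Q[(done, a)] = (1 - alpha) * Q.get((done, a), 0.0) + alpha * target
        done = nxt

# Greedy rollout of the learned policy recovers an efficient sequence.
plan = ()
while len(plan) < len(TASKS):
    rem = [t for t in TASKS if t not in plan]
    plan += (max(rem, key=lambda t: Q.get((plan, t), 0.0)),)
print(plan, sequence_time(plan))
```

In the paper's setting, the tabular Q-function is replaced by a CNN that outputs move priors and a value estimate, and the sequences are generated by self-play rather than epsilon-greedy rollouts; this sketch only shows the underlying sequence-optimization objective.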
scite is a Brooklyn-based organization that helps researchers better discover and understand research articles through Smart Citations: citations that display the context of the citation and indicate whether the cited article provides supporting or contrasting evidence. scite is used by students and researchers around the world and is funded in part by the National Science Foundation and by the National Institute on Drug Abuse of the National Institutes of Health.