Graph neural networks (GNNs) and the label propagation algorithm (LPA) are both message passing algorithms, which have achieved superior performance in semi-supervised classification. GNNs perform feature propagation through a neural network to make predictions, while LPA propagates labels across the graph adjacency matrix to obtain results. However, there is still no effective way to directly combine these two kinds of algorithms. To address this issue, we propose a novel Unified Message Passing Model (UniMP) that can incorporate feature and label propagation at both training and inference time. First, UniMP adopts a Graph Transformer network, taking feature embeddings and label embeddings as input information for propagation. Second, to train the network without overfitting to self-loop input label information, UniMP introduces a masked label prediction strategy, in which some percentage of the input label information is masked at random and then predicted. UniMP conceptually unifies feature propagation and label propagation and is empirically powerful. It obtains new state-of-the-art semi-supervised classification results on the Open Graph Benchmark (OGB).
Graph convolutional networks (GCNs) and the label propagation algorithm (LPA) are both message passing algorithms, which have achieved superior performance in semi-supervised classification. GCNs perform feature propagation through a neural network to make predictions, while LPA propagates labels across the graph adjacency matrix to obtain results. However, there is still no good way to combine these two kinds of algorithms. In this paper, we propose a new Unified Message Passing model (UniMP) that can incorporate feature propagation and label propagation with a shared message passing network, providing better performance in semi-supervised classification. First, we adopt a Graph Transformer network, jointly using label embeddings, to propagate both feature and label information. Second, to train UniMP without overfitting to self-loop label information, we propose a masked label prediction method, in which the labels of some percentage of training examples are simply masked at random and then predicted. UniMP conceptually unifies feature propagation and label propagation and is empirically powerful. It obtains new state-of-the-art semi-supervised classification results on the Open Graph Benchmark (OGB). Our implementation is available online at https://github.com/PaddlePaddle/PGL/tree/main/ogb_examples/nodeproppred/unimp.
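The masked label prediction idea described in the two abstracts above can be illustrated with a short, self-contained sketch. The code below is an illustrative simplification rather than the UniMP architecture itself: it replaces the Graph Transformer with plain linear layers applied over a row-normalized adjacency matrix, and all names and hyperparameters (hidden size, mask rate, graph size) are assumptions made for the example. It shows the two ingredients the abstracts describe: label embeddings added to node features as propagation input, and random masking of training labels that the model is then asked to predict.

```python
# Minimal sketch of joint feature/label propagation with masked label prediction.
# Assumed, illustrative setup; not the authors' exact UniMP implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MaskedLabelPropagationNet(nn.Module):
    def __init__(self, in_dim, num_classes, hidden=64, num_layers=2):
        super().__init__()
        self.feat_proj = nn.Linear(in_dim, hidden)
        # Labels are embedded into the feature space; the extra index means "label unknown".
        self.label_emb = nn.Embedding(num_classes + 1, hidden)
        self.layers = nn.ModuleList([nn.Linear(hidden, hidden) for _ in range(num_layers)])
        self.out = nn.Linear(hidden, num_classes)
        self.unknown = num_classes

    def forward(self, x, norm_adj, labels, visible_mask):
        # Hide labels that should not be seen (unlabeled nodes + randomly masked ones).
        lab = labels.clone()
        lab[~visible_mask] = self.unknown
        h = self.feat_proj(x) + self.label_emb(lab)   # feature + label input
        for layer in self.layers:
            # One round of message passing over the normalized adjacency matrix.
            h = F.relu(layer(norm_adj @ h))
        return self.out(h)


def train_step(model, optimizer, x, norm_adj, labels, train_mask, mask_rate=0.5):
    # Masked label prediction: randomly hide a fraction of the training labels,
    # feed the remaining ones as input, and predict the hidden ones.
    keep = torch.rand(labels.shape[0], device=labels.device) >= mask_rate
    visible = train_mask & keep
    masked = train_mask & ~keep
    logits = model(x, norm_adj, labels, visible)
    loss = F.cross_entropy(logits[masked], labels[masked])
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


# Toy usage on a random graph (all sizes are arbitrary).
n, in_dim, num_classes = 100, 16, 5
x = torch.randn(n, in_dim)
adj = (torch.rand(n, n) < 0.05).float()
norm_adj = adj / adj.sum(dim=1, keepdim=True).clamp(min=1)  # row-normalized adjacency
labels = torch.randint(num_classes, (n,))
train_mask = torch.rand(n) < 0.6
model = MaskedLabelPropagationNet(in_dim, num_classes)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
print(train_step(model, opt, x, norm_adj, labels, train_mask))
```

At inference time, all observed training labels can be fed as visible input, so feature propagation and label propagation happen in the same forward pass.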
Our previous studies indicated that tomato miR482b could negatively regulate the resistance of tomato to Phytophthora infestans and that the expression of miR482b was decreased after inoculation with P. infestans. However, the mechanism by which the accumulation of miR482b is suppressed remains unclear. In this study, we wrote a program to identify 89 long noncoding RNA (lncRNA)-originated endogenous target mimics (eTMs) for 46 miRNAs from our RNA-Seq data. Three tomato lncRNAs, lncRNA23468, lncRNA01308 and lncRNA13262, contained conserved eTM sites for miR482b. When lncRNA23468 was overexpressed in tomato, miR482b expression was significantly decreased, and the expression of the target genes, NBS-LRRs, was significantly increased, resulting in enhanced resistance to P. infestans. Silencing lncRNA23468 in tomato led to increased accumulation of miR482b and decreased accumulation of NBS-LRRs, as well as reduced resistance to P. infestans. In addition, the accumulation of both miR482b and NBS-LRRs was not significantly changed in tomato plants that overexpressed lncRNA23468 with a mutated eTM site. Using the virus-induced gene silencing (VIGS) system, a target gene of miR482b, Solyc02g036270.2, was silenced. After inoculation with P. infestans, the disease symptoms of the VIGS-Solyc02g036270.2 tomato plants were consistent with those of tomato plants in which lncRNA23468 was silenced: more severe disease symptoms were found in the modified plants than in the control plants. Our results demonstrate that lncRNAs functioning as eTMs may modulate the effects of miRNAs in tomato and provide insight into how the lncRNA23468-miR482b-NBS-LRR module regulates tomato resistance to P. infestans.
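The computational step mentioned above (a program identifying lncRNA-originated eTMs for miRNAs) can be illustrated with a toy scanner. The pairing rules below (near-perfect reverse complementarity with a 3-nt bulge in the eTM opposite the central region of the miRNA, G:U wobbles allowed, at most a few mismatches elsewhere) are a common simplification of published eTM criteria and are not necessarily the thresholds used in this study; the sequences are made up and are not the real miR482b or lncRNA23468.

```python
# Toy eTM site scanner under assumed, simplified pairing rules.
COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}


def pairs(a, b):
    """True if RNA bases a and b form a Watson-Crick or G:U wobble pair."""
    return COMPLEMENT.get(a) == b or {a, b} == {"G", "U"}


def find_etm_sites(lncrna, mirna, bulge=3, max_mismatches=3):
    """Yield start positions of candidate eTM sites in an lncRNA (both 5'->3' RNA strings)."""
    m = len(mirna)
    site_len = m + bulge
    for start in range(len(lncrna) - site_len + 1):
        site = lncrna[start:start + site_len]
        # The site pairs antiparallel to the miRNA, so reverse it; keep a
        # 3-nt bulge in the site opposite the miRNA's central positions.
        rev = site[::-1]
        paired = rev[:9] + rev[9 + bulge:]  # drop the bulged-out bases
        mismatches = sum(0 if pairs(mi, si) else 1 for mi, si in zip(mirna, paired))
        if mismatches <= max_mismatches:
            yield start


# Made-up example sequences (not the real miR482b or lncRNA23468).
mir = "UCUUCCCUACUCCUCCCAUACC"           # hypothetical 22-nt miRNA
lnc = "AAAGGUAUGGGAGGAGCCCUAGGGAAGAAAA"  # hypothetical lncRNA fragment
print(list(find_etm_sites(lnc, mir)))    # -> [3]
```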
Neural retrievers based on pre-trained language models (PLMs), such as dual-encoders, have achieved promising performance on the task of open-domain question answering (QA). Their effectiveness can be further improved to new state-of-the-art levels by incorporating cross-architecture knowledge distillation. However, most existing studies directly apply conventional distillation methods and fail to consider the particular situation in which the teacher and student have different structures. In this paper, we propose a novel distillation method that significantly advances cross-architecture distillation for dual-encoders. Our method 1) introduces a self on-the-fly distillation method that can effectively distill a late-interaction model (i.e., ColBERT) to a vanilla dual-encoder, and 2) incorporates a cascade distillation process to further improve performance with a cross-encoder teacher. Extensive experiments are conducted to validate that our proposed solution outperforms strong baselines and establishes a new state of the art on open-domain QA benchmarks.
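The "self on-the-fly distillation" idea, distilling a ColBERT-style late-interaction score into a vanilla dual-encoder score computed from the same forward pass, can be sketched as follows. The shared token embeddings, mean pooling, in-batch negatives, and KL-divergence objective below are illustrative assumptions rather than the authors' exact recipe, and the cascade step with a cross-encoder teacher is omitted.

```python
# Minimal sketch of self on-the-fly distillation from a late-interaction head
# to a dual-encoder head; assumed setup, not the paper's exact training recipe.
import torch
import torch.nn.functional as F


def late_interaction_score(q_tok, d_tok):
    """ColBERT-style MaxSim: sum over query tokens of the max similarity to doc tokens.
    q_tok: [B, Lq, H] query token embeddings; d_tok: [B, Ld, H] doc token embeddings.
    Returns a [B, B] in-batch score matrix."""
    sim = torch.einsum("qih,djh->qidj", q_tok, d_tok)  # [B, Lq, B, Ld]
    return sim.max(dim=-1).values.sum(dim=1)           # max over doc tokens, sum over query tokens


def dual_encoder_score(q_tok, d_tok):
    """Vanilla dual-encoder: dot product of pooled (here: mean-pooled) embeddings."""
    return q_tok.mean(dim=1) @ d_tok.mean(dim=1).t()   # [B, B]


def self_distillation_loss(q_tok, d_tok, labels):
    """In-batch loss: cross-entropy for both heads plus a KL term pushing the
    dual-encoder score distribution toward the (detached) late-interaction one."""
    s_late = late_interaction_score(q_tok, d_tok)
    s_dual = dual_encoder_score(q_tok, d_tok)
    ce = F.cross_entropy(s_late, labels) + F.cross_entropy(s_dual, labels)
    kl = F.kl_div(F.log_softmax(s_dual, dim=-1),
                  F.softmax(s_late.detach(), dim=-1),
                  reduction="batchmean")
    return ce + kl


# Toy usage with random token embeddings standing in for PLM outputs.
B, Lq, Ld, H = 4, 8, 32, 16
q_tok, d_tok = torch.randn(B, Lq, H), torch.randn(B, Ld, H)
labels = torch.arange(B)  # the i-th passage is the positive for the i-th query
print(self_distillation_loss(q_tok, d_tok, labels))
```

Because both scores are computed from the same encoder outputs, the "teacher" signal comes for free in every training step instead of requiring a separately trained model.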