DGI: Recognition of Textual Entailment via dynamic gate Matching (2020)
DOI: 10.1016/j.knosys.2020.105544

Cited by 16 publications (7 citation statements); references 4 publications.
“…The collaborative filtering recommendation combined with the neural network [2] (e.g., CNN, RNN, and CDAE) alleviates this problem. Besides, taking advantage of the productive relationships in social networks [3][4][5] can effectively solve the cold-start problem [6,7], but there is a malicious fraud problem caused by distrusting users.…”
Section: Introduction
Confidence: 99%
“…The autoregressive (AR) model is commonly used for forecasting univariate time series, and VAR is a generalisation of the AR model to multiple time series, making it suitable for MIMO forecasting. (d) LSTM: a prominent RNN extension with a forget gate, an input gate, and an output gate. LSTM is widely used as a prediction baseline [8, 17, 18, 24, 31]. This method mitigates the short-term memory and vanishing-gradient problems.…”
Section: Results
Confidence: 99%
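The excerpt above names the three LSTM gates (forget, input, output) that let the cell retain or discard memory across time steps. As a minimal illustration only, here is a single-unit LSTM cell step in pure Python with hypothetical toy weights; it is a sketch of the standard gate equations, not the cited papers' models:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """One step of a scalar LSTM cell. w maps gate name -> (w_x, w_h, b)."""
    f = sigmoid(w["f"][0] * x + w["f"][1] * h_prev + w["f"][2])   # forget gate
    i = sigmoid(w["i"][0] * x + w["i"][1] * h_prev + w["i"][2])   # input gate
    o = sigmoid(w["o"][0] * x + w["o"][1] * h_prev + w["o"][2])   # output gate
    g = math.tanh(w["g"][0] * x + w["g"][1] * h_prev + w["g"][2]) # candidate memory
    c = f * c_prev + i * g      # cell state: keep part of old memory, admit new
    h = o * math.tanh(c)        # hidden state exposed as the step's output
    return h, c

# Toy run over a short series (weights are arbitrary placeholders)
w = {k: (0.5, 0.5, 0.0) for k in ("f", "i", "o", "g")}
h, c = 0.0, 0.0
for x in [0.1, 0.2, 0.3]:
    h, c = lstm_step(x, h, c, w)
```

Because the forget gate multiplies the previous cell state rather than repeatedly squashing it, gradients flow through `c` more directly, which is the mechanism behind the "vanishing gradient" remedy the excerpt mentions.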
“…First, the infectiousness formula in the FIR layer can be modified to consider not just the quantity of post-replies but also their quality. For instance, natural language analysis [46] could be used to score the quality of posts. Second, the network model can be extended to have more than two layers.…”
Section: Discussion
Confidence: 99%