2020
DOI: 10.1109/taslp.2020.3012060
Improving Sequence Modeling Ability of Recurrent Neural Networks via Sememes

Cited by 18 publications (23 citation statements)
References 32 publications
“…We use the official implementation of SememeCell (Qin et al, 2020) embedding size, we choose the medium version of BERT, which has 512-dimensional hidden vectors and 8 layers. As for evaluation metrics, we use accuracy for both NLI and sentiment analysis.…”
Section: Experimental Settings
confidence: 99%
“…When deep learning becomes the mainstream approach of NLP, the usefulness of HowNet is also proved in diverse tasks including word representation learning (Sun and Chen, 2016; Niu et al, 2017), language modeling (Gu et al, 2018), semantic composition (Qi et al, 2019a), sequence modeling (Qin et al, 2020), reverse dictionary, word sense disambiguation (Hou et al, 2020), textual adversarial attacking and backdoor attacking (Qi et al, 2021).…”
Section: HowNet and Its Applications
confidence: 99%
“…It pre-defines a set of about 2,000 sememes and uses them to annotate senses of more than 100,000 Chinese words and phrases. In recent years, HowNet has been successfully applied to diverse natural language processing tasks such as language modeling (Gu et al, 2018), semantic composition (Qi et al, 2019a), sequence modeling (Qin et al, 2020), textual adversarial attack (Zang et al, 2020) and reverse dictionary.…”
Section: Introduction to HowNet
confidence: 99%