Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics 2019
DOI: 10.18653/v1/p19-1588
Using Human Attention to Extract Keyphrase from Microblog Post

Abstract: This paper studies automatic keyphrase extraction on social media. Previous works have achieved promising results, but they neglect human reading behavior during keyphrase annotation. Human attention is a crucial element of reading behavior: it reveals the relevance of words to the main topics of the target text. Thus, this paper aims to integrate human attention into keyphrase extraction models. First, human attention is represented by the reading duration estimated from an eye-tracking corpus. T…

Cited by 9 publications (3 citation statements)
References 14 publications (24 reference statements)
“…Apart from feature fusion, the attention mechanism has been proven as an effective way to further improve the neural model's performance on keyphrase prediction. Yingyi Zhang and Chengzhi Zhang [34] consolidated the attention layer with the RNN model to get a better result than a single RNN.…”
Section: Keyphrase Extraction
confidence: 99%
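The statement above describes consolidating an attention layer with an RNN so the model can weight tokens by their relevance before predicting keyphrases. A minimal NumPy sketch of this idea, dot-product attention pooling over RNN hidden states, is shown below; the function names, dimensions, and the learned query vector are illustrative assumptions, not the cited paper's implementation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_pool(hidden_states, query):
    # hidden_states: (T, d) per-token RNN outputs; query: (d,) learned vector.
    scores = hidden_states @ query       # (T,) relevance score per token
    weights = softmax(scores)            # attention distribution over tokens
    context = weights @ hidden_states    # (d,) attention-weighted summary
    return weights, context

# Toy example: 5 tokens with 8-dimensional hidden states.
rng = np.random.default_rng(0)
H = rng.normal(size=(5, 8))
q = rng.normal(size=8)
w, c = attention_pool(H, q)
```

A keyphrase tagger would feed the weighted states (or the weights themselves) into a per-token classifier; in the human-attention variant, the learned weights can additionally be supervised with eye-tracking reading durations.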
“…
- Sentence Embeddings [29]: includes more semantic information
- Pre-trained embedding, EmbedRank [29]: includes more semantic information
- RNN, RNN-based model [30]: includes context information
- Graph Convolutional Networks (GCN), DivGraphPointer [31]: combines a graph-based method with a convolutional network
- GRU + human attention, human-attention GRU [34]: includes context information and focuses on key information
- BERT + attention, AttentionRank [35]: uses a pre-trained language model, so no model needs to be trained from scratch…”
Section: Keyphrase Extraction, Sentence Embedding
confidence: 99%
“…Most previous work in gaze-supported NLP has used gaze as an input feature, e.g. for syntactic sequence labeling [36], classifying referential versus non-referential use of pronouns [82], reference resolution [30], key phrase extraction [86], or prediction of multi-word expressions [64]. Recently, Hollenstein et al [29] proposed to build a lexicon of gaze features given word types, overcoming the need for gaze data at test time.…”
Section: Gaze Integration in Neural Network Architectures
confidence: 99%