2016
DOI: 10.1016/j.knosys.2015.11.005

Exploring events and distributed representations of text in multi-document summarization

Abstract: In this article, we explore an event detection framework to improve multi-document summarization. Our approach is based on a two-stage single-document method that extracts a collection of key phrases, which are then used in a centrality-as-relevance passage retrieval model. We explore how to adapt this single-document method to multi-document summarization methods that can exploit event information. The event detection method is based on Fuzzy…
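The abstract's two-stage idea, extracting key phrases and then scoring passages by centrality and relevance, can be illustrated with a minimal sketch. This is not the paper's actual model; the Jaccard-overlap centrality measure, the key-phrase boost, and its weight are assumptions made purely for illustration.

```python
def score_passages(passages, key_phrases, boost=1.5):
    """Rank passages by pairwise-overlap centrality with a key-phrase boost.

    Centrality here is the mean Jaccard similarity of a passage's word set
    to every other passage; passages containing a key phrase get their
    score multiplied by `boost` (an illustrative stand-in for relevance).
    """
    tokenized = [set(p.lower().split()) for p in passages]
    scored = []
    for i, words in enumerate(tokenized):
        sims = [
            len(words & other) / len(words | other)
            for j, other in enumerate(tokenized)
            if j != i and (words | other)
        ]
        centrality = sum(sims) / len(sims) if sims else 0.0
        if any(kp.lower() in passages[i].lower() for kp in key_phrases):
            centrality *= boost
        scored.append((centrality, passages[i]))
    return [p for _, p in sorted(scored, reverse=True)]
```

A summarizer built this way would keep the top-ranked passages up to a length budget; the paper's actual retrieval model is more sophisticated than this overlap-based sketch.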

Cited by 37 publications (20 citation statements)
References 20 publications
“…Deep learning technology is applied to common NLP (natural language processing) tasks, such as semantic parsing [43], information retrieval [44, 45], semantic role labeling [46, 47], sentiment analysis [48], question answering [49–52], machine translation [53–56], text classification [57], summarization [58, 59], and text generation [60], as well as information extraction, including named entity recognition [61, 62], relation extraction [63–67], and event detection [68–70]. Convolutional neural networks and recurrent neural networks are two popular models employed in this work [71].…”
Section: Review
confidence: 99%
“…In recent years, research has also focused on distributed text representations, which use embedding models to identify important text features by eliminating less crucial and redundant pieces of information [92]. This allows modelling representations that optimise the trade-off between training performance and domain portability [60].…”
Section: Website Content Modelling
confidence: 99%
“…These powerful, efficient models have shown very promising results in capturing both semantic and syntactic relationships between words in large-scale text corpora, and have obtained state-of-the-art results on many NLP tasks. Recently, the concept of embedding has been extended to many applications, including sentence and paragraph representation [11], summarization [21], question answering [43], recommender systems [34], and so on.…”
Section: Embedding
confidence: 99%
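Extending word embeddings to sentences, as the excerpt above describes, is often done in its simplest form by mean-pooling the word vectors. The sketch below assumes a toy word-vector lookup table for illustration; real systems would use vectors from a trained model.

```python
def sentence_embedding(sentence, word_vectors, dim=3):
    """Represent a sentence as the mean of its known words' vectors.

    Words missing from `word_vectors` are skipped; if no word is known,
    a zero vector of length `dim` is returned.
    """
    vecs = [word_vectors[w] for w in sentence.lower().split() if w in word_vectors]
    if not vecs:
        return [0.0] * dim
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]
```

Mean-pooling ignores word order, which is why the cited works on sentence and paragraph representation [11] learn dedicated sentence-level vectors instead; the sketch only shows the basic idea of composing word embeddings.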