2018
DOI: 10.1017/s1351324918000414
Extractive multi-document summarization based on textual entailment and sentence compression via knapsack problem

Abstract: As the amount of data in computer networks grows, searching for and finding suitable information becomes harder for users. Textual documents are among the most widespread forms of information on such networks, so exploring these documents to learn about their content is difficult and sometimes impossible. Multi-document text summarization systems help by producing a summary of a fixed, predefined length while covering the maximum content of the input documents. This paper presents a nov…
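Producing a fixed-length summary that covers the maximum content of the input, as the abstract describes, can be cast as a 0/1 knapsack problem: each sentence has a length (its weight) and a relevance score (its value), and the summary budget is the knapsack capacity. The sketch below is a minimal, hypothetical illustration of that formulation using standard dynamic programming; the sentence scores and word counts are assumed inputs, not the paper's actual scoring method.

```python
# Hypothetical sketch: extractive summarization as a 0/1 knapsack.
# Each sentence is (text, word_count, score); `budget` is the maximum
# total number of words allowed in the summary.
def select_sentences(sentences, budget):
    # dp[w] = (best total score, indices chosen) using at most w words
    dp = [(0.0, [])] * (budget + 1)
    for i, (_, length, score) in enumerate(sentences):
        # iterate weights downward so each sentence is used at most once
        for w in range(budget, length - 1, -1):
            prev_score, prev_idx = dp[w - length]
            if prev_score + score > dp[w][0]:
                dp[w] = (prev_score + score, prev_idx + [i])
    _, chosen = max(dp, key=lambda entry: entry[0])
    return [sentences[i][0] for i in chosen]

sents = [("A", 5, 3.0), ("B", 4, 2.5), ("C", 6, 4.0)]
print(select_sentences(sents, 10))  # -> ['B', 'C'] (score 6.5 within 10 words)
```

A full system in the paper's spirit would also penalize redundancy and use textual entailment when scoring sentences; this sketch captures only the length-constrained selection step.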

Cited by 6 publications (5 citation statements)
References 62 publications (78 reference statements)
“…Moreover, fine-tuning BERT on MNLI or QNLI has obtained comparable performance. These results might be due to the fact that the latter are high-quality labelled corpora designed for textual entailment tasks, which might constitute a class of problems relevant to the text summarization task [72,73]. Moreover, fine-tuning BERT on single-sentence classification tasks (CoLA and SST-2) has shown promising results.…”
Section: Discussion
confidence: 99%
“…The three most essential properties the best summary must have are coverage, non-redundancy, and relevance. To achieve such a summary, the authors [73] [74] used textual entailment relations and sentence compression via the knapsack problem to address the extractive MDS problem.…”
Section: Miscellaneous Methods
confidence: 99%
“…Knight and Marcu (2002) adapt the noisy channel model used in Statistical Machine Translation to develop a sentence-compression method, seen as a first step towards producing summaries automatically. The summarisation process was also framed as an optimisation problem in (Naserasadi, Khosravi, and Sadeghi, 2019), where weights are learnt from the data.…”
Section: Machine Learning Based Methods
confidence: 99%