Proceedings of the 18th International Conference on World Wide Web 2009
DOI: 10.1145/1526709.1526720

Enhancing diversity, coverage and balance for summarization through structure learning

Cited by 99 publications (80 citation statements)
References 22 publications
“…DivRank (Mei et al, 2010) is a generic graph ranking model that aims to balance high information coverage and low redundancy in top ranking vertices, which are also two key requirements for choosing salient summarization sentences (Li et al, 2009;Liu et al, 2015). Based on that, we present a model to rank and select salient messages from leader set V L to form a summary.…”
Section: Basic-leadsum Model
confidence: 99%
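The DivRank model referenced above is a vertex-reinforced random walk: a node's transition probability is boosted by the score mass it has already accumulated, so the ranking concentrates on central yet mutually non-redundant nodes. A minimal sketch of that idea follows; the function name, parameter defaults, and normalization details are illustrative simplifications, not the exact formulation of Mei et al. (2010).

```python
import numpy as np

def divrank(W, alpha=0.85, iters=100):
    """Simplified DivRank-style ranking: a random walk whose transition
    probabilities are reinforced by the walker's current score vector.
    W is a nonnegative adjacency matrix with nonzero row sums."""
    n = W.shape[0]
    # "Organic" transition matrix: row-normalized adjacency.
    p0 = W / W.sum(axis=1, keepdims=True)
    p = np.full(n, 1.0 / n)  # start from the uniform distribution
    for _ in range(iters):
        # Reinforce column j in proportion to its current score p[j],
        # then renormalize each row to keep T stochastic.
        T = p0 * p
        T = T / T.sum(axis=1, keepdims=True)
        # Damped power-iteration update, as in PageRank.
        p = (1 - alpha) / n + alpha * (p @ T)
    return p
```

Reinforcement makes high-scoring nodes absorb the mass of their redundant neighbors, which is why the top-ranked vertices tend to cover distinct regions of the graph.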
“…Subtopic coverage [29], max-marginal relevance (MMR) [4] and submodular coverage [17,16] are examples of this paradigm where the marginal utility is designed by hand. SVMdiv [28] and IndStrSVM [15] learn the marginal utility of subtopic coverage of documents from training data.…”
Section: Prior Art
confidence: 99%
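Of the hand-designed marginal utilities listed above, max-marginal relevance (MMR) is the simplest to make concrete: greedily pick the item most relevant to the query, discounted by its similarity to what is already selected. The sketch below assumes precomputed similarity inputs; the function name and the trade-off parameter `lam` are illustrative.

```python
def mmr_select(sim_to_query, sim_matrix, k, lam=0.5):
    """Greedy MMR selection. sim_to_query[i] is sentence i's relevance
    to the query; sim_matrix[i][j] is the pairwise similarity between
    sentences i and j. Returns the indices of the k chosen sentences."""
    selected = []
    candidates = list(range(len(sim_to_query)))
    while candidates and len(selected) < k:
        def marginal(i):
            # Redundancy penalty: similarity to the closest item chosen so far.
            redundancy = max((sim_matrix[i][j] for j in selected), default=0.0)
            return lam * sim_to_query[i] - (1 - lam) * redundancy
        best = max(candidates, key=marginal)
        selected.append(best)
        candidates.remove(best)
    return selected
```

This is exactly the sense in which the marginal utility is "designed by hand": `lam` fixes the relevance/diversity trade-off a priori, whereas SVMdiv and IndStrSVM learn the trade-off from training data.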
“…In the learning-to-rank literature, Yue and Joachims [28] proposed a structured learning framework, SVMdiv, for diverse topic coverage, using features that capture word-coverage signals as surrogates of topic coverage. IndStrSVM [15] proposes additional constraints to encourage diversity and balance appropriate for the specific application of summarization. SVMdiv and IndStrSVM stand out as among very few diversity approaches that learn from a powerful hypothesis space.…”
Section: Subtopic Coverage
confidence: 99%
“…Although this is challenging even with modern natural language processing techniques, a combination of techniques has proven to be effective, e.g. [9,20], and offers an approximation for the amount of similarity and thus redundancy between two sentences.…”
Section: Measuring Redundancy Via Semantic Similarity
confidence: 99%
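One minimal baseline for the sentence-similarity approximation discussed above is cosine similarity over bag-of-words counts. Real systems combine this with richer semantic signals; the token-overlap sketch below is only an illustrative lower bound on what "semantic similarity" means in practice.

```python
import math
from collections import Counter

def cosine_similarity(sent_a, sent_b):
    """Cosine similarity between two sentences using raw token counts.
    Returns a value in [0, 1]; higher means more redundant content."""
    a = Counter(sent_a.lower().split())
    b = Counter(sent_b.lower().split())
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0
```

A summarizer can then treat sentence pairs whose similarity exceeds a threshold as redundant and keep only one of them.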