2010
DOI: 10.1007/978-3-642-17187-1_23

Effectively Leveraging Entropy and Relevance for Summarization

Cited by 4 publications (6 citation statements)
References 9 publications
“…Empirical correlation between sentence length and sentence entropy (calculated from matrix A) for four data-sets is shown in Row 1 of Table 1. As conjectured by Luo et al. (2010), there is a high positive correlation between sentence length and sentence entropy in term space.…”
Section: (I) Sentence Entropy In Term Space
confidence: 75%
“…Normalizing the weights to obtain a probability distribution, the authors map summary words to the concepts and calculate the entropy of the summary for a quantitative assessment of its quality. Luo et al. (2010) conjecture that sentence entropy proxies for the sentence's coverage of information. The authors treat a sentence as a vector of terms (content words) in the document and compute the probability distribution of terms, which is used to calculate the entropy of the sentence.…”
Section: Entropy For Document Summarization
confidence: 99%
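The computation described in the statement above can be sketched as follows. This is a minimal illustration, not the exact formula of Luo et al. (2010): the function name and the simple whitespace tokenization are assumptions, and term probabilities are taken from raw frequencies over the whole document.

```python
import math
from collections import Counter

def sentence_entropy(sentence_terms, document_terms):
    """Shannon entropy of a sentence over the document's term distribution.

    Terms are content words; term probabilities come from term frequencies
    in the whole document, following the description quoted above.
    """
    doc_counts = Counter(document_terms)
    total = sum(doc_counts.values())
    # Sum -p(t) * log2 p(t) over the distinct terms appearing in the sentence.
    return -sum(
        (doc_counts[t] / total) * math.log2(doc_counts[t] / total)
        for t in set(sentence_terms)
        if doc_counts[t] > 0
    )

# Toy example: a longer sentence covers more distinct terms, so its
# entropy sum is larger, consistent with the reported length-entropy
# correlation.
document = "entropy measures information and entropy helps summarization".split()
short_sent = "entropy measures".split()
long_sent = "entropy measures information and helps".split()
```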
“…Based on entropy-related theories, we define various entropies, such as absolute alarm-message entropy, Term Frequency-Inverse Document Frequency (TF-IDF) alarm-message entropy, relative alarm-message entropy, self-information of a monitoring alarm message, and average alarm-message entropy [32], to measure the importance of each alarm message and obtain an overall picture of its content. The absolute alarm-message entropy is defined from the perspective of the word frequency of a single sentence, while the TF-IDF alarm-message entropy and relative alarm-message entropy are defined with respect to the overall message base.…”
Section: Definition Of Alarm-message Entropies
confidence: 99%
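Two of the entropies named in the statement above can be sketched as follows. This is an assumed reading, not the cited paper's exact definitions: the function names, the whitespace-token messages, and the smoothed-IDF form are all illustrative choices.

```python
import math
from collections import Counter

def absolute_entropy(message):
    """'Absolute' alarm-message entropy: Shannon entropy of the
    word-frequency distribution within a single message."""
    counts = Counter(message)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def tfidf_entropy(message, message_base):
    """TF-IDF alarm-message entropy: weight each word's in-message
    frequency by a smoothed IDF over the whole message base,
    renormalize to a distribution, then take its entropy."""
    n = len(message_base)
    counts = Counter(message)
    total = sum(counts.values())
    weights = {}
    for word, c in counts.items():
        df = sum(1 for m in message_base if word in m)  # document frequency
        idf = math.log((1 + n) / (1 + df)) + 1.0       # smoothed IDF
        weights[word] = (c / total) * idf
    z = sum(weights.values())
    return -sum((w / z) * math.log2(w / z) for w in weights.values())

# Toy message base: each message is a list of word tokens.
base = [
    "disk usage high on node 3".split(),
    "disk latency high on node 7".split(),
    "memory usage high on node 3".split(),
]
```

The distinction mirrors the quoted description: `absolute_entropy` looks only at one message's own word frequencies, while `tfidf_entropy` also consults the overall message base through the IDF term.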