2014
DOI: 10.1007/s00799-014-0121-3
Information-theoretic term weighting schemes for document clustering and classification

Cited by 10 publications (6 citation statements)
References 26 publications
“…It meets the metric properties of non-negativity, identity of indiscernibles, and symmetry. Additionally, its cube root DLITE^(1/3) satisfies the triangle inequality and is a metric distance.…”

Section: Discussion
confidence: 99%
“…In our prior work, we proposed the Least Information Theory (LIT) to quantify the amount of entropic difference between two probability distributions [3]. Given probability distributions P and Q of the same variable X, LIT is computed by:…”

Section: Least Information Theory (LIT)
confidence: 99%
“…However, KL divergence is not a metric and cannot be used as a symmetric distance measure. In addition, KL is unbounded, which has undesirable consequences in practical applications, where extremely large values can dominate the scoring function [4], [5]. Research has proposed the Discounted Least Information Theory (DLITE) as an alternative to mitigate some of these issues [7].…”

Section: Introduction
confidence: 99%
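The asymmetry and unboundedness noted in this citation statement follow directly from the standard definition KL(P‖Q) = Σ_x p(x) log(p(x)/q(x)). A minimal sketch illustrating both properties (the function name and example distributions are illustrative, not from the cited works):

```python
import math

def kl_divergence(p, q):
    """KL(P||Q) = sum_x p(x) * log(p(x)/q(x)).

    Not symmetric in P and Q, and unbounded: the sum grows without
    limit as some q(x) -> 0 while the corresponding p(x) > 0.
    Terms with p(x) == 0 contribute nothing (0 * log 0 := 0).
    """
    return sum(px * math.log(px / qx) for px, qx in zip(p, q) if px > 0)

p = [0.5, 0.5]
q = [0.9, 0.1]

kl_pq = kl_divergence(p, q)  # ~0.511 nats
kl_qp = kl_divergence(q, p)  # ~0.368 nats -> KL(P||Q) != KL(Q||P)

# Unboundedness: pushing q's second mass toward zero blows up the score.
kl_extreme = kl_divergence(p, [0.9999, 0.0001])
```

Because of this asymmetry, KL fails the symmetry axiom of a metric, and the blow-up near zero probabilities is exactly the dominance problem the statement attributes to practical scoring functions; DLITE is proposed in the cited work [7] as a bounded, metric-friendly alternative.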