Proceedings of Entropy 2021: The Scientific Tool of the 21st Century 2021
DOI: 10.3390/entropy2021-09757
Entropy analysis of n-grams and estimation of the number of meaningful language texts

Cited by 1 publication (1 citation statement)
References 0 publications
“…In addition, we employ Entropy-1/2/3 (Malashina, 2021) to measure meaningful information in the generated responses. Since lexical-overlap metrics have shown great limitations for text generation, we then employ model-based metrics for better evaluation, including BERTScore (BTS) (Sun et al., 2022) and FED (Mehri and Eskenazi, 2020).…”

Section: Methods

confidence: 99%
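The Entropy-1/2/3 metric cited above is conventionally computed as the Shannon entropy of the empirical n-gram distribution (for n = 1, 2, 3) over the generated text's tokens. A minimal sketch of that computation, assuming whitespace tokenization and base-2 logarithms (both illustrative choices, not confirmed by the source):

```python
from collections import Counter
import math

def ngram_entropy(tokens, n):
    """Shannon entropy (in bits) of the empirical n-gram distribution.

    Builds all overlapping n-grams from the token sequence, estimates
    each n-gram's probability by its relative frequency, and returns
    H = -sum(p * log2(p)).
    """
    grams = [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]
    counts = Counter(grams)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Example: Entropy-1/2/3 of a short generated response
tokens = "the cat sat on the mat the cat ran".split()
for n in (1, 2, 3):
    print(f"Entropy-{n}: {ngram_entropy(tokens, n):.3f}")
```

Higher values indicate a more diverse (less repetitive) response; a degenerate output that repeats one token scores 0 at every n.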