2021
DOI: 10.21203/rs.3.rs-237508/v1
Preprint

Entropy analysis of n-grams and estimation of the number of meaningful language texts. Cyber security applications

Abstract: We estimate the n-gram entropies of English-language texts, using dictionaries and taking into account punctuation, and find a heuristic method for estimating the marginal entropy. We propose a method for evaluating the coverage of empirically generated dictionaries and an approach to address the disadvantage of low coverage. In addition, we compare the probability of obtaining a meaningful text by directly iterating through all possible n-grams of the alphabet and conclude that this is only possible fo…
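As a rough illustration of the kind of empirical n-gram entropy estimate the abstract describes (this is not the authors' code; the function name, sample text, and choice of n are illustrative assumptions), one can count n-grams in a text and compute the Shannon entropy of their empirical distribution:

# Minimal sketch: empirical Shannon entropy (in bits) of the n-gram
# distribution of a text. Names and sample data are illustrative only.
import math
from collections import Counter

def ngram_entropy(text: str, n: int) -> float:
    """Empirical Shannon entropy (bits) of the n-gram distribution of `text`."""
    grams = [text[i:i + n] for i in range(len(text) - n + 1)]
    counts = Counter(grams)
    total = len(grams)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

sample = "the quick brown fox jumps over the lazy dog. the dog sleeps."
for n in (1, 2, 3):
    print(n, round(ngram_entropy(sample, n), 3))

On real corpora the per-symbol conditional entropy obtained from such counts decreases as n grows, which is what motivates estimating a marginal (limiting) entropy rather than stopping at small n.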

Cited by 0 publications
References 2 publications