On the origin of long-range correlations in texts
2012, DOI: 10.1073/pnas.1117723109

Abstract: The complexity of human interactions with social and natural phenomena is mirrored in the way we describe our experiences through natural language. In order to retain and convey such high-dimensional information, the statistical properties of our linguistic output have to be highly correlated in time. An example is the robust, still largely unexplained, observation of correlations on arbitrarily long scales in literary texts. In this paper we explain how long-range correlations flow from highly structured …
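The long-range correlations the abstract refers to are commonly quantified with detrended fluctuation analysis (DFA), where the fluctuation function grows as F(s) ~ s^alpha, with alpha > 0.5 signalling long-range correlation. The following is an illustrative order-1 DFA sketch in Python/NumPy, not the paper's own code, run here on white noise as the uncorrelated baseline (alpha ≈ 0.5):

```python
import numpy as np

def dfa(x, scales):
    """Order-1 detrended fluctuation analysis of a 1-D series.
    Returns the fluctuation F(s) for each window size s; for a
    long-range-correlated series, F(s) ~ s**alpha with alpha > 0.5."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                    # integrated profile
    fluctuations = []
    for s in scales:
        n_windows = len(y) // s
        f2 = []
        for i in range(n_windows):
            seg = y[i * s:(i + 1) * s]
            t = np.arange(s)
            coeffs = np.polyfit(t, seg, 1)         # linear detrend per window
            f2.append(np.mean((seg - np.polyval(coeffs, t)) ** 2))
        fluctuations.append(np.sqrt(np.mean(f2)))
    return np.array(fluctuations)

# Baseline: uncorrelated Gaussian noise, expected alpha close to 0.5.
rng = np.random.default_rng(0)
noise = rng.standard_normal(10_000)
scales = np.array([16, 32, 64, 128, 256])
F = dfa(noise, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
```

Applied to a literary text, the series `x` would be some numeric encoding of the word stream (word lengths, ranks, or a binary indicator of a given word); the paper's claim concerns why such encodings show alpha > 0.5.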

Cited by 95 publications (87 citation statements). References 36 publications.
“…When applied to natural language, our results show that the temporal organization of natural languages (with some differences between them) exhibits more complex structure than the sequences constructed by randomizations. These results are also concordant with previous studies, which report the presence of long-range correlations in written texts [33,34]. …”
Section: Multiscale Entropy Analysis of Texts (supporting)
Confidence: 83%
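The multiscale entropy analysis this citing work applies can be sketched as coarse-graining followed by sample entropy at each scale. The code below is an illustrative reimplementation under common conventions (non-overlapping averaging, Chebyshev distance, tolerance r = 0.2 × std recomputed per scale), not the cited authors' code:

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy: negative log of the conditional probability that
    two subsequences matching for m points (Chebyshev distance < r)
    also match for m + 1 points."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def match_count(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        return np.sum(d < r) - len(templates)   # exclude self-matches

    b, a = match_count(m), match_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, max_scale=5, m=2):
    """Coarse-grain the series by non-overlapping averaging at each
    scale tau, then compute sample entropy of each coarse series."""
    x = np.asarray(x, dtype=float)
    out = []
    for tau in range(1, max_scale + 1):
        n = len(x) // tau
        coarse = x[:n * tau].reshape(n, tau).mean(axis=1)
        out.append(sample_entropy(coarse, m=m))
    return out

# Baseline: Gaussian white noise. For m=2, r=0.2*std, theory gives
# SampEn = -ln P(|X - Y| < r) ≈ 2.2 at scale 1.
rng = np.random.default_rng(1)
mse = multiscale_entropy(rng.standard_normal(1000), max_scale=4)
```

For a text, `x` would again be a numeric encoding of the word stream; a flatter entropy profile across scales than for shuffled controls is the kind of "more complex structure" the quoted statement describes.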
“…However, real sentences are not random sequences of words, as research on long correlations in physics has been showing for more than a decade, e.g. [61,62]. Second, although such null model predictor has been tested previously on uniformly random trees [28], one cannot assume the predictor will work on real sentences given the substantial statistical differences between uniformly random trees and real syntactic dependency trees [18].…”
Section: Random Linear Arrangement With Some Knowledge About Depen… (mentioning)
Confidence: 99%
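The random linear arrangement null model discussed in this citing work places the vertices of a sentence's dependency tree uniformly at random on a line and considers the resulting total dependency length. A minimal Monte Carlo sketch (illustrative only, not the cited predictor):

```python
import random

def random_linear_arrangement_cost(edges, n, trials=2000, seed=0):
    """Average total dependency length when the n vertices of a tree
    are placed uniformly at random on a line (random linear arrangement)."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        pos = list(range(n))
        rng.shuffle(pos)                 # a uniformly random arrangement
        total += sum(abs(pos[u] - pos[v]) for u, v in edges)
    return total / trials

# Star tree on 5 vertices. Each edge joins two distinct random positions,
# whose expected separation is (n + 1) / 3 = 2, so the expected total
# over 4 edges is 8.
edges = [(0, 1), (0, 2), (0, 3), (0, 4)]
avg = random_linear_arrangement_cost(edges, n=5)
```

The quoted statement's point is precisely that real sentences are not such random arrangements: observed dependency lengths fall well below this null expectation, and the correlations in real word orders violate the model's assumptions.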
“…It has important applications beyond the traditional purview of physics, as well [1][2][3][4][5], including applications to music [4,6], genomics [7,8] and human languages [9][10][11][12].…”
Section: Introduction (mentioning)
Confidence: 99%