2023 · DOI: 10.1140/epjds/s13688-022-00376-0

Socially disruptive periods and topics from information-theoretical analysis of judicial decisions

Abstract: Laws and legal decision-making regulate how societies function. They evolve and adapt to new social paradigms, reflect changes in culture and social norms, and are therefore a good proxy for the evolution of socially sensitive issues. Here, we use an information-theoretic methodology to quantitatively track trends and shifts in the evolution of large corpora of judicial decisions, and thus to detect periods in which disruptive topics arise. When applied to a large database containing the full text of over …
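To make the abstract's methodology concrete, the following is a minimal sketch of a divergence-based shift detector. It is not the paper's actual pipeline: it assumes corpora are pre-tokenized per year and uses plain unigram frequencies in place of whatever document representation the authors employ; all function names here are hypothetical.

    # Minimal sketch: Jensen-Shannon distance between consecutive yearly corpora.
    # Assumptions (not from the paper): pre-tokenized input, unigram distributions.
    from collections import Counter

    import numpy as np
    from scipy.spatial.distance import jensenshannon


    def yearly_distribution(tokens, vocabulary):
        """Unigram probability distribution of one year's corpus over a fixed vocabulary."""
        counts = Counter(tokens)
        freqs = np.array([counts[w] for w in vocabulary], dtype=float)
        return freqs / freqs.sum()


    def disruption_signal(corpora_by_year):
        """JS distance between consecutive years; peaks suggest candidate disruptive periods."""
        years = sorted(corpora_by_year)
        vocabulary = sorted({w for toks in corpora_by_year.values() for w in toks})
        dists = [yearly_distribution(corpora_by_year[y], vocabulary) for y in years]
        # scipy's jensenshannon returns the JS *distance* (square root of the JS divergence)
        return {
            (y1, y2): jensenshannon(p, q, base=2)
            for (y1, p), (y2, q) in zip(zip(years, dists), zip(years[1:], dists[1:]))
        }


    # Toy usage: a sudden topical shift between 2001 and 2002 shows up as a larger distance.
    corpora = {
        2000: ["contract", "property", "contract", "tort"],
        2001: ["contract", "property", "tort", "tort"],
        2002: ["privacy", "data", "privacy", "contract"],
    }
    for (y1, y2), d in disruption_signal(corpora).items():
        print(f"{y1}->{y2}: JS distance = {d:.3f}")

Peaks in such a signal over time would mark candidate disruptive periods; the paper's actual analysis of judicial decisions is necessarily richer than this unigram sketch.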

Cited by 4 publications (1 citation statement) · References 38 publications
“…In neuroscience, Shannon entropy of spike train distributions characterizes brain activity from neural responses [3], while mutual information identifies correlations between brain stimuli and responses [4]. Recently, the Kullback-Leibler divergence [5] and its regularized version, the Jensen-Shannon distance, have also been successfully used in a wide variety of contexts: in cognitive science as a measure of "surprise," to quantify and predict how human attention is oriented between changing screen images [6]; in quantitative social science, in combination with topic models, to track the propagation of political and social discourses [7,8] or to understand the emergence of social disruption from the analysis of judicial decisions [9]; and in machine learning, at the intersection between the statistical physics of diffusive processes, probabilistic models and deep neural networks [10].…”
Section: Introduction (citation type: mentioning)
confidence: 99%
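For reference, the two measures named in this citance have the following standard textbook definitions (not specific to the cited paper). The Kullback-Leibler divergence between distributions P and Q is

    D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{i} P(i) \log \frac{P(i)}{Q(i)},

and the Jensen-Shannon divergence symmetrizes it through the mixture M:

    \mathrm{JSD}(P, Q) = \tfrac{1}{2} D_{\mathrm{KL}}(P \,\|\, M) + \tfrac{1}{2} D_{\mathrm{KL}}(Q \,\|\, M),
    \qquad M = \tfrac{1}{2}(P + Q).

The Jensen-Shannon distance is \sqrt{\mathrm{JSD}(P, Q)}, which, unlike D_{KL}, is symmetric, bounded, and a proper metric.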