1999
DOI: 10.1590/s0103-97331999000100005

Tsallis entropy and Jaynes' Information Theory formalism

Abstract: The role of Tsallis' non-extensive information measure within an à la Jaynes Information-Theory-based formulation of Statistical Mechanics is discussed in rather detailed fashion.
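As a minimal illustration of the non-extensive measure discussed in the abstract (a sketch, not code from the paper), the Tsallis q-entropy S_q = (1 - Σ p_i^q)/(q - 1) reduces to Shannon entropy as q → 1 and is non-additive for independent subsystems: S_q(A,B) = S_q(A) + S_q(B) + (1 - q) S_q(A) S_q(B).

```python
import math

def tsallis_entropy(probs, q):
    """Tsallis q-entropy S_q = (1 - sum p_i^q) / (q - 1).

    In the limit q -> 1 this recovers the Shannon entropy
    -sum p_i ln p_i, handled explicitly to avoid 0/0.
    """
    if abs(q - 1.0) < 1e-12:
        return -sum(p * math.log(p) for p in probs if p > 0)
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

if __name__ == "__main__":
    q = 2.0
    pA = [0.5, 0.5]
    pB = [0.25, 0.75]
    # Joint distribution of two independent subsystems A and B.
    joint = [a * b for a in pA for b in pB]
    sA, sB = tsallis_entropy(pA, q), tsallis_entropy(pB, q)
    # Non-additivity (pseudo-additivity) rule for independent systems:
    # S_q(A,B) = S_q(A) + S_q(B) + (1 - q) S_q(A) S_q(B)
    print(tsallis_entropy(joint, q))               # 0.6875
    print(sA + sB + (1.0 - q) * sA * sB)           # 0.6875
```

For q = 2 the entropies are S_2(A) = 0.5 and S_2(B) = 0.375, and the joint entropy 0.6875 differs from the plain sum 0.875 by exactly the cross term (1 - q) S_2(A) S_2(B) = -0.1875.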

Cited by 111 publications (67 citation statements)
References 1 publication (2 reference statements)
“…Many other authors use MaxEnt in conjunction with different entropic forms for which one of the four Khinchin's axioms is modified (the axiom of extensivity) [38,39], but keeping the original variables, not changing them as we do.…”
Section: Beyond Proportional Growth, Generalizing Equation (35)
confidence: 99%
“…In this paper, our discussion requires only the Jaynes maximum entropy principle [13,14,15]. Jaynes' information-theoretical approach to statistical mechanics, based on Shannon's extensive measure with a linear weighting of quantities as a mean value, has been successfully extended to Fisher's information measure ([6] and references therein).…”
Section: Introduction
confidence: 99%
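The Jaynes maximum-entropy principle referenced in this excerpt can be sketched numerically (a hypothetical illustration, not the cited authors' code): maximizing Shannon entropy subject to a fixed mean yields a Boltzmann-like distribution p_i ∝ exp(-λ x_i), where the Lagrange multiplier λ is tuned so the constraint holds.

```python
import math

def maxent_distribution(xs, mean_target, lo=-50.0, hi=50.0, iters=200):
    """Jaynes MaxEnt with a single mean constraint.

    The entropy-maximizing distribution is p_i = exp(-lam*x_i)/Z;
    since the mean is monotonically decreasing in lam, the multiplier
    is found by bisection on [lo, hi] (assumed to bracket the root).
    """
    def mean_for(lam):
        w = [math.exp(-lam * x) for x in xs]
        z = sum(w)
        return sum(wi * x for wi, x in zip(w, xs)) / z

    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > mean_target:
            lo = mid  # mean too high -> need a larger multiplier
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * x) for x in xs]
    z = sum(w)
    return [wi / z for wi in w]

if __name__ == "__main__":
    # Four equally spaced levels; constraining the mean to the midpoint
    # 1.5 gives lam = 0, i.e. the uniform (maximum-entropy) distribution.
    p = maxent_distribution([0, 1, 2, 3], 1.5)
    print(p)
```

Any mean other than the midpoint tilts the distribution exponentially toward the corresponding side, which is the usual canonical-ensemble picture.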
“…During the last decades the scientific community paid considerable attention to the generalization of concepts such as information, entropy [1][2][3][4] and differentiation [5][6][7][8]. Entropy was introduced in thermodynamics by Clausius and Boltzmann and was later adopted by Shannon and Jaynes in information theory [9][10][11].…”
Section: Introduction
confidence: 99%