We are publishing volume 10 of Entropy. When I was a chemistry student I was fascinated by thermodynamic problems, particularly the Gibbs paradox. It has now been more than 10 years since I actively published on this topic [1-4]. During this decade, the globalized information society has developed rapidly on the basis of the Internet, and the term "information" is widely used. But what is information? What is its relationship with entropy and with other concepts such as symmetry, distinguishability and stability? What is the state of entropy research in general? As the Editor-in-Chief of Entropy, I feel it is time to offer some comments, present my own opinions on these matters and point out a major flaw in related studies.
Definition of Information

We are interested in the definition of information in the context of information theory. Surprisingly, a clear definition of the concept of "information" cannot be found in information theory textbooks, and the phrase "entropy as a measure of information" is confusing. I would like to propose a simple definition of information:

Information (I) is the amount of data remaining after data compression.

If the total amount of data is L, entropy (S) in information theory is defined as information loss, L = S + I. Consider a 100 GB hard disk as an example: L = 100 GB. A freshly formatted hard disk has S = 100 GB and I = 0. Similar examples of defining information as the amount of data after compression are given in [5].
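As a rough illustration of this compression-based definition (a minimal sketch of my own, not taken from [5]), the snippet below uses the zlib-compressed size of a byte string as a stand-in for I. A practical compressor only approximates the ideal compressed length, so the resulting S = L − I should be read qualitatively.

```python
import os
import zlib

def information_and_entropy(data: bytes):
    """Illustrative only: approximate I by the zlib-compressed size,
    then compute entropy as information loss, S = L - I (clamped at
    zero, since a real compressor adds a small fixed overhead)."""
    L = len(data)                     # total amount of data, in bytes
    I = len(zlib.compress(data, 9))   # compressed size stands in for information
    S = max(L - I, 0)                 # entropy as information loss
    return L, I, S

# Highly repetitive data compresses well: I is small and S (the loss) is
# large; the limiting case is the formatted disk with I = 0 and S = L.
print(information_and_entropy(b"A" * 100_000))

# Random bytes are essentially incompressible: I is close to L and S is
# close to 0.
print(information_and_entropy(os.urandom(100_000)))
```

The repetitive input behaves like the formatted disk (large S, small I), while the random input retains nearly as much information as its raw length.

Based on this definition of information and the definition of (information theory) entropy as information loss, S = L − I, or, in cases where the absolute values are unknown, ΔS = ΔL − ΔI, I was able to propose three laws of information theory [5]: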