Although the use of the word 'information', with different meanings, can be traced back to ancient and medieval texts (see Adriaans 2013), it is only in the 20th century that the term begins to acquire its present-day sense. Nevertheless, the pervasiveness of the notion of information in both our everyday life and our scientific practice does not imply agreement about the content of the concept. As Luciano Floridi (2010, 2011) stresses, it is a polysemantic concept associated with different phenomena, such as communication, knowledge, reference, meaning, truth, etc. In the second half of the 20th century, philosophy began to direct its attention to this omnipresent but intricate concept in an effort to unravel the tangle of meanings surrounding it.

According to a deeply rooted intuition, information is related to data: it has or carries content. In order to elucidate this idea, the philosophy of information has coined the concept of semantic information (Bar-Hillel and Carnap 1953, Bar-Hillel 1964, Floridi 2013), strongly related to notions such as reference, meaning and representation: semantic information has intentionality ("aboutness"); it is directed to other things. On the other hand, in the field of science certain problems are expressed in terms of a notion of information amenable to quantification.
At present, this mathematical perspective on information is manifested in different formalisms, each corresponding to its own concept: Fisher information (which measures the dependence of a random variable X on an unknown parameter θ upon which the probability of X depends; see Fisher 1925), algorithmic information (which measures the length of the shortest program that produces a given string on a universal Turing machine; see, e.g., Chaitin 1987), and von Neumann entropy (which gives a measure of the quantum resources necessary to faithfully encode the state of the source system; see Schumacher 1995), among others.

Nevertheless, it is traditionally agreed that the seminal work for the mathematical view of information is the paper in which Claude Shannon (1948) introduces a precise formalism designed to solve certain specific technological problems in communication engineering (see also Shannon and Weaver 1949). Roughly speaking, Shannon entropy is concerned with the statistical properties of a given system and the correlations between the states of two systems, independently of the meaning or any semantic content of those states. Nowadays, Shannon's theory is a basic ingredient of the training of communication engineers.

At present, the philosophy of information has put on the table a number of open problems related to the concept of information (see Adriaans and van Benthem 2008): the possibility of unifying the various theories of information, the question about a logic of information, the relations between information and thermodynamics, the meaning of quantum information, and the links between information and computation, among others. In this wide panoply of open issues, it ca...
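The purely statistical, meaning-independent character of Shannon's measure can be illustrated with a minimal sketch. The formula H(X) = −Σ p(x) log₂ p(x) is the standard one from Shannon (1948); the function below is our own illustration, not code from any of the cited works, and it notes only on probabilities, never on what the states mean:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits.

    The input is a discrete probability distribution; terms with
    p = 0 contribute nothing (by the convention 0 * log 0 = 0).
    Note that nothing here refers to the *meaning* of the states,
    only to their statistics.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries one bit per toss; a certain outcome carries none.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([1.0]))       # 0.0
```

The two printed cases make the conceptual point of the paragraph above concrete: the same numerical value is obtained whether the two equiprobable states are heads/tails, true/false, or any other pair of alternatives, since the measure is indifferent to semantic content.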