The goal of this book is to offer an updated overview of generalized information measures and statistics, covering the basic concepts as well as some recent relevant applications.
The book begins with a historical introduction describing the fascinating development of the concepts of heat and entropy. Starting from the ideas of ancient Greece, it recounts the main historical breakthroughs, allowing the reader to appreciate the fundamental contributions of Nicolas Sadi Carnot, Rudolf Clausius, Ludwig Boltzmann, Josiah Willard Gibbs and others. It ends with the seminal works of Claude Shannon, which laid the foundation of Information Theory, and of Edwin Jaynes, which connected the latter with Statistical Mechanics.
Concepts and Recent Advances in Generalized Information Measures and Statistics
Chapter 7, pp. 147-168,
A Statistical Measure of Complexity
Ricardo López-Ruiz, Héctor Mancini and Xavier Calbet
In this chapter, a statistical measure of complexity is introduced, its properties are discussed, and some straightforward applications are shown. The chapter is presented by Ricardo López-Ruiz and collaborators, who introduced the well-known measure of complexity named after their surnames (the LMC statistical measure of complexity). Its properties are discussed in full detail, and some interesting applications (Gaussian and exponential distributions, and complexity in a two-level laser model) are also provided.
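For orientation, the LMC measure mentioned above is commonly defined, for a discrete probability distribution $\{p_1,\dots,p_N\}$, as the product of the (normalized) Shannon entropy and the so-called disequilibrium, the latter measuring the distance from the uniform distribution:

```latex
% LMC statistical measure of complexity (standard formulation)
C(\{p_i\}) \;=\; H(\{p_i\}) \cdot D(\{p_i\}),
\qquad
H \;=\; -K \sum_{i=1}^{N} p_i \log p_i,
\qquad
D \;=\; \sum_{i=1}^{N} \left( p_i - \frac{1}{N} \right)^{2},
```

where $K$ is a positive normalization constant. Note that $C$ vanishes in the two extreme cases: a perfectly ordered system (one $p_i = 1$, so $H = 0$) and a fully random one (uniform distribution, so $D = 0$), which is the defining feature of this measure.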