Shannon and Khinchin showed that, assuming four information-theoretic axioms, the
entropy must be of Boltzmann-Gibbs type, S = −∑i pi log pi. Here we note that in
physical systems one of these axioms may be violated. For non-ergodic systems the
so called separation axiom (Shannon-Khinchin axiom 4) will in general not be valid.
We show that when this axiom is violated the entropy takes a more general form,
Sc,d ∝ ∑i^W Γ(d + 1, 1 − c log pi), where c and d are scaling exponents and Γ(a, b) is
the upper incomplete gamma function. The exponents (c, d) define equivalence classes for all
interacting and non-interacting systems and unambiguously characterize any statistical
system in its thermodynamic limit. The proof is possible because of two newly
discovered scaling laws which any entropic form has to fulfill if the first three Shannon-
Khinchin axioms hold. A series of known entropies can be classified in terms of these equivalence
classes. We show that the corresponding distribution functions are special forms of
Lambert-W exponentials containing, as special cases, Boltzmann, stretched exponential,
and Tsallis distributions (power laws). In the derivation we assume trace form
entropies, S = ∑i g(pi), with g some function; more general entropic forms, however,
can be classified by the same scaling analysis.
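As a minimal numerical sketch of the claimed special cases, the snippet below evaluates the sum ∑i Γ(d + 1, 1 − c log pi) for non-negative integer d, using the closed-form identity Γ(n, x) = (n − 1)! e^(−x) ∑_{k=0}^{n−1} x^k/k!. It checks that (c, d) = (1, 1) reproduces the Boltzmann-Gibbs entropy up to an affine transformation, and that (c, d) = (c, 0) with c ≠ 1 yields ∑i pi^c / e, an affine function of the Tsallis entropy. Function names and the test distribution are ours; the overall proportionality constants of Sc,d are not fixed here.

```python
import math
import random

def upper_gamma_int(n, x):
    """Upper incomplete gamma Γ(n, x) for positive integer n,
    via the closed form Γ(n, x) = (n-1)! e^{-x} Σ_{k=0}^{n-1} x^k / k!."""
    return math.factorial(n - 1) * math.exp(-x) * sum(
        x**k / math.factorial(k) for k in range(n)
    )

def S_cd_sum(p, c, d):
    """The sum Σ_i Γ(d+1, 1 - c log p_i); here d is a non-negative integer."""
    return sum(upper_gamma_int(d + 1, 1.0 - c * math.log(pi)) for pi in p)

# A random normalized probability distribution over 5 states.
random.seed(0)
w = [random.random() for _ in range(5)]
p = [x / sum(w) for x in w]

# (c, d) = (1, 1): Γ(2, 1 - log p_i) = (2 - log p_i) p_i / e, so the sum
# equals (2 + S_BG)/e with S_BG = -Σ p_i log p_i (Boltzmann-Gibbs entropy).
S_BG = -sum(pi * math.log(pi) for pi in p)
assert abs(S_cd_sum(p, 1.0, 1) - (2.0 + S_BG) / math.e) < 1e-12

# (c, d) = (c, 0), c != 1: Γ(1, 1 - c log p_i) = p_i^c / e, so the sum is
# Σ p_i^c / e -- an affine function of the Tsallis entropy.
c = 0.7
assert abs(S_cd_sum(p, c, 0) - sum(pi**c for pi in p) / math.e) < 1e-12
print("special cases verified")
```

For non-integer d one would replace the closed form with a numerical upper incomplete gamma (e.g. scipy.special.gammaincc(a, x) times math.gamma(a)); the integer-d form above keeps the sketch dependency-free.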
Keywords: Generalized Entropies; Nonextensive Statistical Mechanics; Complex Systems;
Distribution Functions.