TY - JOUR
AB - We show that statistical criticality, i.e. the occurrence of power law frequency distributions, arises in samples that are maximally informative about the underlying generating process. In order to reach this conclusion, we first identify the frequency with which different outcomes occur in a sample as the variable carrying useful information on the generative process. The entropy of the frequency, which we call relevance, provides an upper bound on the number of informative bits. This differs from the entropy of the data, which we take as a measure of resolution. Samples that maximise relevance at a given resolution (which we call maximally informative samples) exhibit statistical criticality. In particular, Zipf's law arises at the optimal trade-off between resolution (i.e. compression) and relevance. As a byproduct, we derive a bound on the maximal number of parameters that can be estimated from a dataset, in the absence of prior knowledge of the generative model.
Furthermore, we relate criticality to the statistical properties of the representation of the data generating process. We show that, as a consequence of the concentration property implied by the asymptotic equipartition property, representations that are maximally informative about the data generating process are characterised by an exponential distribution of energy levels. This arises from a principle of minimal entropy, which is conjugate to the maximum entropy principle in statistical mechanics. This explains why statistical criticality requires no fine-tuning of parameters in maximally informative samples.
AU - Cubero, Ryan J
AU - Jo, Junghyo
AU - Marsili, Matteo
AU - Roudi, Yasser
AU - Song, Juyong
ID - 7130
IS - 6
JF - Journal of Statistical Mechanics: Theory and Experiment
KW - optimization under uncertainty
KW - source coding
KW - large deviation
SN - 1742-5468
TI - Statistical criticality arises in most informative representations
VL - 2019
ER -