Power laws and maximum entropy

Building: Edificio Santa Maria

Room: Auditorio San Agustin

Date: 2019-07-10 12:00 PM – 03:45 PM

Last modified: 2019-06-14

#### Abstract

Crucial to a modern understanding of inference and statistics, the method of maximum entropy is a scheme for updating probability distributions. The goal is to find the posterior p(x), from a prior q(x), that maximizes a functional S[p|q], namely entropy, under a set of constraints meant to represent the information on hand. Its successes -- which range from equilibrium statistical mechanics [Jaynes: Phys. Rev. 106, 620, 1957] to Bayesian statistics [Giffin and Caticha: AIP Conf. Proc. 872, 31, 2006] -- are due to the fact that entropy is designed as a universal tool for inference. Ever since Shannon [Bell Syst. Tech. J. 27-3, 379, 1948] there have been attempts [Shore and Johnson: IEEE Trans. Inf. Theory 26, 26, 1980] [Caticha: AIP Conf. Proc. 707, 75, 2004] [Vanslette: Entropy 19, 664, 2017] to derive the form of S from different design criteria. A modern derivation reduces these criteria to two: (a) local information should have only local effects, and (b) a priori independent subsystems should remain independent unless the constraints explicitly require otherwise. Not only are these traits fundamental for science, but they lead uniquely to an S of the Kullback and Leibler [Ann. Math. Stat. 22(1), 79-86, 1951] form: S[p|q] = - ∫ dx p(x) log(p(x)/q(x)).

A particular family of distributions that appears frequently in astronomy, social systems, and economics is the power law. Since maximizing the Kullback-Leibler entropy under expected-value constraints is known to give canonical (exponential) distributions, this situation has challenged the connection between power laws and maximum entropy. One solution is to `generalize' the functional form of S [Renyi: Berkeley Symp. on Math. Statist. and Prob. 1, 547, 1961] [Tsallis: J. Stat. Phys. 52, 479, 1988] [Kaniadakis: Eur. Phys. J. B 70, 3, 2009]. Although maximizing the Renyi, Tsallis, and Kaniadakis entropies does indeed give power laws, these schemes breach the universal inference criteria of locality and independence.
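The canonical-distribution statement can be checked numerically. The sketch below (an illustration, not part of the talk; the energy levels, prior, and constraint value are made up) maximizes S[p|q] = -Σ p log(p/q) on a discrete space under normalization and a fixed expected energy, and verifies that the maximizer has the canonical form p_i ∝ q_i exp(-β E_i), i.e. that log(p/q) is affine in E:

```python
import numpy as np
from scipy.optimize import minimize

# Toy discrete setting (assumed for illustration): energies E_i,
# uniform prior q, and an expected-value constraint <E> = E_bar.
E = np.linspace(0.0, 4.0, 20)          # hypothetical energy levels
q = np.full_like(E, 1.0 / len(E))      # uniform prior
E_bar = 1.0                            # imposed expected value

# Parametrize p through a softmax so p > 0 and sum(p) = 1 automatically;
# then only the <E> = E_bar equality constraint remains.
def softmax(z):
    p = np.exp(z - z.max())
    return p / p.sum()

def neg_entropy(z):
    p = softmax(z)
    return np.sum(p * np.log(p / q))   # = -S[p|q]

res = minimize(
    neg_entropy,
    np.zeros_like(E),
    constraints=[{"type": "eq", "fun": lambda z: softmax(z) @ E - E_bar}],
    method="SLSQP",
)
p = softmax(res.x)

# At the maximum, log(p/q) = -beta * E + const, so its correlation
# with E should be essentially -1 (affine with negative slope).
corr = np.corrcoef(np.log(p / q), E)[0, 1]
print(f"<E> = {p @ E:.4f}, corr(log(p/q), E) = {corr:.4f}")
```

The softmax reparametrization is just a convenience to keep the optimization unconstrained except for the expectation; the analytic result comes from the usual Lagrange-multiplier argument.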

Here we show how to obtain power laws from maximum (Kullback-Leibler) entropy under simple constraints, and comment on the types of physical problems where such constraints appear.
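One standard example of such a constraint (assumed here for illustration; the talk's specific constraints may differ) is fixing the expected logarithm: maximizing S[p|q] with a uniform prior on [1, ∞) subject to ⟨ln x⟩ = c gives p(x) ∝ exp(-λ ln x) = x^(-λ), a pure power law, with λ fixed by the constraint. The sketch below solves for the Lagrange multiplier numerically and compares it with the analytic value λ = 1 + 1/c:

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

c = 0.5  # imposed value of <ln x> (hypothetical)

def mean_log(lam):
    # For p(x) ∝ x^(-lam) on [1, inf): <ln x> = 1/(lam - 1).
    Z, _ = quad(lambda x: x**-lam, 1, np.inf)
    m, _ = quad(lambda x: np.log(x) * x**-lam, 1, np.inf)
    return m / Z

# Solve <ln x> = c for the exponent (Lagrange multiplier) lam.
lam = brentq(lambda l: mean_log(l) - c, 1.01, 50.0)
print(lam)  # analytic answer: lam = 1 + 1/c = 3.0
```

The point is that no generalized entropy is needed: the ordinary Kullback-Leibler functional, with a logarithmic expected-value constraint, already yields the power-law family.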