Minimum Fisher information
In information theory, the principle of minimum Fisher information (MFI) is a variational principle which, when applied with the proper constraints needed to reproduce empirically known expectation values, determines the best probability distribution that characterizes the system.
Measures of information
Measures of information (IM) are the most important tools of information theory. They measure either the amount of positive information or the amount of "missing" information an observer possesses with regard to any system of interest. The most famous IM is the Shannon entropy, which determines how much additional information the observer still requires in order to have all the available knowledge regarding a given system S, when all he/she has is a probability density function (PDF) defined on appropriate elements of that system. Shannon's is thus a "missing"-information measure, and it is a function of the PDF only. If the observer does not have such a PDF, but only a finite set of empirically determined mean values of the system, then a fundamental scientific principle called the maximum entropy principle (MaxEnt) asserts that the "best" PDF is the one that, while reproducing the known expectation values, maximizes Shannon's IM.
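As an illustrative sketch of that statement (the notation is an assumption, with the known mean values taken to be those of observables $A_k$), the MaxEnt problem reads

\[
\max_{p}\ S[p] = -\int p(x)\,\ln p(x)\,dx
\quad\text{subject to}\quad
\int p(x)\,dx = 1, \qquad
\int A_k(x)\,p(x)\,dx = \langle A_k \rangle, \quad k = 1, \dots, M,
\]

and its solution is always of the exponential form

\[
p(x) = \exp\!\Big(-\lambda_0 - \sum_{k=1}^{M} \lambda_k A_k(x)\Big),
\]

with the Lagrange multipliers $\lambda_k$ fixed by the constraints.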
Fisher's information measure
Fisher's information measure (FIM), named after Ronald Fisher, is another kind of measure in two respects: 1) it reflects the amount of (positive) information of the observer, and 2) it depends not only on the PDF but also on its first derivatives, a property that makes it a local quantity (Shannon's entropy, by contrast, is global).
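For concreteness, a commonly used shift-invariant form of the FIM for a one-dimensional PDF $p(x)$ (the single-variable setting is an illustrative choice) is

\[
I[p] = \int \frac{\big(p'(x)\big)^{2}}{p(x)}\,dx
     = \int p(x)\left(\frac{d \ln p(x)}{dx}\right)^{2} dx ,
\]

where the explicit appearance of the derivative $p'(x)$ is what makes $I$ a local functional of the PDF.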
The corresponding counterpart of MaxEnt is now FIM minimization, since Fisher's measure grows when Shannon's diminishes, and vice versa. The minimization referred to here is an important theoretical tool in a manifold of disciplines, beginning with physics. In a sense it is clearly superior to MaxEnt, because the latter procedure always yields an exponential PDF as its solution, while the MFI solution is a differential equation for the PDF, which allows for greater flexibility and versatility.
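A sketch of how such a differential equation arises, under the same illustrative constraints as above (sign conventions for the multipliers vary in the literature): extremizing the FIM with Lagrange multipliers,

\[
\delta\!\left[\, I[p] - \lambda_0 \int p(x)\,dx - \sum_{k} \lambda_k \int A_k(x)\,p(x)\,dx \,\right] = 0 ,
\]

and substituting $p(x) = \psi(x)^{2}$, so that $I = 4\int \big(\psi'(x)\big)^{2}\,dx$, turns the Euler–Lagrange condition into a linear, Schrödinger-like equation for $\psi$:

\[
\psi''(x) + \tfrac{1}{4}\Big(\lambda_0 + \sum_{k} \lambda_k A_k(x)\Big)\,\psi(x) = 0 ,
\]

whose solutions are not restricted to the exponential family.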