
# Mutual Information


Estimating differential entropy and mutual information for continuous random variables.


## ⚠️ A reminder about Differential Entropy

- It is not invariant under monotonic changes of variables (applied to the individual random variables), and is therefore most useful with dimensionless variables. The analogous invariance for discrete entropy holds under bijective (relabelling) transformations of the individual random variables.
- It can be negative.

See also the limiting density of discrete points for why the original definition of differential entropy is not even dimensionally correct.
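
For concreteness, the standard definitions behind both points above (general facts, not specific to this package):

$$
h(X) = -\int p(x)\,\log p(x)\,dx, \qquad h(g(X)) = h(X) + \mathbb{E}\big[\log \lvert g'(X) \rvert\big]
$$

for a smooth invertible map $g$. So $h$ shifts by the expected log-Jacobian under a change of variables, and nothing keeps the integral non-negative (e.g. a uniform density on an interval of length less than 1 has negative differential entropy).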

## Mutual Information

...
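
A standard decomposition worth keeping in mind (a general identity, not specific to this package):

$$
I(X;Y) = h(X) + h(Y) - h(X,Y)
$$

Unlike the individual differential entropies, $I(X;Y)$ is invariant under smooth invertible transformations applied to $X$ and $Y$ separately, which is what motivates the rank-transform item in the TODO list below.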

## Install

```bash
python setup.py install
```

or

```bash
pip install mutual-info
```
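
A minimal usage sketch. The import path and the `entropy` / `mutual_information` signatures below follow the original Varoquaux gist and are assumptions; check the installed package's actual API.

```python
# Minimal usage sketch. ASSUMPTION: the package exposes `entropy` and
# `mutual_information` with the gist's signatures; verify against the
# installed package before relying on this.
import numpy as np
from mutual_info.mutual_info import entropy, mutual_information  # hypothetical path

rng = np.random.default_rng(0)
x = rng.normal(size=(1000, 1))            # samples: (n_samples, n_dims)
y = x + 0.5 * rng.normal(size=(1000, 1))  # noisy copy, so I(X;Y) > 0

print(entropy(x, k=3))                    # kNN differential entropy estimate
print(mutual_information((x, y), k=3))    # kNN mutual information estimate
```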

## Development

See the Makefile for example operations.

See https://pypi.org/project/mutual-info

Do not pin packages for now. Let's surf latest and find out when things break.

### Develop install

```bash
python setup.py develop
```

### Tests

```bash
make test
```

## TODO

- Apply a rank transform (or similar) to the data so that the estimates are invariant to monotonic transforms of the data; a sketch follows this list.
- Implement equations 3 and 9 from the 2008 NIPS paper.
- Add tests.
- Add clear documentation and reminders about mutual information and the pitfalls of continuous random variables.
- Compare to scikit-learn's `_mutual_info.py`, as per #2.
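
A minimal sketch of the rank-transform idea from the first TODO item, assuming only NumPy and SciPy; mapping each column to its normalised ranks makes any downstream estimator invariant to strictly monotonic transforms of the inputs.

```python
import numpy as np
from scipy.stats import rankdata

def rank_transform(x):
    """Map each column of x into (0, 1) via its empirical ranks.

    A strictly monotonic transform of a column leaves its ranks
    unchanged, so estimators applied to the output are invariant
    to such transforms of the input.
    """
    x = np.asarray(x, dtype=float)
    ranks = rankdata(x, axis=0)      # average ranks, computed per column
    return ranks / (x.shape[0] + 1)  # scale into the open interval (0, 1)
```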

## Origins

Originally adapted by G. Varoquaux, in a gist, from code created by R. Brette, itself based on several papers (see references in the code). These computations rely on nearest-neighbor (radial density) statistics.