
Table 1 Mutual Information formalism

From: Gene regulation network inference using k-nearest neighbor-based mutual information estimation: revisiting an old DREAM

| Term | Symbol | Formula |
| --- | --- | --- |
| Shannon's entropy of X | \(H(X)\) | \(-\sum_{x} p(x)\log p(x)\) |
| Joint entropy of X and Y | \(H(X,Y)\) | \(-\sum_{x}\sum_{y} p(x,y)\log p(x,y)\) |
| Joint entropy of X, Y and Z | \(H(X,Y,Z)\) | \(-\sum_{x}\sum_{y}\sum_{z} p(x,y,z)\log p(x,y,z)\) |
| Two-way mutual information | \(\mathrm{MI}(X;Y)\) | \(H(X)+H(Y)-H(X,Y)\) |
| Total correlation | \(\mathrm{TC}(X,Y,Z)\) | \(H(X)+H(Y)+H(Z)-H(X,Y,Z)\) |
| Three-way MI | \(\mathrm{MI3}((X,Y);Z)\) | \(\mathrm{TC}-\mathrm{MI}(X;Y)\) |
| Interaction information | \(\mathrm{II}(X,Y,Z)\) | \(\mathrm{TC}-\mathrm{MI}(X;Y)-\mathrm{MI}(X;Z)-\mathrm{MI}(Y;Z)\) |
| Conditional MI | \(\mathrm{CMI}(X;Y\mid Z)\) | \(\mathrm{TC}-\mathrm{MI}(X;Z)-\mathrm{MI}(Y;Z)\) |

(The original table also pairs each quantity with a Venn diagram illustration, not reproduced here.)
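The identities in the table can be checked numerically for any discrete joint distribution. The sketch below (plain Python, natural-log entropies) uses a small hypothetical joint distribution over three binary variables, chosen only to exercise the formulas; the distribution, and the helper names `entropy` and `marginal`, are illustrative assumptions, not part of the paper's method.

```python
import math

# Hypothetical joint distribution p(x, y, z) over three binary variables,
# chosen only to exercise the table's formulas (sums to 1).
p_xyz = {
    (0, 0, 0): 0.20, (0, 0, 1): 0.05,
    (0, 1, 0): 0.10, (0, 1, 1): 0.15,
    (1, 0, 0): 0.05, (1, 0, 1): 0.20,
    (1, 1, 0): 0.15, (1, 1, 1): 0.10,
}

def entropy(p):
    """Shannon entropy -sum p log p (natural log) of a probability dict."""
    return -sum(v * math.log(v) for v in p.values() if v > 0)

def marginal(p, axes):
    """Marginalize the joint distribution onto the given variable positions."""
    out = {}
    for k, v in p.items():
        key = tuple(k[a] for a in axes)
        out[key] = out.get(key, 0.0) + v
    return out

# Marginal and joint entropies, as defined in the table.
H = lambda *axes: entropy(marginal(p_xyz, axes))
H_X, H_Y, H_Z = H(0), H(1), H(2)
H_XY, H_XZ, H_YZ = H(0, 1), H(0, 2), H(1, 2)
H_XYZ = entropy(p_xyz)

MI_XY = H_X + H_Y - H_XY          # two-way mutual information MI(X;Y)
MI_XZ = H_X + H_Z - H_XZ
MI_YZ = H_Y + H_Z - H_YZ
TC = H_X + H_Y + H_Z - H_XYZ      # total correlation TC(X,Y,Z)
MI3 = TC - MI_XY                  # three-way MI, MI3((X,Y);Z)
II = TC - MI_XY - MI_XZ - MI_YZ   # interaction information II(X,Y,Z)
CMI = TC - MI_XZ - MI_YZ          # conditional MI, CMI(X;Y|Z)

# Sanity check: CMI(X;Y|Z) also equals H(X,Z) + H(Y,Z) - H(Z) - H(X,Y,Z).
assert abs(CMI - (H_XZ + H_YZ - H_Z - H_XYZ)) < 1e-12
```

Note that the definitions are linearly related: subtracting the table's rows gives `II = CMI - MI(X;Y)`, which the computed values satisfy exactly.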