Table 2 Information metrics of environmental complexity.

Metric                                                        Correlation with rate of evolution
MI(S1; S2; N_shifted)                                          0.984
MI(S1; S2 | N_shifted)                                         0.981
H(S1; S2; N_shifted)                                           0.028
MI(S1; S2; N)                                                 -0.249
[Corr.(S1, N_shifted) + Corr.(S2, N_shifted)] / 2             -0.332
[K-L Div.(S1 || N_shifted) + K-L Div.(S2 || N_shifted)] / 2   -0.106

1. We evaluated the following measures on their correlation with the rate of evolution measured through experimental runs. From top to bottom: multivariate mutual information (slope -0.169 ± 0.011, intercept 0.302 ± 0.031), multivariate conditional mutual information (slope 0.181 ± 0.012, intercept -0.12 ± 0.037), joint entropy, mutual information of time-shifted environments, average pair-wise Pearson correlation, and average Kullback-Leibler divergence. Information-theoretic metrics were calculated for probability distributions (using 64 bins) of the possible values of the two signals S1 and S2 and a nutrients' signal. The nutrients' signal was taken either as a time-delayed function of the input signals, N (see Figure 3), or as the nutrients' signal shifted by -500 time steps, N_shifted (i.e. with the time delay relative to the input signals eliminated). Information-theoretic metrics were calculated as MI(S1; S2; N) = H(S1) + H(S2) + H(N) - H(S1;S2) - H(S1;N) - H(S2;N) + H(S1;S2;N) and MI(S1; S2 | N) = MI(S1;S2) - MI(S1; S2; N); joint entropy is defined as $H\left(X_1,\dots,X_n\right) = -\sum_{x_1 \in X_1} \cdots \sum_{x_n \in X_n} p\left(x_1,\dots,x_n\right) \log p\left(x_1,\dots,x_n\right)$ and mutual information is defined as $MI\left(S_1; S_2\right) = \sum_{s_1 \in S_1} \sum_{s_2 \in S_2} p\left(s_1, s_2\right) \log\left(\frac{p\left(s_1, s_2\right)}{p_1\left(s_1\right) p_2\left(s_2\right)}\right)$, where $p\left(x_1,\dots,x_n\right)$ and $p_i\left(x\right)$ are the joint and marginal probability distribution functions, respectively. The Kullback-Leibler divergence is calculated as $\text{K-L Div.}\left(S \,\|\, N\right) = \sum_i p\left(S_i\right) \log\left(\frac{p\left(S_i\right)}{p\left(N_i\right)}\right)$ for probability distributions S and N.
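The binned estimators described in this note can be sketched as follows. This is a minimal illustration, assuming the 64-bin histograms mentioned above and log base 2 (the note does not specify a base); all function names are ours, not the authors':

```python
import numpy as np

def entropy(*signals, bins=64):
    """Joint Shannon entropy H(X1, ..., Xn) from a binned joint histogram."""
    hist, _ = np.histogramdd(np.column_stack(signals), bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                      # 0 * log(0) is taken as 0
    return -np.sum(p * np.log2(p))

def multivariate_mi(s1, s2, n, bins=64):
    """MI(S1; S2; N) via the inclusion-exclusion expansion over joint entropies."""
    return (entropy(s1, bins=bins) + entropy(s2, bins=bins) + entropy(n, bins=bins)
            - entropy(s1, s2, bins=bins) - entropy(s1, n, bins=bins)
            - entropy(s2, n, bins=bins) + entropy(s1, s2, n, bins=bins))

def conditional_mi(s1, s2, n, bins=64):
    """MI(S1; S2 | N) = MI(S1; S2) - MI(S1; S2; N)."""
    mi_s1s2 = (entropy(s1, bins=bins) + entropy(s2, bins=bins)
               - entropy(s1, s2, bins=bins))
    return mi_s1s2 - multivariate_mi(s1, s2, n, bins=bins)

def kl_divergence(s, n, bins=64):
    """K-L Div.(S || N) between binned distributions; both signals share bin edges.
    Assumes p(N) > 0 wherever p(S) > 0, otherwise the divergence is infinite."""
    ps, edges = np.histogram(s, bins=bins)
    pn, _ = np.histogram(n, bins=edges)
    ps = ps / ps.sum()
    pn = pn / pn.sum()
    mask = ps > 0
    return np.sum(ps[mask] * np.log2(ps[mask] / pn[mask]))
```

Note that the histogram-based estimates are biased for finite samples, which is one reason the three-way mutual information in the table can be negative.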