1. We evaluated the following measures on their correlation with the rate of evolution measured through experimental runs. From top to bottom: multivariate mutual information (slope -0.169 ± 0.011, intercept 0.302 ± 0.031), multivariate conditional mutual information (slope 0.181 ± 0.012, intercept -0.12 ± 0.037), entropy, mutual information of time-shifted environments, average pairwise Pearson correlation, and average Kullback-Leibler divergence. Information-theoretic metrics were calculated for probability distributions (using 64 bins) of the possible values of two signals $S_1$ and $S_2$ and a nutrient signal. Nutrient signals were taken either as a time-delayed function of the input signals, $N$ (see Figure 3), or as the nutrient signal shifted by -500 time steps, $N_{shifted}$ (i.e., with the time delay relative to the input signals eliminated). Information-theoretic metrics were calculated as: $MI(S_1; S_2; N) = H(S_1) + H(S_2) + H(N) - H(S_1, S_2) - H(S_1, N) - H(S_2, N) + H(S_1, S_2, N)$ and $MI(S_1; S_2 | N) = MI(S_1; S_2) - MI(S_1; S_2; N)$. Joint entropy is defined as $H(X_1, ..., X_n) = -\sum_{x_1 \in X_1} \dots \sum_{x_n \in X_n} p(x_1, ..., x_n) \log\left(p(x_1, ..., x_n)\right)$ and mutual information is defined as $MI(S_1; S_2) = \sum_{s_1 \in S_1} \sum_{s_2 \in S_2} p(s_1, s_2) \log\left(\frac{p(s_1, s_2)}{p_1(s_1)\, p_2(s_2)}\right)$, where $p(x_1, ..., x_n)$ and $p_i(x)$ are the joint and marginal probability distribution functions, respectively. Kullback-Leibler divergence is calculated as $D_{KL}(S \| N) = \sum_i p(S_i) \log\left(\frac{p(S_i)}{p(N_i)}\right)$ for probability distributions $S$ and $N$.
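The binned estimators described in the caption can be sketched as follows. This is a minimal illustration, not the authors' code: the function names, the synthetic signals, and the 64-bin default are assumptions for the sketch; the formulas follow the inclusion-exclusion and conditional-MI definitions given above.

```python
import numpy as np

def entropy(*signals, bins=64):
    """Joint Shannon entropy H(X1, ..., Xn), estimated from a
    histogram with `bins` bins per dimension (64 by default)."""
    hist, _ = np.histogramdd(np.column_stack(signals), bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                      # by convention, 0 * log 0 = 0
    return -np.sum(p * np.log(p))

def multivariate_mi(s1, s2, n, bins=64):
    """MI(S1; S2; N) via the inclusion-exclusion formula in the caption."""
    return (entropy(s1, bins=bins) + entropy(s2, bins=bins)
            + entropy(n, bins=bins)
            - entropy(s1, s2, bins=bins) - entropy(s1, n, bins=bins)
            - entropy(s2, n, bins=bins)
            + entropy(s1, s2, n, bins=bins))

def conditional_mi(s1, s2, n, bins=64):
    """MI(S1; S2 | N) = MI(S1; S2) - MI(S1; S2; N)."""
    mi_s1s2 = (entropy(s1, bins=bins) + entropy(s2, bins=bins)
               - entropy(s1, s2, bins=bins))
    return mi_s1s2 - multivariate_mi(s1, s2, n, bins=bins)

def kl_divergence(s, n, bins=64):
    """D_KL(S || N) between binned distributions of two signals,
    using shared bin edges so the supports align."""
    p_s, edges = np.histogram(s, bins=bins)
    p_n, _ = np.histogram(n, bins=edges)
    p_s = p_s / p_s.sum()
    p_n = p_n / p_n.sum()
    mask = p_s > 0
    # clip p(N) away from zero to keep the sketch finite on empty bins
    return np.sum(p_s[mask] * np.log(p_s[mask] / np.maximum(p_n[mask], 1e-12)))
```

Note that binned plug-in estimators like these are biased for finite samples, so comparisons across measures should use the same bin count and sample length, as is done with the fixed 64 bins here.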