
Table 1 This table defines criteria used for method evaluation

From: Comparison of evolutionary algorithms in gene regulatory network model inference

| Criterion | Description |
| --- | --- |
| Goodness of data fit | Best/average Mean Squared Error (MSE) between the data and the model output over a number of runs. This measures the ability of the model to reproduce the experimental data. |
| Identified interactions | Ability of the algorithm to qualitatively identify interactions (Sensitivity/Specificity). An interaction is taken to be identified if the corresponding parameter has an absolute value larger than zero. Average values over multiple runs are used for comparison. |
| Parameter quality | Best/average MSE between the true parameters and the algorithm's solution over multiple runs. This measures the ability of the algorithm to recover the exact parameters of a known model (especially important for underspecified systems). |
| Robustness over multiple runs | Average variance of the kinetic orders and rate constants over multiple runs. |
| Robustness to noise | Performance of the algorithm on noisy datasets: goodness of fit, identified interactions, and parameter quality. |
| Performance for real microarray data | Sensitivity/Specificity and goodness of fit when applied to real microarray experiments rather than to synthetic data. |
| Scalability | Performance of the algorithms on larger datasets: maximum dimensionality achieved, increase in running time, and decrease in goodness of fit and identified-parameter quality when moving from a smaller to a larger dataset. |
| Average running time | Average running time over a number of runs. |
| Function calls | Average number of function calls required to obtain the results. |
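Several criteria above (goodness of data fit, parameter quality) reduce to an MSE computation. As a minimal sketch, assuming the data and model output are stored as equal-length numeric arrays (the function name and example values are illustrative, not from the paper):

```python
import numpy as np

def mse(observed, predicted):
    """Mean squared error between observed data and model output.

    Used here to illustrate both the goodness-of-data-fit criterion
    (data vs. simulated trajectories) and the parameter-quality
    criterion (true vs. inferred parameter vectors).
    """
    observed = np.asarray(observed, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    return float(np.mean((observed - predicted) ** 2))

# Illustrative values only: a lower MSE indicates a better fit.
data = [1.0, 2.0, 3.0]
model_output = [1.1, 1.9, 3.2]
print(mse(data, model_output))
```

The same function applies unchanged to the parameter-quality criterion by passing the known parameter vector and the algorithm's solution instead of time-series data.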
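The identified-interactions criterion counts an interaction as found when the corresponding inferred parameter is nonzero in absolute value. A hedged sketch of computing Sensitivity/Specificity against a known interaction matrix follows; the small tolerance `eps` for "larger than zero" is an assumption added to absorb floating-point noise, and all names and example matrices are hypothetical:

```python
import numpy as np

def sensitivity_specificity(true_params, inferred_params, eps=1e-8):
    """Sensitivity and specificity of identified interactions.

    An interaction is treated as identified when |parameter| > eps
    (eps is an assumed numerical tolerance, not part of the criterion
    as stated in the paper, which uses "larger than zero").
    """
    true_edges = np.abs(np.asarray(true_params, dtype=float)) > eps
    found_edges = np.abs(np.asarray(inferred_params, dtype=float)) > eps
    tp = int(np.sum(true_edges & found_edges))    # real, identified
    fn = int(np.sum(true_edges & ~found_edges))   # real, missed
    tn = int(np.sum(~true_edges & ~found_edges))  # absent, not reported
    fp = int(np.sum(~true_edges & found_edges))   # absent, reported
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
    specificity = tn / (tn + fp) if (tn + fp) else 0.0
    return sensitivity, specificity

# Illustrative 2x2 interaction matrices (e.g. kinetic orders in an
# S-system model): entry [i][j] is the influence of gene j on gene i.
true_w = [[0.0, 1.5], [-2.0, 0.0]]
est_w = [[0.0, 1.2], [0.0, 0.3]]
print(sensitivity_specificity(true_w, est_w))
```

In practice these values would be averaged over multiple runs, as the criterion specifies.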