Fig. 2 | BMC Bioinformatics

From: AIKYATAN: mapping distal regulatory elements using convolutional learning on GPU

Runtime comparison for AIKYATAN. Figures 2a and 2b show the training and testing times on the CPU for the models, with varying training-set sizes. As shown in Figure 2a, the training times of linear SVMs, DNNs, and CNNs scale approximately as O(n), while random forest training time grows as O(n log(n)) and kernel SVM training time grows as O(n^2.2), where n denotes the number of training samples. As shown in Figure 2b, the testing times of linear SVMs, DNNs, and CNNs remain constant, whereas random forest testing time grows as O(mn), where m denotes the number of trees, and kernel SVM testing time grows rapidly as the training size increases, with a corresponding increase in SVs. Figure 2c shows the relationship between the number of SVs obtained from the training set and the testing time for the kernel SVM: the testing time grows linearly with the number of SVs.
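The linear growth of kernel SVM testing time with the number of SVs follows directly from the form of the decision function: each prediction requires one kernel evaluation per support vector. A minimal sketch of this (hypothetical illustrative code, not AIKYATAN's implementation) using an RBF kernel:

```python
# Illustrative sketch: kernel SVM prediction cost is O(|SV|) per test sample,
# since the decision function sums one kernel evaluation per support vector.
# Names and values below are hypothetical, not from the AIKYATAN codebase.
import math

def rbf_kernel(x, sv, gamma=0.1):
    # RBF kernel between a test point x and one support vector sv.
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, sv)))

def svm_decision(x, support_vectors, dual_coefs, bias=0.0, gamma=0.1):
    # One kernel evaluation per SV -> testing time grows linearly with
    # the number of SVs, as observed in Fig. 2c.
    return sum(c * rbf_kernel(x, sv, gamma)
               for c, sv in zip(dual_coefs, support_vectors)) + bias

# Toy example: 3 SVs means exactly 3 kernel evaluations per prediction.
svs = [[0.0, 0.0], [1.0, 1.0], [2.0, 0.5]]
coefs = [0.5, -0.25, 0.75]
score = svm_decision([1.0, 0.5], svs, coefs)
print(score > 0)  # the sign of the score gives the predicted class
```

Because the number of SVs itself tends to grow with the training-set size for kernel SVMs, this per-sample cost is what drives the rapid growth of testing time in Figure 2b.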
