Fig. 3 | BMC Bioinformatics

From: AIKYATAN: mapping distal regulatory elements using convolutional learning on GPU

Training and testing times and GPU speed-up of the DNN and CNN models. Figures 3a and 3b show the speed-up ratio for DNN and CNN, respectively. The orange line represents the speed-up ratio, i.e., training time on CPU divided by training time on GPU, for training set sizes ranging from 1 GB to 8 GB. The speed-up ratio remains roughly constant across these sizes, at around 21x for DNN and 30x for CNN. Figures 3c and 3d show how training time and testing time grow with training set size for DNN and CNN when deployed on GPU. The DNN and CNN architectures were fixed across all training sets, and the number of learning epochs was fixed at 50. Both DNN and CNN training times grow linearly when deployed on GPU
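The speed-up metric plotted in panels (a) and (b) is a simple ratio of wall-clock training times. A minimal sketch of the computation follows; the function name and the timing values are illustrative assumptions (the caption reports only the resulting ratios, roughly 21x for DNN and 30x for CNN), not measurements from the paper:

```python
def speedup(cpu_time_s: float, gpu_time_s: float) -> float:
    """Speed-up ratio: CPU training time divided by GPU training time."""
    return cpu_time_s / gpu_time_s

# Hypothetical timings (seconds) chosen to reproduce the reported ratios.
dnn_ratio = speedup(cpu_time_s=2100.0, gpu_time_s=100.0)  # ~21x for DNN
cnn_ratio = speedup(cpu_time_s=3000.0, gpu_time_s=100.0)  # ~30x for CNN
print(dnn_ratio, cnn_ratio)
```

Because the ratio stays roughly flat from 1 GB to 8 GB, both CPU and GPU training times in this setup scale at about the same (linear) rate with training set size.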
