Fig. 10 | BMC Bioinformatics

From: Contrastive self-supervised clustering of scRNA-seq data

Analysis of data augmentation techniques: neural network dropout (a–d) and input noise (e–h). Models masking different ratios (0.2–0.9) of input genes through dropout NN layers were trained on all real-world datasets. Both internal and external evaluation metrics indicate that performance increases with the NN dropout rate and peaks at a dropout rate of 0.9. Gaussian random noise with a standard deviation ranging from 0.01 to 1 was added to the input data. Our results indicate that adding noise does not improve model performance. The annotated values represent the mean across all underlying experiments. For each setting, three consecutive runs were performed
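The two augmentations compared in this figure can be sketched as follows. This is a minimal illustration under assumed conventions, not the authors' implementation: the function names (mask_genes, add_gaussian_noise), the PyTorch formulation, and the example batch shape are all hypothetical.

```python
# Minimal sketch (assumptions, not the paper's code) of the two augmentations
# evaluated in Fig. 10: input-gene dropout masking and additive Gaussian noise.
import torch


def mask_genes(x: torch.Tensor, dropout_rate: float = 0.9) -> torch.Tensor:
    """Zero out a random fraction `dropout_rate` of gene values per cell,
    analogous to applying a dropout layer to the input genes."""
    keep_mask = (torch.rand_like(x) >= dropout_rate).float()
    return x * keep_mask


def add_gaussian_noise(x: torch.Tensor, std: float = 0.1) -> torch.Tensor:
    """Add zero-mean Gaussian noise with the given standard deviation."""
    return x + torch.randn_like(x) * std


# Hypothetical usage on a batch of 128 cells x 2000 genes:
x = torch.rand(128, 2000)
x_masked = mask_genes(x, dropout_rate=0.9)   # best-performing rate in panels a-d
x_noisy = add_gaussian_noise(x, std=0.1)     # noise did not help in panels e-h
```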
