
Table 5 Architecture of Models 4, 5, 6 and Models 7, 8, 9

From: Transfer learning for genotype–phenotype prediction using deep learning models

 

Layers                     Model 4/5/6 parameters   Model 7/8/9 parameters
Layer 1—FullyConnected     Input layer              Input layer
Layer 2—GRU/LSTM/BiLSTM    50                       50
Layer 3—GRU/LSTM/BiLSTM    20                       20
Layer 4—FullyConnected     2                        10
Layer 5—FullyConnected     –                        5
Layer 6—FullyConnected     –                        2

  1. The number of layers and the number of neurons in each layer can vary, and the hyper-parameters can be tuned to improve the final performance. The number of trainable and non-trainable layers can also vary; however, transfer learning does not perform well when all layers are trainable, and performance improves when some layers are kept non-trainable (frozen). A minimal code sketch of these architectures, including the layer-freezing setup, follows below.
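The following is a minimal Keras sketch of the two stacks in Table 5, assuming a genotype input reshaped to (timesteps, features), a GRU cell (any of the GRU/LSTM/BiLSTM options in the table could be substituted), ReLU/softmax activations, and a two-class output; these specifics, and the choice of which layers to freeze, are not stated in the table and are illustrative assumptions only.

```python
# Illustrative sketch only: layer sizes follow Table 5; the input shape, cell
# type, activations and frozen layers are assumptions, not taken from the paper.
from tensorflow.keras import layers, models


def build_model_456(timesteps=1, n_features=1000):
    """Model 4/5/6: input -> recurrent (50) -> recurrent (20) -> dense output (2)."""
    return models.Sequential([
        layers.Input(shape=(timesteps, n_features)),  # Layer 1: input
        layers.GRU(50, return_sequences=True),        # Layer 2: GRU/LSTM/BiLSTM, 50 units
        layers.GRU(20),                               # Layer 3: GRU/LSTM/BiLSTM, 20 units
        layers.Dense(2, activation="softmax"),        # Layer 4: fully connected, 2 units
    ])


def build_model_789(timesteps=1, n_features=1000):
    """Model 7/8/9: same recurrent trunk followed by dense layers of 10, 5 and 2 units."""
    return models.Sequential([
        layers.Input(shape=(timesteps, n_features)),  # Layer 1: input
        layers.GRU(50, return_sequences=True),        # Layer 2: GRU/LSTM/BiLSTM, 50 units
        layers.GRU(20),                               # Layer 3: GRU/LSTM/BiLSTM, 20 units
        layers.Dense(10, activation="relu"),          # Layer 4: fully connected, 10 units
        layers.Dense(5, activation="relu"),           # Layer 5: fully connected, 5 units
        layers.Dense(2, activation="softmax"),        # Layer 6: fully connected, 2 units
    ])


# Transfer-learning setup suggested by the table footnote: keep some layers
# non-trainable (here the recurrent trunk, an illustrative choice) and
# fine-tune only the fully connected head on the target phenotype.
model = build_model_789()
for layer in model.layers:
    if isinstance(layer, layers.GRU):
        layer.trainable = False
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
```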