
Fig. 3

From: Attention-based recurrent neural network for influenza epidemic prediction


Diagram of the attention mechanism. The attention layer computes a weighted distribution over the inputs X1, …, XT. The input of the decoder state St includes the output of the attention layer. The probability of the output sequence …, yt−1, yt, … depends on the input sequence X1, X2, …, XT. hi denotes the hidden vector of the ith input, and At,i denotes the weight assigned to the ith input at time step t
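As a rough illustration of what the figure depicts, the sketch below computes attention weights At,i as a softmax over scores between each hidden vector hi and the previous decoder state, then forms a context vector as the weighted sum of h1, …, hT that feeds into St. The dot-product score, variable names, and shapes are assumptions made for illustration only; the paper's actual scoring function and dimensions may differ.

```python
import numpy as np

def softmax(z):
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

# Hidden vectors h_1, ..., h_T from the encoder RNN
# (T = 4 time steps, hidden size 3; random values for illustration).
rng = np.random.default_rng(0)
T, hidden = 4, 3
h = rng.standard_normal((T, hidden))   # h[i] corresponds to h_{i+1}
s_prev = rng.standard_normal(hidden)   # previous decoder state s_{t-1}

# Hypothetical dot-product score of each h_i against s_{t-1}.
scores = h @ s_prev                    # shape (T,)

# Attention weights A_{t,i}: a probability distribution over the T inputs.
A_t = softmax(scores)

# Context vector fed (together with y_{t-1}) into the decoder state s_t.
context = A_t @ h                      # weighted sum of the h_i

print("attention weights:", A_t, "sum =", A_t.sum())
print("context vector:", context)
```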
