Fig. 3 From: Attention-based recurrent neural network for influenza epidemic prediction

Diagram of the attention mechanism. The attention layer computes a weighted distribution over the inputs X1, …, XT. The input to the decoder state st includes the output of the attention layer. The probability of the output sequence …, yt−1, yt, … depends on the input sequence X1, X2, …, XT. hi denotes the hidden vector of the ith input, and at,i denotes the attention weight assigned to the ith input at time step t.
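The weighted distribution in the figure can be sketched as follows: each input's hidden vector hi receives a score against the previous decoder state, the scores are normalized by a softmax into weights at,i, and the context passed to st is the weighted sum of the hidden vectors. The bilinear scoring matrix `W` below is an illustrative assumption, not necessarily the scoring function the paper uses.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_context(h, s_prev, W):
    """Sketch of one attention step.

    h:      (T, d) hidden vectors h_1..h_T from the encoder
    s_prev: (d,)   previous decoder state s_{t-1}
    W:      (d, d) hypothetical alignment matrix (assumed scoring form)
    """
    scores = h @ (W @ s_prev)   # one alignment score per input step
    a = softmax(scores)         # a_{t,i}: weights, sum to 1
    context = a @ h             # weighted sum of hidden vectors
    return a, context

# Example with random values
rng = np.random.default_rng(0)
T, d = 5, 4
h = rng.normal(size=(T, d))
s_prev = rng.normal(size=d)
W = rng.normal(size=(d, d))
a, context = attention_context(h, s_prev, W)
```

The weights `a` form the probability distribution at,i shown in the diagram (they are non-negative and sum to 1), and `context` is the attention layer's output fed into st.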