Table 3 The effectiveness of multi-level attention for event extraction

From: A biomedical event extraction method based on fine-grained and attention mechanism

| Model | Precision (%) | Recall (%) | F-score (%) |
|---|---|---|---|
| Bi-LSTM | 90.93 ± 0.35 | 38.50 ± 0.41 | 54.09 ± 0.37 |
| Bi-LSTM + WAtt | 90.75 ± 0.22 | 43.00 ± 0.25 | 58.35 ± 0.23 |
| Bi-LSTM + SAtt | 89.69 ± 0.23 | 44.12 ± 0.28 | 59.14 ± 0.27 |
| Bi-LSTM + MultiAtt | 90.24 ± 0.19 | 44.50 ± 0.16 | 59.61 ± 0.18 |
| Bi-LSTM + MultiAtt + Fine-grained | 91.05 ± 0.27 | 44.68 ± 0.31 | 59.94 ± 0.29 |

1. WAtt: word-level attention; SAtt: sentence-level attention; MultiAtt: multi-level attention