
Table 5 Collection of pre-trained word embedding (WE and WEC) models and ontology-based vector models (OVM) evaluated in a previous series of experiments [58,59,60] using the Java classes implementing their evaluation

From: HESML: a real-time semantic measures library for the biomedical domain with a reproducible survey

WN    Family   Word embedding model
Yes   WEC      Attract-repel [127]
No    WE       FastText [128]
No    WE       GloVe [129]
No    WE       CBOW [130]
Yes   WEC      SymPatterns (SP-500d) [131]
No    WEC      Paragram-ws [132]
No    WEC      Paragram-sl [132]
Yes   WEC      Counter-fitting (CF) [133]
Yes   OVM      WN-RandomWalks [134]
Yes   OVM      WN-UKB [125]
Yes   OVM      Nasari [126]
  1. The first column details which methods use WordNet (WN) during their training.
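Word embedding and ontology-based vector models such as those listed above are commonly scored on word-similarity benchmarks by comparing the cosine similarity of two word vectors against human similarity judgments. The sketch below shows that core computation in Java; it is a minimal illustration with toy vectors, not HESML's actual evaluation classes or any of the listed models' real vectors.

```java
public class CosineSketch {

    // Cosine similarity between two dense word vectors of equal length.
    static double cosine(double[] a, double[] b) {
        double dot = 0.0, normA = 0.0, normB = 0.0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            normA += a[i] * a[i];
            normB += b[i] * b[i];
        }
        return dot / (Math.sqrt(normA) * Math.sqrt(normB));
    }

    public static void main(String[] args) {
        // Toy 3-dimensional vectors standing in for pre-trained embeddings
        // of two related words (real models use hundreds of dimensions).
        double[] king  = {0.8, 0.3, 0.1};
        double[] queen = {0.7, 0.4, 0.1};

        // Related words should yield a similarity close to 1.
        System.out.println(cosine(king, queen));
    }
}
```

In a benchmark run, these per-pair similarities are then correlated (e.g. with Spearman's rank correlation) against the human ratings of the same word pairs.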