Authors
Gelbukh Alexander
Title Word Embeddings: A Comprehensive Survey
Type Journal
Sub-type CONACYT
Description Computación y Sistemas
Abstract This article is a systematic review of studies in the area of word embeddings, with an emphasis on classical matrix factorization techniques and contemporary neural word embedding algorithms such as Word2Vec, GloVe, and BERT. The efficiency and effectiveness of these methods at capturing semantic and lexical relationships are evaluated in detail, with an analysis of the topology of each technique. The approach demonstrates a model accuracy of 77%, which is 3% below the best human performance. The study also exposes weaknesses of some models, such as BERT, that lead to unrealistically high accuracy due to spurious correlations in the datasets. We identify three bottlenecks for the further development of NLP algorithms: assimilation of inductive bias, embedding of common sense, and generalization. The outcomes of this research help enhance the robustness and applicability of word embeddings in natural language processing tasks. © 2024 Instituto Politecnico Nacional. All rights reserved.
Remarks DOI 10.13053/CyS-28-4-5225
Place Mexico City
Country Mexico
Pages 2005-2029
Vol. / Chap. v. 28 no. 4
Start 2024-10-01
End
ISBN/ISSN