Abstract
This study explores the field of word embedding and systematically examines and contrasts various word embedding algorithms. Word embedding models transform words into vectors while preserving their semantic relationships and meaning. Numerous methods have been proposed, each with distinct benefits and drawbacks, and making informed choices when applying word embeddings to NLP tasks requires an understanding of these methods and their relative efficacy. The study presents the methodology and potential applications of each technique and discusses its advantages and disadvantages. The fundamental ideas and workings of well-known word embedding methods, including Word2Vec, GloVe, FastText, and the contextual embedding models ELMo and BERT, are evaluated in this paper. The performance of these algorithms is assessed on three datasets using word similarity and word analogy tasks, and the results are compared.