Performance Evaluation of Word Embedding Algorithms


Authors : Shilpi Kulshretha; Lokesh Lodha

Volume/Issue : Volume 8 - 2023, Issue 12 - December

Google Scholar : http://tinyurl.com/2srx5nww

Scribd : http://tinyurl.com/32zt34uu

DOI : https://doi.org/10.5281/zenodo.10443962

Abstract : This study explores the field of word embedding and thoroughly examines and contrasts various word embedding algorithms. Word embedding models transform words into vectors while preserving their meaning and semantic relationships. Numerous methods have been proposed, each with unique benefits and drawbacks, and making wise choices when applying word embedding to NLP tasks requires an understanding of these methods and their relative efficacy. The study presents the methodology and potential uses of each technique and discusses its advantages and disadvantages. The fundamental ideas and workings of well-known word embedding methods, such as Word2Vec, GloVe, FastText, and the contextual embeddings ELMo and BERT, are evaluated in this paper. The performance of these algorithms is evaluated on three datasets on the basis of word similarity and word analogy, and the results are compared.

Keywords : Embedding, Word2Vec, Global Vectors for Word Representation (GloVe), Embedding from Language Models (ELMo), BERT.
