Word embeddings beyond word2vec: GloVe, FastText, StarSpace

Advances in Robotics & Automation

ISSN: 2168-9695

Open Access

6th Global Summit on Artificial Intelligence and Neural Networks

October 15-16, 2018 Helsinki, Finland

Konstantinos Perifanos

Argos, UK

Scientific Tracks Abstracts: Adv Robot Autom

Abstract:

Word embedding is a convenient and efficient way to extract semantic information from large collections of textual or text-like data. Essentially, word embeddings are NLP techniques in which words from the vocabulary are mapped to a d-dimensional vector space. This transformation captures semantic similarity in the projected vector space, so that semantically related words are ideally very close. Here, we compare the performance of embedding techniques such as word2vec and GloVe, as well as fastText and StarSpace, on NLP problems such as metaphor and sarcasm detection, and on non-NLP tasks such as similarity in recommendation engines.
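The notion of "closeness" in the embedded vector space is usually measured with cosine similarity. The following sketch illustrates the idea with toy, hand-picked vectors (the words and values are purely illustrative assumptions; real embeddings from word2vec, GloVe, or fastText are learned from data and typically have 100-300 dimensions):

```python
import math

# Toy 4-dimensional "word vectors" -- illustrative values only,
# not output of any trained embedding model.
vectors = {
    "king":  [0.8, 0.6, 0.1, 0.0],
    "queen": [0.7, 0.7, 0.1, 0.1],
    "apple": [0.0, 0.1, 0.9, 0.5],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Semantically related words should score higher than unrelated ones.
print(cosine_similarity(vectors["king"], vectors["queen"]))  # close to 1
print(cosine_similarity(vectors["king"], vectors["apple"]))  # much lower
```

The same similarity computation underlies both the NLP use cases (finding words related to a query word) and the non-NLP ones, such as scoring item-item similarity in a recommendation engine, once items are embedded in the same kind of vector space.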

Biography:

Konstantinos Perifanos joined Argos in 2017 as a Lead Machine Learning Engineer. Prior to Argos, he worked at Royal Mail, Mailonline, Pearson and in research, where he was involved in a broad range of projects, from European FP6 research programmes to EdTech, Analytics, Search, and Predictive Modeling using Machine Learning and AI. He is interested in Deep Learning, Distributed Computing, Optimization, Search, Predictive Analytics and Natural Language Processing.

E-mail: [email protected]
