Institute for Web Science and Technologies · Universität Koblenz - Landau

Multi-Sense Word Embeddings

Lukas Schmelzeisen

Fueling many recent advances in NLP, continuous word embeddings have received much attention in recent years. However, the popular word2vec implementation assumes a single vector per word type, an approach that ignores polysemy and homonymy. This talk will give an overview of extensions to this model that support multiple senses per word, and discuss whether modeling flaws still remain and whether they can be solved.
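To make the motivation concrete, here is a minimal sketch (with made-up toy vectors, not data from any trained model) of the difference between one vector per word type and one vector per word sense. A single-vector model must place an ambiguous word like "bank" somewhere between its "river" and "money" contexts, while a multi-sense model can keep a separate vector per sense:

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# One vector per word *type* (word2vec-style): both senses of "bank"
# collapse into a single embedding (toy values for illustration).
single = {
    "bank":  [0.6, 0.6],
    "river": [1.0, 0.1],
    "money": [0.1, 1.0],
}

# One vector per (word, sense) pair, as in multi-sense extensions
# (hypothetical sense labels "geo" and "fin").
multi = {
    ("bank", "geo"): [0.95, 0.1],
    ("bank", "fin"): [0.1, 0.95],
}

# The single "bank" vector is equally similar to both contexts ...
print(cosine(single["bank"], single["river"]))
print(cosine(single["bank"], single["money"]))
# ... while each sense vector aligns closely with its own context.
print(cosine(multi[("bank", "geo")], single["river"]))
print(cosine(multi[("bank", "fin")], single["money"]))
```

The ambiguous single vector ends up only moderately similar to either context, whereas each sense vector is much closer to the context it belongs to.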

04.05.17 - 10:15
B 016