Multi-Sense Word Embeddings

Continuous word embeddings fuel many recent advances in NLP and have received considerable attention in recent years. However, the popular word2vec implementation assigns a single vector to each word type, an approach that ignores polysemy and homonymy. This talk gives an overview of extensions to this model that support multiple senses per word, and discusses which modeling flaws still remain and whether they can be resolved.
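The contrast between the single-vector assumption and its multi-sense relaxation can be sketched with a toy lookup table. All vectors and sense indices below are made-up illustrations, not the output of any trained model:

```python
# Toy sketch (hypothetical vectors) contrasting the two modeling choices.

# word2vec-style: one vector per word *type* -- both senses of "bank"
# (financial institution vs. side of a river) are forced into a single
# shared representation.
single_sense = {
    "bank":  [0.12, -0.40, 0.33],
    "river": [0.50, 0.10, -0.20],
    "money": [-0.30, 0.25, 0.41],
}

# Multi-sense extension: one vector per (word, sense) pair, so the two
# senses of "bank" can occupy different regions of the embedding space.
multi_sense = {
    ("bank", 0):  [0.15, 0.30, 0.40],   # financial institution
    ("bank", 1):  [0.48, 0.05, -0.22],  # side of a river
    ("river", 0): [0.50, 0.10, -0.20],
    ("money", 0): [-0.30, 0.25, 0.41],
}

# With one vector per type, the surrounding context cannot change the
# lookup: "bank" maps to the same vector in every sentence.
bank_financial_context = single_sense["bank"]  # "deposit money at the bank"
bank_river_context = single_sense["bank"]      # "sit on the river bank"
assert bank_financial_context == bank_river_context

# With sense-indexed vectors, the two senses are distinct:
assert multi_sense[("bank", 0)] != multi_sense[("bank", 1)]
```

Multi-sense models then need an additional mechanism (e.g. clustering the context words) to decide which sense index to use for each token occurrence, which is where much of the modeling difficulty discussed in the talk arises.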

04.05.2017 - 10:15
B 017