Deep Contextualized Word Embeddings

This talk will present the paper "Deep Contextualized Word Representations" by Peters et al., which won the best paper award at NAACL-HLT 2018. Its approach, ELMo (Embeddings from Language Models), constitutes a fundamentally new way of representing words that takes the linguistic context of each word into account, and it achieved state-of-the-art results on six important NLP tasks.
The talk will introduce the required background material (recurrent neural networks, word embeddings, and language models) and summarize the key contributions of the paper.
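
What makes ELMo "contextualized" is that each token's vector is computed from the hidden states of a pre-trained bidirectional language model rather than looked up in a fixed table; the layer outputs are combined as a softmax-weighted sum scaled by a learned factor gamma. The following Python sketch (not the authors' code; names and shapes are illustrative) shows only this combination step:

    import numpy as np

    # Sketch of the ELMo layer combination:
    #   ELMo_k = gamma * sum_j s_j * h_{k,j}
    # where h_{k,j} is the biLM representation of token k at layer j,
    # s_j are softmax-normalized layer weights, and gamma is a scalar.

    def elmo_embedding(layer_states, layer_weights, gamma=1.0):
        """layer_states: array of shape (num_layers, num_tokens, dim).
        layer_weights: one unnormalized scalar per layer."""
        s = np.exp(layer_weights - np.max(layer_weights))
        s /= s.sum()                              # softmax over layers
        # Weighted sum over the layer axis -> (num_tokens, dim)
        return gamma * np.tensordot(s, layer_states, axes=(0, 0))

    # Toy usage with random numbers standing in for real biLM activations.
    rng = np.random.default_rng(0)
    states = rng.standard_normal((3, 5, 8))       # 3 layers, 5 tokens, 8-dim
    weights = np.zeros(3)                         # uniform layer weights
    print(elmo_embedding(states, weights).shape)  # (5, 8)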

16.08.2018 - 10:15
B 016