Automatic Extraction of Compositional Matrix-Space Models of Language
Learning word representations in distributional semantic models to capture the semantics and compositionality of natural language is a central research area of computational linguistics. Compositional Matrix-Space Models (CMSMs) offer an alternative to Vector Space Models (VSMs): words are represented as matrices rather than vectors, and phrases are composed by matrix multiplication. This talk presents results on learning CMSMs for natural language processing tasks such as sentiment analysis of short phrases. It then introduces a new semantic relatedness dataset for evaluating how well compositional distributional semantic models capture semantic composition in language.
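As a rough illustration (not part of the talk itself), the core idea of a CMSM can be sketched in a few lines: each word maps to a square matrix, and a phrase's representation is the ordered product of its word matrices. The toy lexicon below uses random matrices purely for demonstration; in a learned CMSM these would be trained for a task such as sentiment analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 3

# Hypothetical toy lexicon: each word is assigned a random dim x dim matrix.
# In a real CMSM these matrices are learned from data, not sampled.
lexicon = {w: rng.normal(scale=0.5, size=(dim, dim))
           for w in ["not", "very", "good"]}

def compose(phrase):
    """Compose a phrase by multiplying its word matrices left to right."""
    result = np.eye(dim)
    for word in phrase.split():
        result = result @ lexicon[word]
    return result

m1 = compose("not very good")
m2 = compose("very not good")
# Matrix multiplication is associative but not commutative, so the
# representation is sensitive to word order, unlike vector addition.
print(np.allclose(m1, m2))  # generally False: order matters
```

Because matrix multiplication is associative, a phrase representation can be built incrementally left to right, which is one of the properties that makes CMSMs attractive for sequential composition.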
Shima is currently a PhD candidate working on machine learning and natural language processing in the QuantLA (Quantitative Logics and Automata) research training group at TU Dresden. She completed her Bachelor's and Master's degrees in computer science at Tehran Polytechnic, Iran.
07.11.19 - 10:15