Joint Unsupervised Learning of Semantic Representation of Words and Roles in Dependency Trees

Michal Konkol
Proceedings of the International Conference Recent Advances in Natural Language Processing, RANLP 2017 (2017)
In this paper, we introduce WoRel, a model that jointly learns word embeddings and a semantic representation of word relations. The model learns from plain text sentences and their dependency parse trees. The word embeddings produced by WoRel outperform Skip-Gram and GloVe on word similarity and syntactic word analogy tasks, and achieve comparable results on word relatedness and semantic word analogy tasks. We show that the semantic representation of relations enables us to express the meaning of phrases and is a promising research direction for semantics at the sentence level.
