Abstract

Distributional Semantic Models, which automatically induce word meaning representations from naturally occurring textual data, are a success story of computational linguistics. Recently, there has been much interest in whether such models, endowed with a compositional component, can also successfully approximate the meaning of phrases and sentences. In this article, mostly addressed to theoretical linguists curious about distributional semantics, I first discuss why developing compositional Distributional Semantic Models is an interesting and important pursuit, and then I survey current ideas about how this can be achieved.