Contextual Text Understanding in Distributional Semantic Space

  • Jianpeng Cheng,
  • Zhongyuan Wang,
  • Ji-Rong Wen,
  • Jun Yan,
  • Zheng Chen

ACM International Conference on Information and Knowledge Management (CIKM)

Published by ACM - Association for Computing Machinery

Representing discrete words in a continuous vector space has proven useful for natural language applications related to text understanding. At the same time, it poses substantial challenges, one of which stems from the polysemous nature of human language. A common solution (a.k.a. word sense induction) is to split each word into multiple senses and create a representation for each sense. However, this approach is usually computationally expensive and prone to data sparsity, since each sense must be modeled discriminatively. In this work, we propose a new framework for generating context-aware text representations without diving into the sense space. Instead, we model the concept space shared among senses, yielding a framework that is efficient in both computation and storage. Specifically, the proposed framework: i) projects both words and concepts into the same vector space; ii) obtains unambiguous word representations that not only preserve the uniqueness among words but also reflect their context-appropriate meanings. We demonstrate the effectiveness of the framework on a number of text understanding tasks, including word/phrase similarity measurement, paraphrase identification, and question-answer relatedness classification.
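The core idea of the abstract — blending a word's vector with concept vectors that live in the same space, weighted by the surrounding context — can be sketched as follows. This is an illustrative toy, not the paper's trained model: the embeddings, the `contextual_rep` function, the softmax attention over concepts, and the mixing weight `alpha` are all assumptions made for demonstration.

```python
import numpy as np

# Toy embeddings in one shared space for words, concepts, and context
# words (randomly initialized here; the paper would learn these jointly).
rng = np.random.default_rng(0)
dim = 8
word_vecs = {"apple": rng.normal(size=dim)}
concept_vecs = {"fruit": rng.normal(size=dim), "company": rng.normal(size=dim)}
context_vecs = {"pie": rng.normal(size=dim), "iphone": rng.normal(size=dim)}

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def contextual_rep(word, context_words, alpha=0.5):
    """Blend the word's base vector with concept vectors weighted by
    their dot-product similarity to the averaged context vector.
    A single unambiguous representation is kept per word; only the
    concept mixture shifts with context (an assumed weighting scheme)."""
    ctx = np.mean([context_vecs[w] for w in context_words], axis=0)
    concepts = list(concept_vecs)
    weights = softmax(np.array([concept_vecs[c] @ ctx for c in concepts]))
    concept_mix = sum(w * concept_vecs[c] for w, c in zip(weights, concepts))
    return alpha * word_vecs[word] + (1 - alpha) * concept_mix

# The same word yields different representations under different contexts,
# without ever enumerating discrete senses.
rep_fruit = contextual_rep("apple", ["pie"])
rep_tech = contextual_rep("apple", ["iphone"])
```

Because only one vector per word and one per concept are stored, the cost grows with the (shared) concept inventory rather than with per-word sense inventories, which is the storage advantage the abstract claims.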