Segment-Level Sequence Modeling using Gated Recursive Semi-Markov Conditional Random Fields

Jingwei Zhuo, Yong Cao, Jun Zhu, Bo Zhang, Zaiqing Nie

Published by the Association for Computational Linguistics (ACL)

Most sequence tagging tasks in natural language processing require recognizing segments of a sentence that carry a certain syntactic role or semantic meaning. These tasks are usually tackled with Conditional Random Fields (CRFs), which model segments only indirectly through word-level labels and word-level features, and thus cannot make full use of segment-level information. Semi-Markov Conditional Random Fields (Semi-CRFs) model segments directly, but extracting segment-level features for Semi-CRFs remains a very challenging problem. This paper presents Gated Recursive Semi-CRFs (grSemi-CRFs), which model segments directly and automatically learn segment-level features through a gated recursive convolutional neural network. Our experiments on text chunking and named entity recognition (NER) demonstrate that grSemi-CRFs generally outperform other neural models.
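To make the "model segments directly" point concrete, the sketch below shows the forward (log-partition) recursion of a plain Semi-CRF, which sums over all segmentations and label assignments rather than over word-level label sequences. It is a minimal NumPy illustration, not the paper's grSemi-CRF implementation: the segment scores that the paper would obtain from the gated recursive convolutional network are here passed in as a hypothetical, precomputed tensor `seg_scores`, and `trans` and `max_len` are likewise assumed inputs.

```python
import numpy as np
from scipy.special import logsumexp


def semicrf_log_partition(seg_scores, trans, max_len):
    """Log partition function of a semi-Markov CRF.

    seg_scores[i, j, y]: score of labeling the segment covering positions
        i..j (inclusive) with label y; only spans with j - i + 1 <= max_len
        are considered.
    trans[y_prev, y]: transition score between labels of adjacent segments.
    """
    n, _, num_labels = seg_scores.shape
    # alpha[j, y]: log-sum of scores of all segmentations of x[0..j]
    # whose last segment is labeled y.
    alpha = np.full((n, num_labels), -np.inf)
    for j in range(n):
        for d in range(1, max_len + 1):
            i = j - d + 1
            if i < 0:
                break
            emit = seg_scores[i, j]  # shape: (num_labels,)
            if i == 0:
                # The segment starts the sentence: no transition term.
                alpha[j] = np.logaddexp(alpha[j], emit)
            else:
                # Combine with every segmentation ending at position i - 1.
                prev = alpha[i - 1][:, None] + trans + emit[None, :]
                alpha[j] = np.logaddexp(alpha[j], logsumexp(prev, axis=0))
    return logsumexp(alpha[n - 1])


# Toy usage with random scores (hypothetical dimensions).
n, max_len, num_labels = 6, 3, 4
rng = np.random.default_rng(0)
seg_scores = rng.normal(size=(n, n, num_labels))
trans = rng.normal(size=(num_labels, num_labels))
print(semicrf_log_partition(seg_scores, trans, max_len))
```

Because the recursion enumerates spans of length up to `max_len` at every position, the dynamic program runs in O(n · max_len · num_labels²) time, in contrast to a word-level CRF whose forward pass scores one position at a time; this is what allows features (and, in grSemi-CRFs, learned representations) to be attached to whole segments.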