Title:
|
Transition-based dependency parsing with stack long short-term memory
|
Author:
|
Dyer, Chris; Ballesteros, Miguel; Ling, W; Matthews, A; Smith, Noah A.
|
Abstract:
|
We propose a technique for learning representations of parser states in transition-based dependency parsers. Our primary innovation is a new control structure for sequence-to-sequence neural networks — the stack LSTM. Like the conventional stack data structures used in transition-based parsing, elements can be pushed to or popped from the top of the stack in constant time, but, in addition, an LSTM maintains a continuous space embedding of the stack contents. This lets us formulate an efficient parsing model that captures three facets of a parser's state: (i) unbounded look-ahead into the buffer of incoming words, (ii) the complete history of actions taken by the parser, and (iii) the complete contents of the stack of partially built tree fragments, including their internal structures. Standard backpropagation techniques are used for training and yield state-of-the-art parsing performance. |
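The core idea in the abstract — a stack supporting constant-time push/pop whose contents are simultaneously summarized by an LSTM state at the stack pointer — can be illustrated with a small sketch. This is an illustrative toy, not the authors' implementation: the `ToyLSTMCell`, its dimensions, and all names here are assumptions for exposition.

```python
# Sketch of the stack LSTM idea: O(1) push/pop on a stack while the LSTM
# state at the stack pointer embeds the current stack contents.
# The toy LSTM cell below is an illustrative assumption, not the paper's code.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ToyLSTMCell:
    def __init__(self, input_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        # one combined weight matrix for the input, forget, output, candidate gates
        self.W = rng.normal(0.0, 0.1, (4 * hidden_dim, input_dim + hidden_dim))
        self.b = np.zeros(4 * hidden_dim)
        self.hidden_dim = hidden_dim

    def step(self, x, state):
        h, c = state
        z = self.W @ np.concatenate([x, h]) + self.b
        d = self.hidden_dim
        i, f, o = sigmoid(z[:d]), sigmoid(z[d:2*d]), sigmoid(z[2*d:3*d])
        g = np.tanh(z[3*d:])
        c_new = f * c + i * g
        h_new = o * np.tanh(c_new)
        return h_new, c_new

class StackLSTM:
    """Stack with constant-time push/pop; states[i] is the LSTM state
    summarizing the first i pushed elements, so states[-1] always embeds
    the full current stack contents."""
    def __init__(self, cell):
        self.cell = cell
        empty = (np.zeros(cell.hidden_dim), np.zeros(cell.hidden_dim))
        self.states = [empty]  # state for the empty stack

    def push(self, x):
        # advance the LSTM one step from the state at the stack pointer
        self.states.append(self.cell.step(x, self.states[-1]))

    def pop(self):
        # move the stack pointer back; no recomputation needed
        self.states.pop()

    def summary(self):
        # hidden state embedding the current stack contents
        return self.states[-1][0]
```

Popping simply moves the pointer back to an earlier LSTM state, which is what makes both operations constant time: after `push(a); push(b); pop()`, the summary is identical to the one after `push(a)` alone.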
Funding:
|
This work was sponsored in part by the U. S. Army Research Laboratory and the U. S. Army Research Office under contract/grant number W911NF-10-1-0533, and in part by NSF CAREER grant IIS-1054319. Miguel Ballesteros is supported by the European Commission under the contract numbers FP7-ICT-610411 (project MULTISENSOR) and H2020-RIA-645012 (project KRISTINA). |
Subject(s):
|
-Computational linguistics -Natural language processing (Computer science) |
Rights:
|
© ACL, Creative Commons Attribution-NonCommercial-ShareAlike 3.0 License
http://creativecommons.org/licenses/by-nc-sa/3.0/ |
Document type:
|
Conference article - Published version |
Published by:
|
ACL (Association for Computational Linguistics)
|