Friday, April 28, 2017
11:45 AM
CSB 703
Xiaochang Peng
University of Rochester
Addressing the Data Sparsity Issue in Neural AMR Parsing
Abstract: Neural attention models have achieved great success on a range of NLP tasks. However, they have yet to fulfill their promise on Abstract Meaning Representation (AMR) parsing, largely because of data sparsity. In this talk, we describe a sequence-to-sequence model for AMR parsing and present several ways to tackle the data sparsity problem. We show that our methods achieve significant improvements over a baseline neural attention model, and that our results are competitive with state-of-the-art systems that do not use extra linguistic resources.
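As background for the sparsity-reduction theme of the talk, the sketch below illustrates one standard preprocessing trick in this setting: replacing rare open-class tokens (names, numbers) with category placeholders so the sequence-to-sequence model sees far fewer distinct symbols. This is a minimal illustration under assumed heuristics; the threshold, category names, and functions here are hypothetical, not the speaker's actual method.

```python
from collections import Counter

RARE_THRESHOLD = 2  # hypothetical cutoff: tokens seen <= 2 times count as rare

def build_vocab(corpus):
    """Count token frequencies over a tokenized corpus (list of token lists)."""
    counts = Counter()
    for sentence in corpus:
        counts.update(sentence)
    return counts

def categorize(token):
    """Map a rare token to a coarse category placeholder (illustrative rules)."""
    if token.isdigit():
        return "<NUM>"
    if token[:1].isupper():
        return "<NAME>"  # crude proxy for a named entity
    return "<UNK>"

def anonymize(sentence, counts):
    """Replace rare tokens with category placeholders; keep frequent ones."""
    return [tok if counts[tok] > RARE_THRESHOLD else categorize(tok)
            for tok in sentence]

if __name__ == "__main__":
    corpus = [["John", "visited", "Paris", "in", "2015"],
              ["Mary", "visited", "London", "in", "2016"],
              ["they", "visited", "in", "the", "summer"]]
    counts = build_vocab(corpus)
    print(anonymize(corpus[0], counts))
    # -> ['<NAME>', 'visited', '<NAME>', 'in', '<NUM>']
```

The placeholders can be restored after decoding by aligning them back to the original tokens, so the model learns a compact vocabulary without losing surface information.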