pith. machine review for the scientific record.

arxiv: 1504.04666 · v1 · submitted 2015-04-18 · 💻 cs.CL · cs.LG


Unsupervised Dependency Parsing: Let's Use Supervised Parsers

keywords: parsing, unsupervised, dependency, supervised, trees, approach, parser, accuracy
abstract

We present a self-training approach to unsupervised dependency parsing that reuses existing supervised and unsupervised parsing algorithms. Our approach, called 'iterated reranking' (IR), starts with dependency trees generated by an unsupervised parser, and iteratively improves these trees using the richer probability models used in supervised parsing, which are in turn trained on these trees. Our system achieves 1.8% higher accuracy than the state-of-the-art parser of Spitkovsky et al. (2013) on the WSJ corpus.
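The self-training loop the abstract describes can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the toy parsers, the `Counter`-based scoring model, and the two-candidate generator are all stand-in assumptions used only to show the iterate-train-rerank structure.

```python
# Sketch of the iterated-reranking (IR) self-training loop: seed with trees
# from an unsupervised parser, train a richer model on those trees, use it
# to rerank candidate parses, and repeat. All components here are toy
# stand-ins, not the actual parsers used in the paper.
from collections import Counter

def initial_trees(sentences):
    # Stand-in for an unsupervised parser: attach each word to its left
    # neighbour; word 0 attaches to the root (head index -1).
    return [[i - 1 for i in range(len(s))] for s in sentences]

def train(sentences, trees):
    # Stand-in "richer supervised" model: head-dependent word-pair counts.
    model = Counter()
    for sent, heads in zip(sentences, trees):
        for dep, head in enumerate(heads):
            head_word = "ROOT" if head < 0 else sent[head]
            model[(head_word, sent[dep])] += 1
    return model

def score(model, sent, heads):
    # Score a candidate tree as the sum of its arc counts under the model.
    return sum(model[("ROOT" if h < 0 else sent[h], sent[d])]
               for d, h in enumerate(heads))

def k_best(sent):
    # Stand-in candidate generator: a left-branching and a right-branching chain.
    n = len(sent)
    left = [i - 1 for i in range(n)]
    right = [i + 1 for i in range(n - 1)] + [-1]
    return [left, right]

def iterated_reranking(sentences, rounds=3):
    trees = initial_trees(sentences)
    for _ in range(rounds):
        model = train(sentences, trees)              # train on current trees
        trees = [max(k_best(s), key=lambda t: score(model, s, t))
                 for s in sentences]                 # rerank candidates with it
    return trees
```

The key property the paper exploits is the alternation: the trees supervise the model, and the model then re-selects the trees, so each component bootstraps the other.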

This paper has not been read by Pith yet.

discussion (0)
