Deep Learning Embeddings for Discontinuous Linguistic Units
classification: cs.CL
keywords: embeddings, learning, linguistic units, deep, discontinuous, word
abstract
Deep learning embeddings have been successfully used for many natural language processing problems. Embeddings are mostly computed for word forms although a number of recent papers have extended this to other linguistic units like morphemes and phrases. In this paper, we argue that learning embeddings for discontinuous linguistic units should also be considered. In an experimental evaluation on coreference resolution, we show that such embeddings perform better than word form embeddings.
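To make the notion of a "discontinuous linguistic unit" concrete, consider a particle verb such as "picked ... up" in "picked the book up": its two tokens are not adjacent, so a single word-form embedding cannot represent it. The sketch below is purely illustrative and is not the paper's method (the paper argues for learning embeddings for such units directly); it shows only the naive word-form baseline, composing toy vectors for the non-adjacent tokens of one hypothetical unit.

```python
# Illustrative sketch only: contrasts the word-form baseline with the idea
# of discontinuous units. All vectors and the sentence are toy/hypothetical.

def mean_vectors(vectors):
    """Component-wise mean of a list of equal-length vectors."""
    dim = len(vectors[0])
    return [sum(v[i] for v in vectors) / len(vectors) for i in range(dim)]

# Hypothetical 3-dimensional word-form embeddings.
word_vec = {
    "picked": [0.2, 0.4, 0.1],
    "the":    [0.0, 0.1, 0.0],
    "book":   [0.5, 0.2, 0.3],
    "up":     [0.1, 0.3, 0.6],
}

# Sentence: "picked the book up".
# The particle verb "picked ... up" is a discontinuous unit whose
# tokens sit at the non-adjacent positions 0 and 3.
sentence = ["picked", "the", "book", "up"]
unit_positions = [0, 3]

# Naive baseline: approximate the unit by averaging the word-form
# vectors of its parts (the paper instead proposes learning an
# embedding for the unit itself).
unit_vec = mean_vectors([word_vec[sentence[i]] for i in unit_positions])
print(unit_vec)
```

The baseline's vector for "picked ... up" is just the average of its parts, which cannot capture the unit's non-compositional meaning; that gap is what motivates learning dedicated embeddings for discontinuous units.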