pith. machine review for the scientific record.

arxiv: 1508.04257 · v2 · submitted 2015-08-18 · 💻 cs.CL

Recognition: unknown

Learning Meta-Embeddings by Using Ensembles of Embedding Sets

Authors on Pith: no claims yet
classification: 💻 cs.CL
keywords: embedding, meta-embeddings, learning, sets, different, tasks, word, advanced
read the original abstract

Word embeddings -- distributed representations of words -- in deep learning are beneficial for many tasks in natural language processing (NLP). However, different embedding sets vary greatly in quality and characteristics of the captured semantics. Instead of relying on a more advanced algorithm for embedding learning, this paper proposes an ensemble approach of combining different public embedding sets with the aim of learning meta-embeddings. Experiments on word similarity and analogy tasks and on part-of-speech tagging show better performance of meta-embeddings compared to individual embedding sets. One advantage of meta-embeddings is the increased vocabulary coverage. We will release our meta-embeddings publicly.
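The ensemble idea in the abstract can be illustrated with a minimal sketch. The toy vocabularies and vectors below are hypothetical, and zero-filling missing words is just one simple combination choice, not necessarily the paper's method; it shows how concatenating sets yields both a meta-embedding and the increased vocabulary coverage (the union of the individual vocabularies) that the abstract mentions.

```python
import numpy as np

# Hypothetical toy embedding sets (illustrative, not the actual public sets).
set_a = {"king": np.array([0.1, 0.9]), "queen": np.array([0.2, 0.8])}
set_b = {"king": np.array([0.7, 0.1, 0.2]), "apple": np.array([0.3, 0.3, 0.4])}

def l2norm(v):
    """Normalize a vector so sets with different scales contribute equally."""
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

def concat_meta(word, sets):
    """Concatenate L2-normalized vectors across sets.

    When a set lacks the word, fill with zeros -- a simple choice that lets
    the meta-embedding cover the union of the individual vocabularies.
    """
    parts = []
    for s in sets:
        dim = len(next(iter(s.values())))
        parts.append(l2norm(s[word]) if word in s else np.zeros(dim))
    return np.concatenate(parts)

vocab = set(set_a) | set(set_b)  # union vocabulary: coverage grows
meta = {w: concat_meta(w, [set_a, set_b]) for w in vocab}
print(sorted(vocab))        # ['apple', 'king', 'queen']
print(meta["king"].shape)   # (5,) = 2-dim + 3-dim concatenation
```

Concatenation makes the meta-embedding dimensionality the sum of the input dimensions; a dimensionality-reduction step (e.g. SVD) over the concatenation is a common follow-up when a compact representation is needed.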

This paper has not been read by Pith yet.

discussion (0)

Sign in with ORCID, Apple, or X to comment. Anyone can read Pith papers without signing in.