Dynamic Meta-Embeddings for Improved Sentence Representations
2018, pp. 1466–1477
Top 1% of 2018 papers by citations
Abstract
While one of the first steps in many NLP systems is selecting which pre-trained word embeddings to use, we argue that such a step is better left for neural networks to figure out by themselves. To that end, we introduce dynamic meta-embeddings, a simple yet effective method for the supervised learning of embedding ensembles, which leads to state-of-the-art performance within the same model class on a variety of tasks. We subsequently show how the technique can be used to shed new light on the usage of word embeddings in NLP systems.
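The core idea of a dynamic meta-embedding can be sketched in a few lines: project each pre-trained embedding of a token into a common space, score each projection with a learned self-attention, and take the attention-weighted sum. The sketch below illustrates this in NumPy under stated assumptions; the parameter names (`projections`, `attn_w`, `attn_b`) are illustrative, not taken from the paper's released code, and in practice these parameters would be trained end-to-end with the downstream task.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def dynamic_meta_embedding(token_embeddings, projections, attn_w, attn_b=0.0):
    """Combine several pre-trained embeddings of one token into a single vector.

    token_embeddings: list of 1-D arrays, one per embedding type (sizes may differ).
    projections: list of matrices mapping each embedding to a common dimension d'.
    attn_w, attn_b: parameters of a scalar self-attention over embedding types.
    Returns the meta-embedding (shape (d',)) and the attention weights.
    (Hypothetical parameter names; a minimal sketch, not the authors' implementation.)
    """
    # Project every embedding type into the shared d'-dimensional space.
    projected = [P @ e for P, e in zip(projections, token_embeddings)]
    # One scalar attention score per embedding type.
    scores = np.array([float(attn_w @ p + attn_b) for p in projected])
    alphas = softmax(scores)  # weights over embedding types, sum to 1
    meta = sum(a * p for a, p in zip(alphas, projected))
    return meta, alphas

# Usage: mix a 300-d and a 100-d embedding of the same token into a 64-d vector.
rng = np.random.default_rng(0)
embs = [rng.standard_normal(300), rng.standard_normal(100)]
projs = [0.1 * rng.standard_normal((64, 300)), 0.1 * rng.standard_normal((64, 100))]
attn_w = 0.1 * rng.standard_normal(64)
meta, alphas = dynamic_meta_embedding(embs, projs, attn_w)
```

At training time, gradients flow through both the projections and the attention parameters, so the network itself learns how much to rely on each embedding set per token, which is what lets the technique "figure out" the embedding choice.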
Related Papers
- → How to Generate a Good Word Embedding (2016), 316 citations
- → How to Generate a Good Word Embedding? (2017), 28 citations
- → Deep Learning and Word Embeddings for Tweet Classification for Crisis Response (2019), 21 citations
- → Deconstructing Word Embeddings (2019), 1 citation
- → Dual embedding with input embedding and output embedding for better word representation (2022)