How to Generate a Good Word Embedding?
IEEE Intelligent Systems, 2017, pp. 1–1
Abstract
The authors analyze three critical components in training word embeddings: model, corpus, and training parameters. They systematize existing neural-network-based word embedding methods and experimentally compare them using the same corpus. They then evaluate each word embedding in three ways: analyzing its semantic properties, using it as a feature for supervised tasks, and using it to initialize neural networks. They also provide several simple guidelines for training good word embeddings.
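One of the evaluation routes the abstract mentions is analyzing an embedding's semantic properties, most commonly by scoring word pairs with cosine similarity. A minimal sketch of that idea, using toy hand-picked vectors rather than embeddings actually trained on a corpus (all values here are hypothetical):

```python
import numpy as np

# Toy 4-dimensional "embeddings" for illustration only; real word vectors
# would be learned from a large corpus by a neural embedding model.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1, 0.6]),
    "queen": np.array([0.7, 0.4, 0.2, 0.6]),
    "apple": np.array([0.1, 0.9, 0.7, 0.0]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors, a standard semantic-similarity score."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should score higher than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))
print(cosine_similarity(embeddings["king"], embeddings["apple"]))
```

Benchmarks such as word-similarity datasets compare these cosine scores against human judgments to assess embedding quality.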
Related Papers
- How to Generate a Good Word Embedding (2016), 316 citations
- How to Generate a Good Word Embedding? (2017), 28 citations
- Deep Learning and Word Embeddings for Tweet Classification for Crisis Response (2019), 21 citations
- Deconstructing Word Embeddings (2019), 1 citation
- Dual embedding with input embedding and output embedding for better word representation (2022)