Combining Distributed Vector Representations for Words
2015, pp. 95–101
Abstract
Recent interest in distributed vector representations for words has resulted in an increased diversity of approaches, each with strengths and weaknesses. We demonstrate how diverse vector representations may be inexpensively composed into hybrid representations, effectively leveraging strengths of individual components, as evidenced by substantial improvements on a standard word analogy task. We further compare these results over different sizes of training sets and find these advantages are more pronounced when training data is limited. Finally, we explore the relative impacts of the differences in the learning methods themselves and the size of the contexts they access.