word2vec: Distributed Representations of Words
2020
Abstract
Learn vector representations of words using the continuous bag-of-words (CBOW) and skip-gram implementations of the 'word2vec' algorithm. The techniques are detailed in the paper "Distributed Representations of Words and Phrases and their Compositionality" by Mikolov et al. (2013), available at <doi:10.48550/arXiv.1310.4546>.
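The two training objectives differ in the direction of prediction: skip-gram predicts each context word from the center word, while CBOW predicts the center word from its surrounding context. A minimal sketch (not the package's implementation; the toy corpus, window size, and function names are illustrative) of how each objective frames a token sequence as training examples:

```python
# Illustrative sketch only: how skip-gram and CBOW turn a token
# sequence into training examples within a context window.

def skipgram_pairs(tokens, window=2):
    """(center, context) pairs: skip-gram predicts context from center."""
    pairs = []
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

def cbow_examples(tokens, window=2):
    """(context list, center) examples: CBOW predicts center from context."""
    examples = []
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        context = [tokens[j] for j in range(lo, hi) if j != i]
        if context:
            examples.append((context, center))
    return examples

corpus = "the quick brown fox".split()
print(skipgram_pairs(corpus, window=1))
# [('the', 'quick'), ('quick', 'the'), ('quick', 'brown'),
#  ('brown', 'quick'), ('brown', 'fox'), ('fox', 'brown')]
```

In the full algorithm these examples drive updates to the word vectors (typically via negative sampling or hierarchical softmax), so that words appearing in similar contexts end up with similar vectors.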
Related Papers
- Support vector machines and Word2vec for text classification with semantic features (2015), 457 citations
- Review on Word2Vec Word Embedding Neural Net (2020), 73 citations
- Analyzing Semantic Relations of Word Vectors trained by The Word2vec Model (2019), 1 citation