Evaluating the Underlying Gender Bias in Contextualized Word Embeddings
2019, pp. 33–39
Abstract
Gender bias heavily impacts natural language processing applications. Word embeddings have been shown both to capture and to amplify the gender biases present in current data sources. Recently, contextualized word embeddings have improved on previous word embedding techniques by computing word vector representations that depend on the sentence the word appears in.
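To make the distinction concrete, the following is a minimal toy sketch (not the paper's method, and not a real embedding model): a static pseudo-embedding assigns one fixed vector per word type, while a "contextualized" variant mixes that vector with the average of the neighbouring words' vectors, so the same word receives a different representation in each sentence. The hashing scheme and mixing weights are illustrative assumptions only.

```python
import hashlib

def static_vector(word, dim=4):
    """Deterministic pseudo-embedding: one fixed vector per word type."""
    h = hashlib.md5(word.encode("utf-8")).digest()
    return [b / 255.0 for b in h[:dim]]

def contextual_vector(sentence, index, dim=4):
    """Toy contextualization: blend a word's static vector with the
    average of its neighbours' static vectors, so the vector depends
    on the sentence the word appears in."""
    words = sentence.lower().split()
    base = static_vector(words[index], dim)
    neighbours = [static_vector(w, dim)
                  for i, w in enumerate(words) if i != index]
    ctx = [sum(v[d] for v in neighbours) / len(neighbours)
           for d in range(dim)]
    return [0.5 * b + 0.5 * c for b, c in zip(base, ctx)]

s1 = "the doctor said she would operate"
s2 = "the doctor said he would operate"
# "doctor" is token 1 in both sentences: its static vector is identical,
# but its contextual vector differs because the surrounding words differ.
print(static_vector("doctor") == static_vector("doctor"))  # True
print(contextual_vector(s1, 1) == contextual_vector(s2, 1))  # False
```

This context-dependence is exactly what lets models like ELMo or BERT disambiguate word senses, and it is also why bias must be measured on contextual representations rather than on a single per-word vector.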