2019, pp. 30–36
Abstract
We show that BERT (Devlin et al., 2018) is a Markov random field language model. This formulation gives rise to a natural procedure for sampling sentences from BERT. We generate from BERT and find that it can produce high-quality, fluent generations. Compared to the generations of a traditional left-to-right language model, BERT generates sentences that are more diverse but of slightly worse quality.
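The sampling procedure the abstract alludes to is, in the paper, a Gibbs-style sweep: repeatedly pick a position, mask it, and resample the token from BERT's masked-token conditional. The sketch below illustrates that loop with a toy uniform conditional standing in for BERT's masked-LM head (loading an actual pretrained model is omitted); `toy_conditional`, `VOCAB`, and all other names here are hypothetical illustrations, not the authors' code.

```python
import random

# Toy vocabulary; a real run would use BERT's WordPiece vocabulary.
VOCAB = ["the", "cat", "sat", "on", "mat", "dog", "ran"]

def toy_conditional(tokens, pos):
    # Stand-in for BERT's conditional p(x_pos | x_{-pos}) obtained by
    # masking position `pos` and reading the masked-LM logits.
    # Here: a uniform distribution over the toy vocabulary.
    return {w: 1.0 / len(VOCAB) for w in VOCAB}

def gibbs_sample_sentence(length, sweeps=10, seed=0):
    rng = random.Random(seed)
    # Initialize with random tokens (other initializations, e.g. all
    # [MASK] tokens, are possible).
    tokens = [rng.choice(VOCAB) for _ in range(length)]
    for _ in range(sweeps):
        for pos in range(length):
            # Resample one position from its conditional distribution,
            # holding all other positions fixed.
            dist = toy_conditional(tokens, pos)
            words = list(dist)
            weights = [dist[w] for w in words]
            tokens[pos] = rng.choices(words, weights=weights)[0]
    return tokens

print(" ".join(gibbs_sample_sentence(5)))
```

With BERT's actual conditionals plugged in, each sweep nudges the sentence toward configurations the model assigns high probability, which is the source of the fluent but diverse generations described above.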