Hidden neural networks: a framework for HMM/NN hybrids
Abstract
This paper presents a general framework for hybrids of hidden Markov models (HMMs) and neural networks (NNs). In the new framework, called hidden neural networks (HNNs), the usual HMM probability parameters are replaced by neural network outputs. To ensure a probabilistic interpretation, the HNN is normalized globally, as opposed to the local normalization enforced on the parameters of standard HMMs. Furthermore, all parameters in the HNN are estimated simultaneously according to the discriminative conditional maximum likelihood (CML) criterion. The HNNs show clear performance gains over standard HMMs on TIMIT continuous speech recognition benchmarks. On the task of recognizing five broad phoneme classes, an accuracy of 84% is obtained, compared to 76% for a standard HMM. Additionally, we report a preliminary result of 69% accuracy on the TIMIT 39-phoneme task.
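The core idea in the abstract, replacing locally normalized HMM probabilities with unnormalized network scores and normalizing globally under the CML criterion, can be sketched numerically. The toy code below is a minimal illustration, not the paper's implementation: a hypothetical one-layer "network" produces unnormalized emission scores, and the conditional likelihood P(labels | observations) is the ratio of a clamped forward pass (paths forced through the label sequence) to a free forward pass (all paths), both computed in log space. All sizes, names, and the random data are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: T observations of dimension D, K hidden states.
T, K, D = 6, 3, 4
x = rng.normal(size=(T, D))           # observation features
y = rng.integers(0, K, size=T)        # "true" state label sequence

# A tiny linear "network" yields unnormalized emission log-scores;
# in an HNN these replace the locally normalized emission probabilities.
W = rng.normal(size=(D, K)) * 0.1
emit = x @ W                          # (T, K) log-scores, not normalized
trans = rng.normal(size=(K, K)) * 0.1 # unnormalized log transition scores

def log_forward(emit, trans, path=None):
    """Forward algorithm in log space. If `path` is given, sum only over
    state paths that follow those labels (the 'clamped' phase)."""
    T, K = emit.shape
    alpha = emit[0].copy()
    if path is not None:
        mask = np.full(K, -np.inf); mask[path[0]] = 0.0
        alpha = alpha + mask
    for t in range(1, T):
        # alpha_t[j] = emit[t, j] + logsumexp_i(alpha_{t-1}[i] + trans[i, j])
        m = alpha.max()
        alpha = emit[t] + m + np.log(np.exp(alpha - m) @ np.exp(trans))
        if path is not None:
            mask = np.full(K, -np.inf); mask[path[t]] = 0.0
            alpha = alpha + mask
    m = alpha.max()
    return m + np.log(np.exp(alpha - m).sum())

log_num = log_forward(emit, trans, path=y)  # clamped: paths matching y
log_Z = log_forward(emit, trans)            # free: global normalizer over all paths
cml = log_num - log_Z                       # log P(y | x)
print(cml)
```

Because the clamped pass sums over a subset of the paths in the free pass, `cml` is always a valid log-probability (at most 0); training would ascend its gradient with respect to the network weights, which is what "all parameters estimated simultaneously under CML" refers to.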