EURASIP Journal on Audio, Speech, and Music Processing
The recurrent neural network language model (RNNLM) has shown significant promise for statistical language modeling. In this work, a new class-based output layer method is introduced to further improve the RNNLM. In this method, word class information is incorporated into the output layer by utilizing the Brown clustering algorithm to estimate a class-based language model. Experimental results show that the new output layer with word clustering not only speeds up convergence noticeably but also reduces perplexity and word error rate in large vocabulary continuous speech recognition.
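The class-based factorization described in the abstract computes the probability of a word as the product of a class probability and a within-class word probability, P(w | h) = P(c(w) | h) · P(w | c(w), h), which replaces one softmax over the full vocabulary with two much smaller softmaxes. The sketch below illustrates this factorization in NumPy; all names (`word2class`, `class_words`, `W_class`, `W_word`) and the flat weight layout are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def class_factored_word_prob(h, word, word2class, class_words, W_class, W_word):
    """P(word | h) = P(class(word) | h) * P(word | class(word), h).

    h           : RNN hidden-state vector, shape (d,)
    word2class  : dict mapping each word to its (e.g. Brown) cluster id
    class_words : dict mapping cluster id -> ordered list of member words
    W_class     : (n_classes, d) output weights for the class softmax
    W_word      : dict mapping cluster id -> (n_members, d) weights
                  for the within-class softmax

    All parameter names and shapes here are hypothetical; the paper's
    actual architecture may differ.
    """
    c = word2class[word]
    p_class = softmax(W_class @ h)[c]                       # P(c | h)
    members = class_words[c]
    p_word = softmax(W_word[c] @ h)[members.index(word)]    # P(w | c, h)
    return p_class * p_word
```

Because each softmax now ranges over n_classes (or the members of one class) instead of the whole vocabulary, the output-layer cost per word drops from O(|V|) to roughly O(n_classes + |V|/n_classes), which is the usual motivation for class-based output layers.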
Shi, Yongzhe; Zhang, Wei-Qiang; Liu, Jia; and Johnson, Michael T., "RNN Language Model with Word Clustering and Class-based Output Layer" (2013). Electrical and Computer Engineering Faculty Research and Publications. 44.