2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp.2011.5947611
Extensions of recurrent neural network language model

Cited by 3,120 publications (3,171 citation statements). References 13 publications.
“…Convolutional neural networks take advantage of existing local structure and share weights, which can dramatically reduce the over-fitting problems that occur in fully connected networks. For the sequence data of acoustics (Sak et al., 2014) and language (Mikolov et al., 2010; Vinyals et al., 2015), models with recurrent structures have brought significant improvements to state-of-the-art performance. These models introduce chain-like loop structures into recurrent neural networks (Funahashi and Nakamura, 1993), as shown in Figs.…”
Section: Trends in the Development of AI Technology Applications For…
Citation type: mentioning
Confidence: 99%
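As a companion to the chain-like loop structure this excerpt describes, here is a minimal sketch of an Elman-style recurrent step in Python/NumPy. The dimensions and variable names (W_xh, W_hh, x_seq) are illustrative assumptions, not taken from the cited papers; the point is that the same weights are reused at every time step.

    import numpy as np

    # Toy dimensions (assumptions for illustration only).
    input_dim, hidden_dim, seq_len = 8, 16, 5
    rng = np.random.default_rng(0)

    # The same weights are shared across every time step; unrolling the
    # loop over the sequence gives the chain-like structure.
    W_xh = rng.normal(scale=0.1, size=(hidden_dim, input_dim))
    W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
    b = np.zeros(hidden_dim)

    x_seq = rng.normal(size=(seq_len, input_dim))  # one input vector per step
    h = np.zeros(hidden_dim)                       # initial hidden state

    for x_t in x_seq:
        # h_t = tanh(W_xh x_t + W_hh h_{t-1} + b): each hidden state is
        # fed back into the next step of the chain.
        h = np.tanh(W_xh @ x_t + W_hh @ h + b)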
“…The main idea is to understand a word not as a separate entity with no specific relation to other words, but to see words as points in a finite-dimensional (separable) metric space. Recurrent neural networks, which use a similar principle, were successfully introduced to the field of language modelling by T. Mikolov [5] and have become a widely used language modelling technique.…”
Section: Recurrent Neural Network Language Model
Citation type: mentioning
Confidence: 99%
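A small sketch of the "words as points in a metric space" idea: an embedding table maps word indices to dense vectors, and a distance between vectors stands in for word relatedness. The toy vocabulary, dimensionality, and random initialisation are hypothetical; in a trained model the table is learned.

    import numpy as np

    vocab = {"king": 0, "queen": 1, "apple": 2}  # hypothetical toy vocabulary
    embed_dim = 4
    rng = np.random.default_rng(1)

    # Each word is a point in R^embed_dim. A trained model learns this
    # table; random values here are purely for illustration.
    E = rng.normal(size=(len(vocab), embed_dim))

    def cosine_distance(u, v):
        # Dissimilarity between two word vectors.
        return 1.0 - (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))

    print(cosine_distance(E[vocab["king"]], E[vocab["queen"]]))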
“…In recent years, recurrent neural networks (RNN) have attracted attention among other types of language models (LM) owing to their better performance [5] and their ability to learn from a smaller corpus than conventional n-gram models. Nowadays they are considered state of the art, especially the Long Short-Term Memory (LSTM) variant.…”
Section: Introduction
Citation type: mentioning
Confidence: 99%
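Since this excerpt singles out the LSTM variant, here is a minimal single-step LSTM cell in NumPy following the standard gate equations; the stacked-weight layout and toy dimensions are generic assumptions, not the cited implementation.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def lstm_step(x_t, h_prev, c_prev, W, U, b):
        # Standard LSTM gates: input (i), forget (f), output (o), candidate (g).
        # W: (4H, D), U: (4H, H), b: (4H,); the four gate blocks are stacked.
        z = W @ x_t + U @ h_prev + b
        H = h_prev.shape[0]
        i = sigmoid(z[0:H])
        f = sigmoid(z[H:2*H])
        o = sigmoid(z[2*H:3*H])
        g = np.tanh(z[3*H:4*H])
        c = f * c_prev + i * g      # cell state carries long-range context
        h = o * np.tanh(c)          # hidden state exposed to the next layer
        return h, c

    # Toy dimensions (assumptions).
    D, H = 8, 16
    rng = np.random.default_rng(2)
    W = rng.normal(scale=0.1, size=(4*H, D))
    U = rng.normal(scale=0.1, size=(4*H, H))
    b = np.zeros(4*H)
    h, c = lstm_step(rng.normal(size=D), np.zeros(H), np.zeros(H), W, U, b)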
“…In Mikolov et al. [26], a recurrent neural network (RNN) [24] is used to build a neural language model. The RNN model encodes the context word by word and predicts the next word.…”
Section: Word Embedding-based Models
Citation type: mentioning
Confidence: 99%
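A sketch of the encode-word-by-word, predict-next-word behaviour described above: each step updates the hidden state from the current word's embedding, and a softmax over the vocabulary gives the next-word distribution. All parameter names and sizes are illustrative, and the parameters are untrained.

    import numpy as np

    rng = np.random.default_rng(3)
    vocab_size, embed_dim, hidden_dim = 10, 8, 16

    # Untrained parameters, for shape illustration only.
    E = rng.normal(scale=0.1, size=(vocab_size, embed_dim))      # embeddings
    W_xh = rng.normal(scale=0.1, size=(hidden_dim, embed_dim))
    W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
    W_hy = rng.normal(scale=0.1, size=(vocab_size, hidden_dim))  # output layer

    def softmax(z):
        z = z - z.max()
        e = np.exp(z)
        return e / e.sum()

    context = [3, 1, 4]          # word indices of the context so far
    h = np.zeros(hidden_dim)
    for w in context:
        # Encode the context one word at a time.
        h = np.tanh(W_xh @ E[w] + W_hh @ h)

    p_next = softmax(W_hy @ h)   # distribution over the next word
    print("predicted next word index:", int(p_next.argmax()))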