A recurrent neural network (RNN) is a type of neural network in which the output of the previous step is fed back as input to the current step. Typical applications include speech recognition, handwriting recognition and the analysis of sequential data; recurrent networks can also be used generatively, for example to produce program code that satisfies a predefined objective. The working process of an RNN starts with providing input to the model. The representation of the data in the input layer is computed and passed to the hidden layer, which performs sequence modelling and can be trained in the forward or backward direction. Multiple hidden layers can also be used; the final hidden layer sends the processed result to the output layer.

Long short-term memory (LSTM) is currently a popular improvement on the basic RNN; its building blocks are known as cells. It is effective on data sequences that require memory of earlier events. Each cell takes the output of the previous state together with the present input and decides which information should be retained and which should be neglected. The previous state, the present memory and the current input are combined to predict the next output. Common applications of RNNs include language modelling and prediction, speech recognition, machine translation and image recognition. How the RNN works is shown in Figure 1.10.
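To make the basic forward pass described above concrete, here is a minimal sketch of a single recurrent step in Python with NumPy; the layer sizes, weight names and dummy data are illustrative assumptions rather than details from the text.

# Illustrative single-step RNN sketch; sizes and weight names are assumptions.
import numpy as np

input_size, hidden_size, output_size = 8, 16, 4
rng = np.random.default_rng(0)

W_xh = rng.normal(size=(hidden_size, input_size))   # input layer -> hidden layer
W_hh = rng.normal(size=(hidden_size, hidden_size))  # hidden -> hidden (the recurrence)
W_hy = rng.normal(size=(output_size, hidden_size))  # hidden layer -> output layer

def rnn_step(x_t, h_prev):
    # The previous step's hidden state is fed back in alongside the current input.
    h_t = np.tanh(W_xh @ x_t + W_hh @ h_prev)
    y_t = W_hy @ h_t        # processed result sent to the output layer
    return y_t, h_t

# Process a short dummy sequence, carrying the hidden state forward step by step.
h = np.zeros(hidden_size)
for x_t in rng.normal(size=(5, input_size)):
    y_t, h = rnn_step(x_t, h)

The key point is the W_hh term: it is what allows the output of the previous step to influence the current one.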

Figure 1.10   Working model of an RNN.
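Building on the LSTM description above, the sketch below (again in NumPy, with assumed weight names and sizes, and biases omitted for brevity) shows the gating inside one cell: the forget gate decides which information to neglect, the input gate decides what to retain, and the previous memory, the present memory and the current input combine to produce the next output.

# Illustrative LSTM cell step; weight names, sizes and data are assumptions.
import numpy as np

input_size, hidden_size = 8, 16
rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One weight matrix per gate, acting on the concatenation of the previous
# hidden state and the current input.
W_f, W_i, W_o, W_c = (rng.normal(size=(hidden_size, hidden_size + input_size))
                      for _ in range(4))

def lstm_step(x_t, h_prev, c_prev):
    z = np.concatenate([h_prev, x_t])   # previous state joins the present input
    f = sigmoid(W_f @ z)                # forget gate: what to neglect
    i = sigmoid(W_i @ z)                # input gate: what to retain
    c_tilde = np.tanh(W_c @ z)          # candidate new memory
    c_t = f * c_prev + i * c_tilde      # previous memory blended with new content
    o = sigmoid(W_o @ z)                # output gate
    h_t = o * np.tanh(c_t)              # next hidden state, used to predict the output
    return h_t, c_t

h = np.zeros(hidden_size)
c = np.zeros(hidden_size)
for x_t in rng.normal(size=(3, input_size)):
    h, c = lstm_step(x_t, h, c)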
