
TensorFlow | Training of RNN

by Online Tutorials Library

Training of RNN in TensorFlow

Recurrent neural networks (RNNs) are a class of deep learning models that process data sequentially. In an ordinary feed-forward neural network, every input and output is assumed to be independent of all the others. Recurrent networks drop this assumption: they are called recurrent because they perform the same mathematical computation step by step along a sequence, and the result at each step depends on the results of the previous steps.
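For example, at every time step an RNN reuses the same weights and combines the current input with the hidden state carried over from the previous step. The short NumPy sketch below illustrates only this recurrence; the names rnn_step, W_x, W_h and the sizes are illustrative assumptions, not part of the TensorFlow example that follows.

import numpy as np

# One recurrent step: the new hidden state depends on the current input x_t
# and on the hidden state h_prev carried over from the previous step.
def rnn_step(x_t, h_prev, W_x, W_h, b):
    return np.tanh(x_t @ W_x + h_prev @ W_h + b)

# Process a sequence of 5 inputs (3 features each) with a hidden size of 4.
rng = np.random.default_rng(0)
W_x, W_h, b = rng.normal(size=(3, 4)), rng.normal(size=(4, 4)), np.zeros(4)
h = np.zeros(4)
for x_t in rng.normal(size=(5, 3)):
    h = rnn_step(x_t, h, W_x, W_h, b)   # every step reuses the same weights
print(h.shape)  # (4,)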


The following steps are used to train a recurrent neural network (a minimal sketch of this loop follows the list):

Step 1 - Input a specific example from the dataset.

Step 2 - The network takes the example and performs some computations using randomly initialized variables (weights).

Step 3 - A predicted result is computed.

Step 4 - Comparing the predicted result with the expected value produces an error.

Step 5 - The error is propagated back through the same path, and the variables are adjusted according to the error.

Step 6 - Steps 1 to 5 are repeated until we are confident that the variables used to produce the output are appropriately trained.

Step 7 - Finally, a prediction is made by applying these trained variables to new, unseen input.
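Taken together, steps 1 to 7 are the familiar training loop: forward pass, error, backward pass, update, repeat, then predict. The self-contained sketch below walks through the same loop on a toy linear model with hand-computed gradients; the data, model, and learning rate here are illustrative assumptions, not the RNN example implemented later.

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))            # a small synthetic dataset
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w                           # expected values

w = rng.normal(size=3)                   # Step 2: randomly initialized variables
for step in range(2000):                 # Step 6: repeat steps 1 to 5
    i = rng.integers(len(X))             # Step 1: input a specific example
    pred = X[i] @ w                      # Steps 2-3: compute a predicted result
    error = pred - y[i]                  # Step 4: compare with the expected value
    grad = error * X[i]                  # Step 5: propagate the error back ...
    w -= 0.01 * grad                     #         ... and adjust the variables

print(w)                                 # close to true_w after training
new_x = rng.normal(size=3)
print(new_x @ w)                         # Step 7: predict for new, unseen input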

The schematic representation of a recurrent neural network is shown below.

[Figure: Training of RNN in TensorFlow - schematic representation of a recurrent neural network]

Recurrent Neural Network Implementation with TensorFlow

Complete code
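A sketch of such an implementation is given below. It follows the standard TensorFlow 1.x recipe for classifying MNIST digits with a single LSTM layer, treating each 28x28 image as a sequence of 28 rows of 28 pixels. The specific hyperparameters (128 hidden units, a learning rate of 0.001, batch size 128, and 10,000 training steps) are assumptions chosen to be consistent with the output log shown afterwards.

import tensorflow as tf
from tensorflow.contrib import rnn
from tensorflow.examples.tutorials.mnist import input_data

mnist = input_data.read_data_sets("/tmp/data/", one_hot=True)

# Training parameters (assumed values)
learning_rate = 0.001
training_steps = 10000
batch_size = 128
display_step = 200

# Network parameters: each 28x28 image is read as a sequence of 28 rows of 28 pixels
num_input = 28
timesteps = 28
num_hidden = 128
num_classes = 10

# Graph inputs
X = tf.placeholder("float", [None, timesteps, num_input])
Y = tf.placeholder("float", [None, num_classes])

# Randomly initialized variables for the output layer
weights = {'out': tf.Variable(tf.random_normal([num_hidden, num_classes]))}
biases = {'out': tf.Variable(tf.random_normal([num_classes]))}

def RNN(x, weights, biases):
    # Unstack the input into a list of `timesteps` tensors of shape (batch_size, num_input)
    x = tf.unstack(x, timesteps, 1)
    # A single LSTM cell processes the sequence of rows
    lstm_cell = rnn.BasicLSTMCell(num_hidden, forget_bias=1.0)
    outputs, states = rnn.static_rnn(lstm_cell, x, dtype=tf.float32)
    # Linear activation on the last output of the sequence
    return tf.matmul(outputs[-1], weights['out']) + biases['out']

logits = RNN(X, weights, biases)
prediction = tf.nn.softmax(logits)

# Loss and optimizer
loss_op = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(
    logits=logits, labels=Y))
optimizer = tf.train.GradientDescentOptimizer(learning_rate=learning_rate)
train_op = optimizer.minimize(loss_op)

# Evaluation
correct_pred = tf.equal(tf.argmax(prediction, 1), tf.argmax(Y, 1))
accuracy = tf.reduce_mean(tf.cast(correct_pred, tf.float32))

init = tf.global_variables_initializer()

with tf.Session() as sess:
    sess.run(init)
    for step in range(1, training_steps + 1):
        batch_x, batch_y = mnist.train.next_batch(batch_size)
        # Reshape each batch of flat 784-pixel images to (batch_size, timesteps, num_input)
        batch_x = batch_x.reshape((batch_size, timesteps, num_input))
        sess.run(train_op, feed_dict={X: batch_x, Y: batch_y})
        if step % display_step == 0 or step == 1:
            loss, acc = sess.run([loss_op, accuracy],
                                 feed_dict={X: batch_x, Y: batch_y})
            print("Step " + str(step) + ", Minibatch Loss= " +
                  "{:.4f}".format(loss) + ", Training Accuracy= " +
                  "{:.3f}".format(acc))
    print("Optimization Finished!")

    # Evaluate on 128 MNIST test images
    test_len = 128
    test_data = mnist.test.images[:test_len].reshape((-1, timesteps, num_input))
    test_label = mnist.test.labels[:test_len]
    print("Testing Accuracy:",
          sess.run(accuracy, feed_dict={X: test_data, Y: test_label}))

Note that tf.nn.softmax_cross_entropy_with_logits is deprecated in later TensorFlow 1.x releases in favor of tf.nn.softmax_cross_entropy_with_logits_v2, which is what produces the "Instructions for updating" warning at the top of the output.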

Output:

Instructions for updating:
Future major versions of TensorFlow will allow gradients to flow
into the label's input on backprop by default.
See `tf.nn.softmax_cross_entropy_with_logits_v2`.

Step 1, Minibatch Loss= 2.6592, Training Accuracy= 0.148
Step 200, Minibatch Loss= 2.1379, Training Accuracy= 0.250
Step 400, Minibatch Loss= 1.8860, Training Accuracy= 0.445
Step 600, Minibatch Loss= 1.8542, Training Accuracy= 0.367
Step 800, Minibatch Loss= 1.7489, Training Accuracy= 0.477
Step 1000, Minibatch Loss= 1.6399, Training Accuracy= 0.492
Step 1200, Minibatch Loss= 1.4379, Training Accuracy= 0.570
Step 1400, Minibatch Loss= 1.4319, Training Accuracy= 0.500
Step 1600, Minibatch Loss= 1.3899, Training Accuracy= 0.547
Step 1800, Minibatch Loss= 1.3563, Training Accuracy= 0.570
Step 2000, Minibatch Loss= 1.2134, Training Accuracy= 0.617
Step 2200, Minibatch Loss= 1.2582, Training Accuracy= 0.609
Step 2400, Minibatch Loss= 1.2412, Training Accuracy= 0.578
Step 2600, Minibatch Loss= 1.1655, Training Accuracy= 0.625
Step 2800, Minibatch Loss= 1.0927, Training Accuracy= 0.656
Step 3000, Minibatch Loss= 1.2648, Training Accuracy= 0.617
Step 3200, Minibatch Loss= 0.9734, Training Accuracy= 0.695
Step 3400, Minibatch Loss= 0.8705, Training Accuracy= 0.773
Step 3600, Minibatch Loss= 1.0188, Training Accuracy= 0.680
Step 3800, Minibatch Loss= 0.8047, Training Accuracy= 0.719
Step 4000, Minibatch Loss= 0.8417, Training Accuracy= 0.758
Step 4200, Minibatch Loss= 0.8516, Training Accuracy= 0.703
Step 4400, Minibatch Loss= 0.8496, Training Accuracy= 0.773
Step 4600, Minibatch Loss= 0.9925, Training Accuracy= 0.719
Step 4800, Minibatch Loss= 0.6316, Training Accuracy= 0.812
Step 5000, Minibatch Loss= 0.7585, Training Accuracy= 0.750
Step 5200, Minibatch Loss= 0.6965, Training Accuracy= 0.797
Step 5400, Minibatch Loss= 0.7134, Training Accuracy= 0.836
Step 5600, Minibatch Loss= 0.6509, Training Accuracy= 0.812
Step 5800, Minibatch Loss= 0.7797, Training Accuracy= 0.750
Step 6000, Minibatch Loss= 0.6225, Training Accuracy= 0.859
Step 6200, Minibatch Loss= 0.6776, Training Accuracy= 0.781
Step 6400, Minibatch Loss= 0.6090, Training Accuracy= 0.781
Step 6600, Minibatch Loss= 0.5446, Training Accuracy= 0.836
Step 6800, Minibatch Loss= 0.6514, Training Accuracy= 0.750
Step 7000, Minibatch Loss= 0.7421, Training Accuracy= 0.758
Step 7200, Minibatch Loss= 0.5114, Training Accuracy= 0.844
Step 7400, Minibatch Loss= 0.5999, Training Accuracy= 0.844
Step 7600, Minibatch Loss= 0.5764, Training Accuracy= 0.789
Step 7800, Minibatch Loss= 0.6225, Training Accuracy= 0.805
Step 8000, Minibatch Loss= 0.4691, Training Accuracy= 0.875
Step 8200, Minibatch Loss= 0.4859, Training Accuracy= 0.852
Step 8400, Minibatch Loss= 0.5820, Training Accuracy= 0.828
Step 8600, Minibatch Loss= 0.4873, Training Accuracy= 0.883
Step 8800, Minibatch Loss= 0.5194, Training Accuracy= 0.828
Step 9000, Minibatch Loss= 0.6888, Training Accuracy= 0.820
Step 9200, Minibatch Loss= 0.6094, Training Accuracy= 0.812
Step 9400, Minibatch Loss= 0.5852, Training Accuracy= 0.852
Step 9600, Minibatch Loss= 0.4656, Training Accuracy= 0.844
Step 9800, Minibatch Loss= 0.4595, Training Accuracy= 0.875
Step 10000, Minibatch Loss= 0.4404, Training Accuracy= 0.883
Optimization Finished!
Testing Accuracy: 0.890625

