Regularization of Neural Networks

Various Methods in Regularization


Regularization refers to controlling the capacity of a neural network to prevent it from overfitting. For better training of an RNN, we set aside part of the training dataset as a validation dataset. The validation set is used to monitor the training process and to catch underfitting or overfitting. Overfitting shows up as a widening gap between the training loss and the validation loss.
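
A minimal sketch of this monitoring setup, assuming the Keras API and a toy LSTM with made-up data shapes:

```python
# Minimal sketch, assuming Keras: hold out part of the training data as a
# validation set and watch both losses to detect overfitting.
import numpy as np
from tensorflow import keras

# Toy sequence data (hypothetical shapes: 1000 samples, 20 timesteps, 8 features).
X = np.random.rand(1000, 20, 8).astype("float32")
y = np.random.rand(1000, 1).astype("float32")

model = keras.Sequential([
    keras.layers.LSTM(32, input_shape=(20, 8)),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# validation_split=0.2 reserves 20% of the training data as a validation set;
# a widening gap between loss and val_loss signals overfitting.
history = model.fit(X, y, epochs=10, batch_size=32, validation_split=0.2)
```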

  • The L1 and L2 regularization methods add a regularization term to the loss function to penalize certain parameter configurations and prevent the coefficients from fitting the training data too perfectly, which leads to overfitting (see the first sketch after this list).

  • Dropout - In general, dropout randomly omits a fraction of the connections between two layers of the network during training. Based on the dropout ratio, we randomly choose a set of neurons to deactivate; this helps regularize the network and is analogous to feature selection in the machine learning paradigm (see the second sketch after this list).

  • A dropout variant specifically tailored to RNNs is called RNNDrop.

  • For additional regularization techniques, please refer here.
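
To make the loss-penalty idea concrete, here is a minimal sketch, assuming the Keras API and hypothetical layer sizes, that adds an L2 penalty on an LSTM layer's input weights:

```python
# Minimal sketch, assuming Keras: an L2 penalty on the LSTM's input
# weights is added to the training loss.
from tensorflow import keras
from tensorflow.keras import layers, regularizers

model = keras.Sequential([
    # regularizers.l2(1e-4) adds 1e-4 * sum(w**2) to the loss;
    # regularizers.l1(...) would give an L1 (absolute-value) penalty instead.
    layers.LSTM(32, input_shape=(20, 8),
                kernel_regularizer=regularizers.l2(1e-4)),
    layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
```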
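
And a sketch of dropout in the same setting, again assuming the Keras API; `Dropout(0.2)` deactivates a random 20% of the incoming units on each training step:

```python
# Minimal sketch, assuming Keras: dropout between layers and on the
# recurrent connections of an LSTM.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    # dropout applies to the layer's inputs; recurrent_dropout applies to
    # the hidden-to-hidden connections (Keras's own RNN dropout variant,
    # related to but distinct from RNNDrop).
    layers.LSTM(32, input_shape=(20, 8), dropout=0.2, recurrent_dropout=0.2),
    layers.Dropout(0.2),  # deactivates 20% of units during training only
    layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
```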