Architecture of LSTM

LSTMs are a special kind of RNN, designed explicitly to overcome the long-term dependency problem. The repeating module of an LSTM has a more complex structure than that of a standard RNN: it contains four interacting components (a minimal code sketch of one time step follows the list below).

  • Memory cell

  • Forget gate

  • Input gate

  • Output gate
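
To make these components concrete, here is a minimal NumPy sketch of a single LSTM time step. The function name `lstm_step` and the weight/bias layout (a dictionary with one entry per component) are illustrative assumptions, not part of the original text:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, b):
    # x_t: input at time t, h_prev: previous hidden state,
    # c_prev: previous cell state (the memory).
    # W and b hold one weight matrix / bias vector per component:
    # "f" (forget gate), "i" (input gate), "g" (candidate values), "o" (output gate).
    z = np.concatenate([h_prev, x_t])      # previous hidden state and current input, stacked

    f_t = sigmoid(W["f"] @ z + b["f"])     # forget gate: how much of the old memory to keep
    i_t = sigmoid(W["i"] @ z + b["i"])     # input gate: how much new information to store
    g_t = np.tanh(W["g"] @ z + b["g"])     # candidate values to add to the memory
    o_t = sigmoid(W["o"] @ z + b["o"])     # output gate: how much of the memory to expose

    c_t = f_t * c_prev + i_t * g_t         # updated cell state (memory)
    h_t = o_t * np.tanh(c_t)               # new hidden state

    return h_t, c_t
```

The forget and input gates together decide how the memory is updated, while the output gate decides how much of that memory is revealed as the hidden state at each step.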

Figure: the LSTM repeating module, showing the cell state and the forget, input, and output gates.
  • Memory/cell state - Represented as the horizontal line running along the top of the diagram. It carries information through the whole chain, with only a few linear interactions such as pointwise multiplication and addition.

  • Gates - These optionally let information through. Each gate is a sigmoid neural network layer followed by a pointwise multiplication; the sigmoid outputs values between 0 and 1, describing how much of each component to let through, where 0 means discard the information entirely and 1 means keep all of it.
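
In practice, deep learning frameworks bundle the memory cell and all three gates into a single layer. A minimal Keras sketch (assuming TensorFlow 2.x; the 30-step window, 64 units, and single output are illustrative values, not taken from the text):

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

# Each of the 64 LSTM units maintains its own cell state and gates;
# Keras creates and trains the gate weights automatically.
model = Sequential([
    LSTM(64, input_shape=(30, 1)),  # 30 time steps, 1 feature per step
    Dense(1)                        # e.g. predict the next value in the sequence
])
model.compile(optimizer="adam", loss="mse")
model.summary()
```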

Now let's familiarize ourselves with the notation used in the diagrams.

Figure: notation used in the LSTM diagrams.