Bidirectional RNN


In simple terms, a bidirectional RNN (BRNN) combines two RNNs: it connects two hidden layers running in opposite directions to the same output.

  • A bidirectional RNN considers all available input, both past and future, when estimating the output vector.

  • One RNN processes the sequence from start to end in the forward time direction, while the other processes it from end to start in the negative time direction.

  • Outputs of the forward states are not connected to inputs of the backward states, and vice versa (see the sketch below).
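
To make this wiring concrete, here is a minimal sketch in plain NumPy. It is not the notebook's implementation: a simple tanh cell stands in for the LSTM, and all weight shapes are hypothetical.

```python
import numpy as np

def rnn_pass(x_seq, W_x, W_h, b):
    # Simple tanh RNN: return the hidden state at every time step.
    h = np.zeros(W_h.shape[0])
    states = []
    for x in x_seq:
        h = np.tanh(W_x @ x + W_h @ h + b)
        states.append(h)
    return states

def bidirectional_pass(x_seq, fwd_params, bwd_params):
    # Forward RNN reads the sequence from start to end.
    h_fwd = rnn_pass(x_seq, *fwd_params)
    # Backward RNN reads the same sequence from end to start; its states
    # are then re-reversed so that index t lines up with time step t.
    h_bwd = rnn_pass(x_seq[::-1], *bwd_params)[::-1]
    # The two directions never feed into each other; they are only
    # joined at the output, one concatenated vector per time step.
    return [np.concatenate([f, b]) for f, b in zip(h_fwd, h_bwd)]

# Example: 5 time steps of 3-dim input, hidden size 4 per direction.
rng = np.random.default_rng(0)
make_params = lambda: (rng.normal(size=(4, 3)), rng.normal(size=(4, 4)), np.zeros(4))
outputs = bidirectional_pass(list(rng.normal(size=(5, 3))), make_params(), make_params())
print(len(outputs), outputs[0].shape)  # -> 5 (8,)
```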

BRNNs are especially useful when the context of the input is needed. For example, in handwriting recognition, performance improves if the network knows the letters both before and after the current letter.

Architecture:

(Figure: architecture of a bidirectional RNN, with forward and backward hidden layers feeding a shared output at each time step)

An additional hidden layer is added that passes information backwards through time. The input sequence is fed in normal order to one network and in reverse order to the other; at every time step, the outputs of the two networks are concatenated.
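
In Keras, this architecture is available through the `Bidirectional` layer wrapper. The sketch below shows how it might be set up for the stock-trend model covered in the next section; the window length, feature count, and layer sizes are illustrative assumptions, not values from the original notebook.

```python
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Bidirectional, LSTM, Dense

TIMESTEPS, FEATURES = 60, 1  # assumption: 60-day window of one price series

model = Sequential([
    Input(shape=(TIMESTEPS, FEATURES)),
    # Keras runs one LSTM over the window in forward order and a second
    # LSTM over a reversed copy, then concatenates their outputs
    # (merge_mode="concat" is the default).
    Bidirectional(LSTM(50)),
    Dense(1),  # next-step price (or trend) estimate
])
model.compile(optimizer="adam", loss="mse")
model.summary()
```

Note that because the backward pass needs the whole window before it can start, a BRNN forecasts from complete historical windows; it does not peek at future prices beyond the window it is given.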