Bidirectional Long Short-Term Memory (BI-LSTM)
- Bidirectional Long Short-Term Memory (BI-LSTM) with Attention Mechanism
- Recurrent Neural Network (RNN)
The purpose of the Bi-LSTM is to read a particular sequence both from front to back and from back to front. In this way, the network creates a context for each character in the text that depends on both its past and its future.
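As an illustration, here is a minimal sketch of a character-level Bi-LSTM, assuming PyTorch; the class name, parameter values, and toy data are hypothetical, not taken from a specific implementation. The output at each position concatenates a forward (past-aware) and a backward (future-aware) hidden state.

```python
import torch
import torch.nn as nn

class CharBiLSTM(nn.Module):
    """Builds a context vector for each character from both its
    left (past) and right (future) neighbours."""

    def __init__(self, vocab_size, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # bidirectional=True runs one LSTM front-to-back and one back-to-front
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)

    def forward(self, char_ids):
        # char_ids: (batch, seq_len) integer-encoded characters
        x = self.embed(char_ids)          # (batch, seq_len, embed_dim)
        outputs, _ = self.bilstm(x)       # (batch, seq_len, 2 * hidden_dim)
        # each position concatenates the forward and backward hidden states,
        # i.e. a context that depends on both past and future characters
        return outputs

# usage sketch with a toy batch of integer-encoded characters
model = CharBiLSTM(vocab_size=100)
batch = torch.randint(0, 100, (2, 20))    # 2 sequences of 20 characters
context = model(batch)
print(context.shape)                      # torch.Size([2, 20, 128])
```

The per-position context vectors produced this way are what downstream layers (for example, a classifier or an attention mechanism, as in the linked Bi-LSTM with Attention page) typically consume.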