Bidirectional Long Short-Term Memory (BI-LSTM)

 
[http://www.google.com/search?q=Bidirectional+LSTM+machine+learning+ML+artificial+intelligence ...Google search]
 
* [[Recurrent Neural Network (RNN)]] Variants:
** [[Long Short-Term Memory (LSTM)]]
** [[Gated Recurrent Unit (GRU)]]
** Bidirectional Long Short-Term Memory (BI-LSTM)
** [[Bidirectional Long Short-Term Memory (BI-LSTM) with Attention Mechanism]]
** [[Average-Stochastic Gradient Descent (SGD) Weight-Dropped LSTM (AWD-LSTM)]]
  
 
The purpose of the Bi-LSTM is to read a sequence both front-to-back and back-to-front. In this way, the network creates a context for each character in the text that depends on both its past and its future.
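The idea can be sketched in a few lines of NumPy: run an LSTM over the sequence once in the forward direction and once over the reversed sequence, then concatenate the two hidden states at each time step. This is a minimal illustrative sketch, not a trained model; the dimensions, random weights, and helper names (`lstm_step`, `bi_lstm`) are assumptions chosen for clarity.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step: input, forget, and output gates plus candidate cell."""
    H = h.size
    z = W @ x + U @ h + b          # all four gate pre-activations stacked
    i = sigmoid(z[0:H])            # input gate
    f = sigmoid(z[H:2*H])          # forget gate
    o = sigmoid(z[2*H:3*H])        # output gate
    g = np.tanh(z[3*H:4*H])        # candidate cell state
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

def run_lstm(xs, W, U, b, H):
    """Run one direction over the whole sequence, collecting hidden states."""
    h, c = np.zeros(H), np.zeros(H)
    hs = []
    for x in xs:
        h, c = lstm_step(x, h, c, W, U, b)
        hs.append(h)
    return hs

def bi_lstm(xs, params_fwd, params_bwd, H):
    """Forward pass over the sequence, a second pass over the reversed
    sequence, then concatenation of the two hidden states per time step."""
    hs_f = run_lstm(xs, *params_fwd, H)
    hs_b = run_lstm(xs[::-1], *params_bwd, H)[::-1]  # re-align to forward order
    return [np.concatenate([hf, hb]) for hf, hb in zip(hs_f, hs_b)]

rng = np.random.default_rng(0)
D, H, T = 3, 4, 5                   # input dim, hidden dim, sequence length
make_params = lambda: (rng.standard_normal((4 * H, D)) * 0.1,
                       rng.standard_normal((4 * H, H)) * 0.1,
                       np.zeros(4 * H))
xs = [rng.standard_normal(D) for _ in range(T)]
outs = bi_lstm(xs, make_params(), make_params(), H)
print(len(outs), outs[0].shape)     # T vectors, each of size 2*H
```

Each output vector carries information from both directions, which is why the representation of a character at position t can depend on characters that come after it as well as before it.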
 

Revision as of 12:01, 11 June 2020

