Bidirectional Long Short-Term Memory (BI-LSTM)

 
** [[Bidirectional Long Short-Term Memory (BI-LSTM) with Attention Mechanism]]

** [[Average-Stochastic Gradient Descent (SGD) Weight-Dropped LSTM (AWD-LSTM)]]

** [[Hopfield Network (HN)]]
  
 
The purpose of the Bi-LSTM is to look at a particular sequence both from front-to-back and from back-to-front. In this way, the network creates a context for each character in the text that depends on both its past and its future.
 
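As a minimal sketch of this idea (using PyTorch, which is an assumption, as the page names no framework): passing <code>bidirectional=True</code> to <code>nn.LSTM</code> runs one LSTM front-to-back and a second back-to-front, and concatenates their hidden states at every position, so each character's representation carries both past and future context.

<pre>
import torch
import torch.nn as nn

# Sketch of a character-level Bi-LSTM encoder (PyTorch assumed, not named
# by this page). bidirectional=True runs one LSTM front-to-back and a
# second back-to-front; their hidden states are concatenated per step.
class BiLSTMEncoder(nn.Module):
    def __init__(self, vocab_size=128, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)

    def forward(self, char_ids):      # char_ids: (batch, seq_len)
        x = self.embed(char_ids)      # (batch, seq_len, embed_dim)
        out, _ = self.bilstm(x)       # (batch, seq_len, 2 * hidden_dim)
        return out                    # past + future context per character

# Example: a batch of two sequences of 10 character ids each.
ids = torch.randint(0, 128, (2, 10))
context = BiLSTMEncoder()(ids)
print(context.shape)                  # torch.Size([2, 10, 128])
</pre>

The doubled last dimension (2 × hidden_dim) is the forward and backward contexts concatenated; downstream layers, such as a per-character classifier or the attention mechanism linked above, consume this joint representation.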