Bidirectional Long Short-Term Memory (BI-LSTM)
The purpose of the Bi-LSTM is to look at a particular sequence both from front to back and from back to front. In this way, the network creates a context for each character in the text that depends on both its past and its future.
<youtube>uRFegQXnY54</youtube>
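The sketch below is not part of the original article; it is a minimal illustration of the idea using PyTorch's <code>nn.LSTM</code> with <code>bidirectional=True</code>, which runs one pass front-to-back and one back-to-front and concatenates the two hidden states, so that every character position receives a context vector reflecting both its past and its future. The class and parameter names are illustrative.

<syntaxhighlight lang="python">
# Minimal character-level Bi-LSTM sketch (assumes PyTorch; names are illustrative).
import torch
import torch.nn as nn

class CharBiLSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # bidirectional=True runs one LSTM front-to-back and one back-to-front
        # and concatenates their hidden states at every time step.
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)

    def forward(self, char_ids):
        # char_ids: (batch, seq_len) integer character indices
        x = self.embed(char_ids)
        out, _ = self.lstm(x)
        # out: (batch, seq_len, 2 * hidden_dim); the first half of the last
        # dimension comes from the forward pass (past context), the second
        # half from the backward pass (future context).
        return out

# Example: encode a batch of two 10-character sequences.
model = CharBiLSTM(vocab_size=128)
ids = torch.randint(0, 128, (2, 10))
contexts = model(ids)
print(contexts.shape)  # torch.Size([2, 10, 128])
</syntaxhighlight>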