Bidirectional Long Short-Term Memory (BI-LSTM)

{{#seo:
|title=PRIMO.ai
|titlemode=append
|keywords=artificial, intelligence, machine, learning, models, algorithms, data, singularity, moonshot, Tensorflow, Google, Nvidia, Microsoft, Azure, Amazon, AWS
|description=Helpful resources for your journey with artificial intelligence; videos, articles, techniques, courses, profiles, and tools
}}
 
[http://www.youtube.com/results?search_query=Bidirectional+LSTM YouTube search...]
 
[http://www.google.com/search?q=Bidirectional+LSTM+machine+learning+ML+artificial+intelligence ...Google search]
* [[Bidirectional Long Short-Term Memory (BI-LSTM) with Attention Mechanism]]
  
 
The purpose of the Bi-LSTM is to look at a particular sequence both from front-to-back and from back-to-front. In this way, the network creates a context for each character in the text that depends on both its past and its future.
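As a minimal sketch, this bidirectional reading can be expressed in Keras by wrapping an LSTM layer in the Bidirectional wrapper, which runs one copy of the layer forward over the sequence and a second copy backward, then concatenates the two outputs. The vocabulary size, sequence length, and layer widths below are illustrative assumptions, not values from this page.

<pre>
# A minimal character-level Bi-LSTM sketch in Keras (TensorFlow backend).
# vocab_size, seq_len, and layer widths are illustrative assumptions.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Embedding, Bidirectional, LSTM, Dense

vocab_size = 64   # assumed number of distinct characters
seq_len = 100     # assumed length of each input sequence

model = Sequential([
    Input(shape=(seq_len,)),
    Embedding(vocab_size, 32),
    # Bidirectional runs one LSTM front-to-back and a second LSTM
    # back-to-front, concatenating both outputs, so each position's
    # representation reflects both its past and its future context.
    Bidirectional(LSTM(64, return_sequences=True)),
    Dense(vocab_size, activation="softmax"),  # per-character prediction
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
</pre>

With return_sequences=True the wrapper emits one vector per position, which suits per-character tasks such as tagging; setting it to False would instead yield a single summary vector for the whole sequence.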
 