Recurrent Neural Network (RNN)
<youtube>cdLUzrjnlr4</youtube>
Revision as of 22:23, 10 May 2018
YouTube Search: http://www.youtube.com/results?search_query=LSTM+recurrent+Neural+Network

- Guide: https://deeplearning4j.org/lstm.html
- Sequence to Sequence (Seq2Seq)
- Attention Models
- Gradient Descent Optimization & Challenges
- Natural Language Inference (NLI) and Recognizing Textual Entailment (RTE)
- (Speech to) Text to Process to Text (to Speech) - Chatbot, Virtual Assistance
Recurrent nets are a type of artificial neural network designed to recognize patterns in sequences of data, such as text, genomes, handwriting, the spoken word, or numerical time series data emanating from sensors, stock markets and government agencies. They are arguably the most powerful and useful type of neural network, applicable even to images, which can be decomposed into a series of patches and treated as a sequence. Since recurrent networks possess a certain type of memory, and memory is also part of the human condition, we’ll make repeated analogies to memory in the brain.
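The "memory" described above can be sketched as a hidden state vector that is fed back into the network at every time step, so each output depends on the entire history of inputs seen so far. Below is a minimal, illustrative NumPy sketch of a single vanilla recurrent layer; the function name, weight shapes, and toy dimensions are our own choices for the example, not code from any particular library.

```python
import numpy as np

def rnn_forward(inputs, W_xh, W_hh, b_h):
    """Run a sequence through one vanilla recurrent layer.

    inputs: list of input vectors, one per time step.
    Returns the hidden state after each step.
    """
    h = np.zeros(W_hh.shape[0])  # the "memory" starts empty
    states = []
    for x in inputs:             # the same weights are reused at every step
        # new memory = squashed mix of the current input and the old memory
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
        states.append(h)
    return states

# Toy dimensions: 3-dimensional inputs, 4-dimensional hidden state.
rng = np.random.default_rng(0)
W_xh = rng.normal(size=(4, 3)) * 0.1   # input-to-hidden weights
W_hh = rng.normal(size=(4, 4)) * 0.1   # hidden-to-hidden (recurrent) weights
b_h = np.zeros(4)

sequence = [rng.normal(size=3) for _ in range(5)]
states = rnn_forward(sequence, W_xh, W_hh, b_h)
print(len(states), states[-1].shape)   # 5 (4,)
```

Because `h` appears on both sides of the update, the final state mixes information from every earlier input — this recurrence is what lets the network treat text, audio, or image patches as an ordered sequence rather than an unordered bag of features.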