Deep Learning
YouTube ... Quora ... Google search ... Google News ... Bing News
- Artificial Intelligence (AI) ... Generative AI ... Machine Learning (ML) ... Deep Learning ... Neural Network ... Reinforcement ... Learning Techniques
- Conversational AI ... ChatGPT | OpenAI ... Bing/Copilot | Microsoft ... Gemini | Google ... Claude | Anthropic ... Perplexity ... You ... phind ... Ernie | Baidu
- Other Challenges in Artificial Intelligence
- Deep Neural Network (DNN)
- Hierarchical Temporal Memory (HTM)
- Deep Features
- Backpropagation ... FFNN ... Forward-Forward ... Activation Functions ... Softmax ... Loss ... Boosting ... Gradient Descent ... Hyperparameter ... Manifold Hypothesis ... PCA
- The Anatomy of Deep Learning Frameworks | Gokula Krishnan Santhanam
- Data for Deep Learning | Chris Nicholson - A.I. Wiki pathmind
- Neuroscience News - Deep Learning
Deep learning models are loosely inspired by the information-processing and communication patterns of biological nervous systems, yet they differ from the structural and functional properties of biological brains in many ways, making them inconsistent with neuroscientific evidence. "Deep Learning is an algorithm which has no theoretical limitations of what it can learn; the more data you give and the more computational time you provide, the better it is" Learning Multiple Layers of Representation | Geoffrey Hinton
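To make the pieces linked above concrete (feed-forward layers, activation functions, softmax, loss, backpropagation, gradient descent), here is a minimal sketch of a small deep network, assuming only NumPy. The layer sizes, learning rate, and XOR toy data are illustrative choices, not anything prescribed by this page.

```python
# A minimal sketch, assuming only NumPy: a feed-forward network
# (one hidden ReLU layer, softmax output) trained by backpropagation
# with plain gradient descent on a toy XOR dataset. All sizes and
# hyperparameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR, a classic problem a single linear layer cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0])          # class labels
Y = np.eye(2)[y]                    # one-hot targets

# Parameters for a 2 -> 8 -> 2 network (small, arbitrary sizes).
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 2)); b2 = np.zeros(2)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

lr = 0.5
for step in range(2000):
    # Forward pass: affine -> ReLU -> affine -> softmax.
    h_pre = X @ W1 + b1
    h = np.maximum(h_pre, 0.0)             # ReLU activation
    p = softmax(h @ W2 + b2)               # class probabilities

    # Cross-entropy loss, averaged over the four examples.
    loss = -np.mean(np.sum(Y * np.log(p + 1e-12), axis=1))

    # Backward pass: chain rule applied layer by layer (backpropagation).
    d_logits = (p - Y) / len(X)            # gradient of loss w.r.t. logits
    dW2 = h.T @ d_logits; db2 = d_logits.sum(axis=0)
    d_h = d_logits @ W2.T
    d_h_pre = d_h * (h_pre > 0)            # ReLU gradient mask
    dW1 = X.T @ d_h_pre; db1 = d_h_pre.sum(axis=0)

    # Gradient-descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final loss:", round(float(loss), 4))
print("predictions:", p.argmax(axis=1), "targets:", y)
```

Nothing in this sketch caps what the network can represent; in the spirit of the Hinton quote above, capacity grows with more layers, more data, and more training time.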