Bidirectional Encoder Representations from Transformers (BERT)
 
|description=Helpful resources for your journey with artificial intelligence; videos, articles, techniques, courses, profiles, and tools
}}
[https://www.youtube.com/results?search_query=BERT+Transformer+nlp+language Youtube search...]
[https://www.google.com/search?q=BERT+Transformer+nlp+language ...Google search]
  
 
* [[Assistants]] ... [[Hybrid Assistants]] ... [[Agents]] ... [[Negotiation]] ... [[Langchain]]
 
* [[Google]]
* [[TaBERT]]
* [https://www.theverge.com/2019/10/25/20931657/google-bert-search-context-algorithm-change-10-percent-langauge Google is improving 10 percent of searches by understanding language context - Say hello to BERT | Dieter Bohn - The Verge] ...the old [[Google]] search algorithm treated that sentence as a “[[Bag-of-Words (BoW)]]”
* [https://venturebeat.com/2019/09/26/google-ais-albert-claims-top-spot-in-multiple-nlp-performance-benchmarks/ Google AI’s ALBERT claims top spot in multiple NLP performance benchmarks | Khari Johnson - VentureBeat]
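The “Bag-of-Words (BoW)” treatment mentioned in the Verge piece can be shown in a few lines. This is a generic illustration of the idea, not Google's actual search pipeline:

```python
from collections import Counter

def bag_of_words(sentence: str) -> Counter:
    # A bag-of-words representation keeps only token counts and throws
    # away word order, so sentences with very different meanings can
    # collapse to the same representation.
    return Counter(sentence.lower().split())

# Word order is lost: these two sentences look identical to a BoW model.
print(bag_of_words("dog bites man") == bag_of_words("man bites dog"))  # → True
```

Contextual models like BERT avoid exactly this collapse by encoding each token relative to its neighbours.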
 
* RoBERTa:
** [https://arxiv.org/abs/1907.11692 RoBERTa: A Robustly Optimized BERT Pretraining Approach | Y. Liu, M. Ott, N. Goyal, J. Du, M. Joshi, D. Chen, O. Levy, M. Lewis, L. Zettlemoyer, and V. Stoyanov]
** [https://github.com/pytorch/fairseq/tree/master/examples/roberta RoBERTa: A Robustly Optimized BERT Pretraining Approach | GitHub] - iterates on BERT's pretraining procedure: training the model longer, with bigger batches over more data; removing the next-sentence-prediction objective; training on longer sequences; and dynamically changing the masking pattern applied to the training data.
** [https://venturebeat.com/2019/07/29/facebook-ais-roberta-improves-googles-bert-pretraining-methods/ [[Meta|Facebook]] AI’s RoBERTa improves Google’s BERT pretraining methods | Khari Johnson - VentureBeat]
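RoBERTa's “dynamically changing the masking pattern” can be sketched as below. This is a simplified illustration: a real implementation also replaces some selected tokens with random tokens or leaves them unchanged (the 80/10/10 rule), and `MASK_ID` plus the example token ids are placeholders, not vocabulary entries from any particular tokenizer.

```python
import random

MASK_ID = 103        # placeholder id standing in for the [MASK] token
MASK_PROB = 0.15     # fraction of positions selected for masking, as in BERT

def dynamically_mask(token_ids, rng):
    """Return a freshly masked copy of a token sequence plus its labels.

    BERT's original pipeline masked each sequence once during data
    preprocessing (static masking); RoBERTa re-samples the mask every
    time a sequence is fed to the model, so each epoch can see a
    different masking pattern for the same sentence.
    """
    masked = list(token_ids)
    labels = [-100] * len(token_ids)   # -100 marks positions ignored by the loss
    for i, tok in enumerate(token_ids):
        if rng.random() < MASK_PROB:
            labels[i] = tok            # the model must predict the original token
            masked[i] = MASK_ID
    return masked, labels

# The same sentence gets an independently sampled mask on every pass.
rng = random.Random()
ids = [7592, 1010, 2088, 999, 2003, 2023, 2242]   # illustrative token ids
epoch_1 = dynamically_mask(ids, rng)
epoch_2 = dynamically_mask(ids, rng)
```

Because the mask is re-drawn per pass rather than baked into the dataset, the model never memorises one fixed set of masked positions.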
 
* Google's BERT - built on ideas from [[ULMFiT]], [[ELMo]], and [[OpenAI]]
* [[Attention]] Mechanism/[[Transformer]] Model
 
** [https://www.technologyreview.com/2023/02/08/1068068/chatgpt-is-everywhere-heres-where-it-came-from/ ChatGPT is everywhere. Here’s where it came from | Will Douglas Heaven - MIT Technology Review]
 
* [[Transformer-XL]]
* [https://venturebeat.com/2019/05/16/microsoft-makes-googles-bert-nlp-model-better/ Microsoft makes Google’s BERT NLP model better | Khari Johnson - VentureBeat]
 
* [[Watch me Build a Finance Startup]] | [[Creatives#Siraj Raval|Siraj Raval]]
* [https://medium.com/huggingface/distilbert-8cf3380435b5 Smaller, faster, cheaper, lighter: Introducing DistilBERT, a distilled version of BERT | Victor Sanh - Medium]
* [https://arxiv.org/abs/1909.10351 TinyBERT: Distilling BERT for Natural Language Understanding | X. Jiao, Y. Yin, L. Shang, X. Jiang, X. Chen, L. Li, F. Wang, and Q. Liu] - researchers at Huawei produce a model called TinyBERT that is 7.5 times smaller and nearly 10 times faster than the original, while reaching nearly the same language-understanding performance.
* [https://towardsdatascience.com/understanding-bert-is-it-a-game-changer-in-nlp-7cca943cf3ad Understanding BERT: Is it a Game Changer in NLP? | Bharat S Raj - Towards Data Science]
* [https://allenai.org/ Allen Institute for Artificial Intelligence, or AI2’s] [https://allenai.org/aristo/ Aristo] [https://www.geekwire.com/2019/allen-institutes-aristo-ai-program-finally-passes-8th-grade-science-test/ AI system finally passes an eighth-grade science test | Alan Boyle - GeekWire]
* [https://www.topbots.com/leading-nlp-language-models-2020/ 7 Leading Language Models for NLP in 2020 | Mariya Yao - TOPBOTS]
 
* [https://www.topbots.com/bert-inner-workings/ BERT Inner Workings | George Mihaila - TOPBOTS]
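DistilBERT and TinyBERT, listed above, are both products of knowledge distillation: a small “student” network is trained to reproduce the output distribution of the large “teacher”. A minimal sketch of the softened cross-entropy term follows; both papers add further loss terms (hard-label loss, and in TinyBERT hidden-state and attention matching) that are omitted here for brevity.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature > 1 softens the distribution, exposing how the teacher
    # ranks the wrong classes - the "dark knowledge" the student learns from.
    exps = [math.exp(x / temperature) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """Cross-entropy of the student against the teacher's softened outputs.

    Minimised when the student's distribution matches the teacher's; in
    practice this is combined with the ordinary hard-label loss.
    """
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))

# A student that agrees with the teacher incurs a lower loss than one
# that inverts the teacher's ranking.
teacher = [4.0, 1.0, 0.2]
assert distillation_loss(teacher, [4.0, 1.0, 0.2]) < distillation_loss(teacher, [0.2, 1.0, 4.0])
```

Matching softened distributions rather than one-hot labels is what lets a much smaller network recover most of the teacher's accuracy.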
  
  
<img src="https://miro.medium.com/max/916/1*8416XWqbuR2SDgCY61gFHw.png" width="500" height="200">
  
  
<img src="https://miro.medium.com/max/2070/1*IFVX74cEe8U5D1GveL1uZA.png" width="800" height="500">
  
  
= BERT Research | Chris McCormick =
  
* [https://mccormickml.com/2019/11/11/bert-research-ep-1-key-concepts-and-sources/ BERT Research | Chris McCormick]
* [https://www.chrismccormick.ai/ ChrisMcCormickAI] online education
  
<img src="https://www.mccormickml.com/assets/BERT/BERT_Mountain.png" width="700" height="400">
  
  

Revision as of 02:12, 28 March 2023
