StructBERT

* [[Natural Language Processing (NLP)]]
* [[COVID-19]]
* [http://iclr.cc/virtual_2020/poster_BJgQ4lSFPH.html StructBERT: Incorporating Language Structures into Pre-training for Deep Language Understanding | W. Wang, B. Bi, M. Yan, C. Wu, J. Xia, Z. Bao, L. Peng and L. Si - Alibaba Group Inc.]
  
 

StructBERT extends BERT by incorporating language structures into pre-training. Specifically, StructBERT is pre-trained with two auxiliary tasks that make the most of the sequential order of words and sentences, leveraging language structures at the word and sentence levels, respectively. As a result, the new model is adapted to the different levels of language understanding required by downstream tasks. [http://openreview.net/pdf?id=BJgQ4lSFPH StructBERT: Incorporating Language Structures Into Pre-training For Deep Language Understanding | W. Wang, B. Bi, M. Yan, C. Wu, J. Xia, Z. Bao, L. Peng and L. Si - Alibaba Group Inc.]
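
The two auxiliary objectives can be made concrete with a small sketch. The Python below is an illustrative assumption of how the training examples might be constructed, not the authors' released code: the word structural objective shuffles a randomly chosen trigram and asks the model to reconstruct the original order, and the sentence structural objective labels a sentence pair as next, previous, or random. The function names, span size, and label encoding are hypothetical; the paper also samples random sentences from other documents, while this self-contained toy version samples from the same one.

<syntaxhighlight lang="python">
# Illustrative sketch of StructBERT's two auxiliary pre-training objectives.
# Function names, label encoding, and sampling details are assumptions for
# demonstration, not the authors' released implementation.
import random

def word_structural_example(tokens, span=3):
    """Word objective: shuffle one randomly chosen span of `span` tokens
    (the paper uses trigrams); the original order is the prediction target."""
    if len(tokens) < span:
        return tokens, tokens  # too short to corrupt; pass through unchanged
    start = random.randrange(len(tokens) - span + 1)
    shuffled = tokens[start:start + span]  # slicing copies the window
    random.shuffle(shuffled)
    corrupted = tokens[:start] + shuffled + tokens[start + span:]
    return corrupted, tokens  # (model input, reconstruction target)

def sentence_structural_example(doc_sentences, idx, rng=random):
    """Sentence objective: pair sentence S with its next sentence (label 0),
    its previous sentence (label 1), or a random sentence (label 2), sampled
    with equal probability; boundary cases fall back to the random label."""
    choice = rng.randrange(3)
    if choice == 0 and idx + 1 < len(doc_sentences):
        return doc_sentences[idx], doc_sentences[idx + 1], 0
    if choice == 1 and idx > 0:
        return doc_sentences[idx], doc_sentences[idx - 1], 1
    # The paper draws random sentences from other documents; this toy
    # version draws from the same document to stay self-contained.
    other = rng.choice([s for i, s in enumerate(doc_sentences) if i != idx])
    return doc_sentences[idx], other, 2

if __name__ == "__main__":
    corrupted, target = word_structural_example("the quick brown fox jumps".split())
    print("word task:", corrupted, "->", target)
    doc = ["Sentence one .", "Sentence two .", "Sentence three ."]
    print("sentence task:", sentence_structural_example(doc, 1))
</syntaxhighlight>

Training on pairs like these alongside masked language modeling is what exposes the model to word-level and sentence-level ordering structure during pre-training.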