Ai Dreams Forum

Title: Bidirectional Encoder Representations from Transformers
Post by: frankinstien on August 13, 2020, 07:42:16 pm
 Bidirectional Encoder Representations from Transformers (https://towardsdatascience.com/bert-explained-state-of-the-art-language-model-for-nlp-f8b21a9b6270) (BERT) has become a standard building block for training task-specific NLP models. I had never heard of it, but it is a Google creation published in 2018. Microsoft has developed a biomedical NLP solution and published a white paper (https://arxiv.org/pdf/2007.15779.pdf) about it.
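
For anyone unfamiliar, "building block" here means you take the pretrained BERT encoder and attach a small task-specific head on top, then fine-tune the whole thing. Here is a minimal sketch using the Hugging Face transformers library; the model name, the two-label setup, and the sample sentence are my own illustrative placeholders, not anything from Microsoft's paper:

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load the pretrained BERT encoder plus a freshly initialized
# classification head. "bert-base-uncased" and num_labels=2 are
# placeholder choices for illustration.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# Tokenize a sample sentence; the bidirectional encoder attends to the
# whole sentence at once rather than left-to-right.
inputs = tokenizer("Aspirin inhibits platelet aggregation.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels)
print(logits.argmax(dim=-1))  # predicted label index (head is untrained here, so arbitrary)

In practice you would fine-tune this whole stack on labeled task data; the point is that the heavy lifting comes from the pretrained encoder, and only a thin layer is task-specific.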

Reading the paper, I wondered: how would GPT-3 (or an eventual GPT-4) do if applied to the same tasks?