Bidirectional Encoder Representations from Transformers


frankinstien

« on: August 13, 2020, 07:42:16 pm »
Bidirectional Encoder Representations from Transformers (BERT) has become a standard building block for training task-specific NLP models. I had never heard of it before, but it is a Google creation, published in 2018. Microsoft has since developed a biomedical NLP solution based on it and published a white paper about it.
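
For anyone curious what "building block" means in practice, below is a minimal sketch of stacking a classification head on pretrained BERT weights, using the Hugging Face transformers library (the model name, label count, and example sentence are my own illustrative choices, not from the paper):

Code:
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load pretrained BERT weights plus a fresh, randomly initialized
# classification head; fine-tuning that head on labeled task data is
# what makes the model "task-specific".
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased",
    num_labels=2,  # e.g. relevant vs. not relevant (illustrative labels)
)

inputs = tokenizer(
    "Aspirin reduces the risk of myocardial infarction.",
    return_tensors="pt",
)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.softmax(dim=-1))  # class probabilities (head is untrained here)

As far as I can tell, the biomedical work follows the same recipe, just with weights pretrained on domain text (PubMed) instead of general web text.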

Reading the paper, I found myself wondering: how would GPT-3 and 4 do if applied to the same task?