Bidirectional Encoder Representations from Transformers


frankinstien

Bidirectional Encoder Representations from Transformers
« on: August 13, 2020, 07:42:16 pm »
 Bidirectional Encoder Representations from Transformers (BERT) has become a standard building block for training task-specific NLP models. I had never heard of it, but it's a Google creation published in 2018. Microsoft has developed a biomedical NLP solution and has published a white paper about it.
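To make the "building block" idea concrete, here's a minimal sketch of the standard BERT fine-tuning recipe: a pretrained encoder with a small classification head stacked on top. I'm using the Hugging Face transformers library purely for illustration (my assumption; the paper itself isn't tied to any framework), and the two-label task and sample sentence are hypothetical.

```python
# Minimal sketch: BERT as a building block for a task-specific classifier,
# via the Hugging Face "transformers" library (an illustrative assumption;
# any framework with a pretrained BERT checkpoint would do).
import torch
from transformers import BertTokenizer, BertForSequenceClassification

# Load the pretrained encoder plus a fresh, untrained classification head.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # hypothetical binary task
)

# Tokenize a sample sentence and run it through the model.
inputs = tokenizer(
    "Aspirin inhibits platelet aggregation.",
    return_tensors="pt", truncation=True,
)
with torch.no_grad():
    logits = model(**inputs).logits

# Before fine-tuning, these class probabilities are essentially random;
# training on labeled task data tunes both the head and BERT's weights.
print(logits.softmax(dim=-1))
```

Fine-tuning trains the whole stack end-to-end on the task's labeled data, which is why one pretrained checkpoint can serve so many different NLP tasks.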

Reading the paper, I was wondering how GPT-3 and GPT-4 would do if applied to the same tasks.

 

