How to get fixed size BERT representations for sentences

  • 3 Replies
  • 1023 Views

hainaa

  • Roomba
  • 4
How to get fixed size BERT representations for sentences
« on: May 05, 2019, 01:52:57 pm »
Hi,

I am trying to get a fixed-size BERT representation for each sentence, but I am only getting fixed-size BERT representations for individual words.

If a sentence has 7 words and the BERT representation of each word has size 768, then the size of the BERT representation for my sentence is 7 × 768, which is variable and depends on the number of words in the sentence.

Can anyone please tell me how I can get fixed-size BERT representations for sentences?

Thank you,
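Two common ways to get a single fixed-size vector per sentence from BERT are: take the final hidden state of the special [CLS] token (one 768-dim vector regardless of sentence length), or pool the per-token vectors, e.g. by averaging them across the sentence. Below is a minimal sketch of mean pooling; the random arrays stand in for real per-token BERT outputs so it runs without downloading a model (the function name `mean_pool` is just an illustrative choice, not a library API):

```python
import numpy as np

def mean_pool(token_embeddings):
    # token_embeddings: (num_tokens, 768) array of per-token BERT vectors.
    # Averaging over the token axis collapses any number of tokens
    # into one fixed 768-dim sentence vector.
    return token_embeddings.mean(axis=0)

# Sentences of different lengths yield vectors of the same fixed size.
sent_a = np.random.rand(7, 768)   # stand-in for a 7-token sentence
sent_b = np.random.rand(12, 768)  # stand-in for a 12-token sentence
print(mean_pool(sent_a).shape)  # (768,)
print(mean_pool(sent_b).shape)  # (768,)
```

With a real model (e.g. via the Hugging Face transformers library), the same pooling would be applied to the model's last hidden states; in practice you would also mask out padding tokens before averaging.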


goaty

  • Trusty Member
  • Replicant
  • 552
Re: How to get fixed size BERT representations for sentences
« Reply #1 on: May 05, 2019, 04:15:41 pm »
I've never heard of BERT before, but I had the idea that when you're reading you store the surroundings on both sides of your token, and when playing back you only use the left side, and it's better for collecting both overloads and synonyms!   That's because, when it comes to meaning, language can work by presupposition or postsupposition, and if you're only getting the history you're missing out on useful data on the other side!

overloading = one word means two different things.
synonym    = two different words mean the same thing.

They are opposites of each other, and only count as pointless difference reduction in my bible, but if it's better, why not add it.

One more thing to say: if you ever collect a synonym, keep it at the exact same index, but then on the side learn the context of how to use it; it still counts as exactly the same.  :)  Because you want to keep the learning acceleration you're going to get, don't throw that away.

I=me
are=is...  but you might as well keep the basic "are you" of English, and you can still retain the compounding you're going to get.
« Last Edit: May 05, 2019, 05:57:55 pm by goaty »


Art

  • At the end of the game, the King and Pawn go into the same box.
  • Trusty Member
  • Colossus
  • 5865
Re: How to get fixed size BERT representations for sentences
« Reply #2 on: May 06, 2019, 03:06:33 am »
Just putting this in with Goaty's remarks: there are routines that can count the words or sentences used in a paragraph or story, some by watching the punctuation (exclamation point, question mark, or period). How or why you are using this 768-value representation per word is beyond my pay grade, but there should be a formula to count the total words × 768, or to apply it to the sentences, etc.

Perhaps if you were to explain the purpose or pertinence of that number, it might shed more light for those wishing to help.
In the world of AI, it's the thought that counts!


goaty

  • Trusty Member
  • Replicant
  • 552
Re: How to get fixed size BERT representations for sentences
« Reply #3 on: May 06, 2019, 12:52:00 pm »
Yes, there's lots of word counting in it.  8)
