Will LLMs ever learn what is ... is?

  • 6 Replies
  • 43255 Views

8pla.net

  • Trusty Member
  • Eve
  • Posts: 1307
  • TV News. Pub. UAL (PhD). Robitron Mod. LPC Judge.
  • 8pla.net
Will LLMs ever learn what is ... is?
« on: October 17, 2023, 06:34:51 pm »
Large Language Model Tests




INPUT:  X is Y. What is Y?
OUTPUT: Without any context or additional information, it's impossible to determine what Y is. Please provide more details.


INPUT:  X equals Y. What does Y equal?
OUTPUT: Y equals X.


INPUT:  Substitute "is" with "equals" and answer this: X is Y. What is Y?
OUTPUT: Y equals X.
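
For anyone who wants to re-run these probes programmatically, here's a minimal sketch using the OpenAI Python SDK. The model name is just a placeholder; substitute whichever model you're testing.

Code

# Minimal sketch for re-running the "X is Y" probes against a chat model.
# Assumes the OpenAI Python SDK (pip install openai) and an API key in the
# OPENAI_API_KEY environment variable. The model name is a placeholder.
from openai import OpenAI

client = OpenAI()

PROBES = [
    'X is Y. What is Y?',
    'X equals Y. What does Y equal?',
    'Substitute "is" with "equals" and answer this: X is Y. What is Y?',
]

for probe in PROBES:
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; use the model under test
        messages=[{"role": "user", "content": probe}],
    )
    print("INPUT: ", probe)
    print("OUTPUT:", reply.choices[0].message.content)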

Citation:
Worswick, S. (n.d.). Steve Worswick's post [Post]. LinkedIn.

« Last Edit Reason: Added APA style citation.
« Last Edit: October 17, 2023, 06:59:04 pm by 8pla.net »
My Very Enormous Monster Just Stopped Using Nine


HS

  • Trusty Member
  • Millennium Man
  • Posts: 1179
Re: Will LLMs ever learn what is ... is?
« Reply #1 on: October 18, 2023, 05:36:22 am »
I'm finding myself agreeing with the LLM's first answer.

For example, what if X = "truck" and Y = "vehicle"?

Then "X is Y. What is Y?" Translates to "A truck is a vehicle. What is a vehicle?" Which is unknown since vehicle is a superset of truck.


MagnusWootton

  • Replicant
  • Posts: 650
Re: Will LLMs ever learn what is ... is?
« Reply #2 on: October 18, 2023, 06:27:40 am »
I don't think it's mandatory for decent AI to have this capability, and it's well worth avoiding if you want results now.

But I guess it's probably been thought about. GPT-3 is very good, so maybe their team would know the answer to this question. All I can say is that it's probably not the most important thing if you want AI right now.


MikeB

  • Autobot
  • Posts: 224
Re: Will LLMs ever learn what is ... is?
« Reply #3 on: October 18, 2023, 07:43:08 am »
Then "X is Y. What is Y?" Translates to "A truck is a vehicle. What is a vehicle?" Which is unknown since vehicle is a superset of truck.

If you enter a poor question, why expect the bot to list exacting details of trucks...


WriterOfMinds

  • Trusty Member
  • Replicant
  • Posts: 620
  • WriterOfMinds Blog
Re: Will LLMs ever learn what is ... is?
« Reply #4 on: October 18, 2023, 05:56:21 pm »
I assume 8PLA is trying to replicate the result found in this paper: https://arxiv.org/pdf/2309.12288v1.pdf

Title: The Reversal Curse: LLMs trained on “A is B” fail to learn “B is A”

The first example given is "For instance, if a model is trained on “Olaf Scholz was the ninth Chancellor of Germany”, it will not automatically be able to answer the question, “Who was the ninth Chancellor of Germany?”. Moreover, the likelihood of the correct answer (“Olaf Scholz”) will not be higher than for a random name."

The title of the paper is a bit poorly worded since, per HS's criticism, "A is B" does not always entail "B is A" when A and B may be categories rather than instances. If you want to pose the test question using variables, I think a better wording would be:

"X is a Y. What is an example of a Y?"


frankinstien

  • Replicant
  • Posts: 658
  • Knowledgeable Machines
Re: Will LLMs ever learn what is ... is?
« Reply #5 on: November 03, 2024, 08:11:00 pm »
Quote from WriterOfMinds: "The title of the paper is a bit poorly worded since, per HS's criticism, 'A is B' does not always entail 'B is A' when A and B may be categories rather than instances. If you want to pose the test question using variables, I think a better wording would be: 'X is a Y. What is an example of a Y?'"

Actually, the problem is that the human asks the question assuming the statement is a declarative assertion that X = Y. However, the AI perceives the statement as a problem to solve for an unknown, where both X and Y are undefined. Here's a conversation with an LLM where I teach it to realize that its assumption is not how a human reads the statement. The LLM is ChatWaifu_v1.3.1-i1-GGUF Q6:

Code
Assistant
ChatWaifu_v1.3.1-i1-GGUF
If I understood correctly, the problem is asking me what Y is when given only the statement "X = Y".

To solve this, we need more information than just that statement alone. If all we know is "X = Y", then there's no way to determine what either X or Y actually equals.

For example:

If X and Y are both 5, then the equation works.
If X is 2 and Y is 3, then the equation fails.
However, without any additional information given about what values could potentially satisfy this equation, we cannot solve for Y alone.

You
A human would answer the question by simply stating its X because there is no other information other than the statement X = Y. The reason is that the statement is usually an assertion, not a problem to solve an unknown. Get it?

Assistant
ChatWaifu_v1.3.1-i1-GGUF
Yes, I understand what you're saying. From a mathematical perspective, when we write "X = Y", we are making an assertion or statement about two variables being equal.

If someone asks us directly "What is Y?", we can simply say that Y equals whatever X happens to be, since the statement asserts their equality.

However, if this were posed as part of a larger problem where we're trying to solve for either X or Y using additional information (e.g., solving an equation with multiple variables), then it wouldn't be appropriate to just state "Y = X" without further context.

You
OK, so if A = B what is B?

Assistant
ChatWaifu_v1.3.1-i1-GGUF
If A = B, then B equals whatever value A has been assigned.

For example:

If A is 3 and B is also 3, then the equation holds true.
If A is -4 and B is also -4, then the equation holds true.
However, if we don't know what value A equals, then we can't determine what value B equals.
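
The assertion-versus-equation distinction can also be shown directly in code: under assertion semantics, "X is Y" simply binds the two names to each other, so "What is Y?" resolves to X with no solving involved. A purely illustrative toy:

Code

# Toy "assertion" semantics: "X is Y" binds the two names to each other,
# so asking about either name echoes the other. Purely illustrative; this
# is not how any particular LLM represents the statement.
bindings = {}

def assert_is(a, b):
    bindings[a] = b
    bindings[b] = a

def what_is(name):
    return bindings.get(name, "unknown without more context")

assert_is("X", "Y")
print(what_is("Y"))  # -> X  (the human's expected reading)
print(what_is("Z"))  # -> unknown without more context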


HS

  • Trusty Member
  • Millennium Man
  • Posts: 1179
Re: Will LLMs ever learn what is ... is?
« Reply #6 on: November 10, 2024, 06:28:10 pm »
The main source of error for these AIs seems to be the exclusion/unawareness of possibilities at the start. Optimally, I think, they’d cast a net wide enough to include most possible meanings, then answer from that general perspective, and narrow down the options with follow-up questions if necessary.
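
As control flow, that strategy might look like this: enumerate the plausible readings first, answer if only one survives, and otherwise ask a narrowing question. The candidate readings below are hand-written stand-ins, not model output:

Code

# Sketch of the wide-net strategy: collect plausible readings of an
# ambiguous prompt, then either answer or ask a narrowing question.
# The candidate readings are hand-written stand-ins, not model output.
def plausible_readings(prompt):
    if prompt == "X is Y. What is Y?":
        return [
            "assertion: Y is another name for X",
            "class statement: Y is a category containing X",
            "equation: solve for Y (underdetermined)",
        ]
    return ["literal reading"]

def respond(prompt):
    readings = plausible_readings(prompt)
    if len(readings) == 1:
        return "Answering under: " + readings[0]
    return ("That could mean several things (" + "; ".join(readings)
            + "). Which did you intend?")

print(respond("X is Y. What is Y?"))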

 

