Submitted by RadioFreeAmerika t3_122ilav in singularity
ArcticWinterZzZ t1_jdt0urg wrote
Reply to comment by Ok_Faithlessness4197 in Why is maths so hard for LLMs? by RadioFreeAmerika
I don't think that's impossible to add. You are right: chain-of-thought prompting circumvents this issue. I am specifically referring to "mental math" multiplication, which GPT-4 will often attempt.
liqui_date_me t1_jdt531o wrote
You would think that GPT would have learned a general-purpose procedure for multiplying numbers, but it really hasn't, and it isn't accurate even with chain-of-thought prompting.
I just asked GPT4 to solve this: 87176363 times 198364
The right answer should be 17,292,652,070,132 according to Wolfram Alpha.
According to GPT4 the answer is 17,309,868,626,012.
This is the prompt I used:
What is 87176363 times 198364? Think of the problem step by step and give me an exact answer.
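The claimed correct answer is easy to check locally; a minimal sketch (Python integers are arbitrary-precision, so the product is exact, and `gpt4_answer` below is the incorrect value quoted above):

```python
a, b = 87176363, 198364
product = a * b                     # exact, arbitrary-precision multiplication
gpt4_answer = 17309868626012        # the (wrong) answer GPT-4 gave

print(product)                      # → 17292652070132, matching Wolfram Alpha
print(product - gpt4_answer)        # how far off GPT-4's "mental math" was
```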
ArcticWinterZzZ t1_jdtlkru wrote
Even if it were to perform the addition manually, schoolbook addition produces digits in the opposite order from the one in which GPT-4 generates text: the least significant digit (and its carry) must be computed first, while the model has to emit the most significant digit first. It's unlikely to be very good at it.
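A minimal sketch of the order mismatch, assuming the standard schoolbook algorithm: each column's carry feeds the next, more significant column, so digits become known least-significant-first, yet an autoregressive model must emit them in the reverse order.

```python
def long_addition_digits(x: int, y: int) -> list[str]:
    # Schoolbook addition: walk the digits from least significant to most,
    # propagating a carry. Digits are *produced* in that right-to-left order.
    xs, ys = str(x)[::-1], str(y)[::-1]
    out, carry = [], 0
    for i in range(max(len(xs), len(ys))):
        dx = int(xs[i]) if i < len(xs) else 0
        dy = int(ys[i]) if i < len(ys) else 0
        carry, digit = divmod(dx + dy + carry, 10)
        out.append(str(digit))
    if carry:
        out.append(str(carry))
    return out  # least-significant digit first

digits = long_addition_digits(87176363, 198364)
# A left-to-right generator has to emit the MOST significant digit first,
# i.e. the reverse of the order in which the carries become known:
print("".join(reversed(digits)))  # → "87374727"
```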