
dronegoblin t1_jdr4bat wrote

LLMs don’t actually have any logical capacity; they’ve just seen large amounts of text and can predict what plausibly comes next.

There’s a lot of randomness to this, and even at a temperature of 0 (the most consistent setting) it will still sometimes phrase things differently. That’s fine for prose, though: the same question can be answered in many different ways with language.
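To make the temperature point concrete, here’s a minimal sketch (plain NumPy, not any particular model’s actual decoder): in the idealized case, temperature 0 just takes the top-scoring token every time, and anything above 0 reintroduces randomness. Real hosted models can still show some run-to-run variation even at 0 for other reasons (batching, floating point), which is the behaviour the comment is describing.

```python
import numpy as np

def sample_next_token(logits, temperature=1.0, rng=np.random.default_rng()):
    """Pick the next token id from raw model scores (logits)."""
    if temperature == 0:
        # Greedy decoding: always take the single highest-scoring token.
        return int(np.argmax(logits))
    # Scale logits by temperature, then softmax into a probability distribution.
    scaled = np.asarray(logits, dtype=float) / temperature
    probs = np.exp(scaled - scaled.max())
    probs /= probs.sum()
    # Higher temperature flattens the distribution -> more varied output.
    return int(rng.choice(len(probs), p=probs))

# Toy scores for three candidate tokens.
logits = [2.0, 1.5, 0.1]
print(sample_next_token(logits, temperature=0))    # always token 0
print(sample_next_token(logits, temperature=1.0))  # usually 0, sometimes 1 or 2
```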

Math is not like writing: there is only one correct option for what comes next. But since it can’t actually reason and only has internet text as its examples of math, it treats math like language. Not all the math on the internet is written the same way, or even correct, so the model just combines whatever it has seen in a way that looks plausible. It can’t count; it can only guess.
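A toy illustration of that last point (a made-up frequency model, nothing like a real transformer): if the “training data” contains both right and wrong sums, prediction just reflects how often each answer was seen, and no arithmetic is ever performed.

```python
from collections import Counter, defaultdict

# A toy "training set": arithmetic as it might appear in web text,
# including a wrong answer, since not everything online is correct.
corpus = ["7 + 8 = 15", "7 + 8 = 15", "7 + 8 = 16"]

# Count which token follows each prefix.
continuations = defaultdict(Counter)
for line in corpus:
    tokens = line.split()
    for i in range(len(tokens) - 1):
        prefix = tuple(tokens[: i + 1])
        continuations[prefix][tokens[i + 1]] += 1

# "Predicting" the answer is just picking the most frequent continuation
# of the prefix "7 + 8 =" -- the sum itself is never computed.
prefix = ("7", "+", "8", "=")
print(continuations[prefix].most_common())  # [('15', 2), ('16', 1)]
```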
