Andriyo t1_je8pre9 wrote
Reply to comment by StevenVincentOne in The argument that a computer can't really "understand" things is stupid and completely irrelevant. by hey__bert
One needs a degree in mathematics to really explain why 2+2=4 (and to be aware that it might not always be the case). The majority of people do exactly what LLMs are doing - just statistically infer that in the text "2+2=..." what follows should be "4".
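A minimal sketch of that kind of statistical inference, in Python and purely for illustration (the tiny corpus and the names are made up for the example, not anything from the thread):

```python
from collections import Counter, defaultdict

# Toy next-token predictor: count which token follows each context in a
# tiny corpus, then emit the most frequent continuation.
corpus = [("2+2=", "4"), ("2+2=", "4"), ("2+2=", "5"), ("1+1=", "2")]

counts = defaultdict(Counter)
for context, nxt in corpus:
    counts[context][nxt] += 1

def predict(context):
    # Pick the statistically most likely next token for this context.
    return counts[context].most_common(1)[0][0]

print(predict("2+2="))  # '4' - inferred from frequency, not from arithmetic
```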
theotherquantumjim t1_je8shkz wrote
This is not correct at all. From a young age people learn the principles of mathematics, usually through the manipulation of physical objects. They learn numerical symbols and how these connect to real-world items, e.g. if I have 1 of anything and add 1 more to it, I have 2. Adding 1 more each time increases the symbolic value by 1 increment. That is a rule of mathematics that we learn very young and can apply in many situations.
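A minimal sketch of that "add 1 more each time" rule, in Python and purely for illustration (the function names are my own, not anything from the thread):

```python
def successor(n):
    # "Adding 1 more" is a single application of the successor step.
    return n + 1

def add(a, b):
    # Addition as repeated incrementing: start at a and apply the
    # successor step b times.
    result = a
    for _ in range(b):
        result = successor(result)
    return result

print(add(1, 1))  # 2
print(add(2, 2))  # 4
```

Here addition is nothing but repeated incrementing, which is the principle the manipulation of physical objects is meant to teach.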
Andriyo t1_je8uj7t wrote
There is nothing fundamental about the rule of 1 apple + 1 apple = 2 apples. It depends entirely on our anthropomorphic definition of what "1" of anything is. If I add two piles of sand together, I'll still get one pile of sand.
Mathematics is our mental model of the real world. It can be remarkably effective in its predictions, but that is not always the case.
Kids just do what LLMs are doing. They observe that parents say one noun plus one noun equals two nouns. What addition really is (with its commutative property, identity property, closure property, etc.) people learn much later.
theotherquantumjim t1_je8ysa1 wrote
This is largely semantic trickery though. Using apples is just an easy way for children to learn the fundamental fact that 1+1=2. Your example doesn't really hold up, since a pile of sand is not really a mathematical concept. What you are actually talking about is 1 billion grains of sand + 1 billion grains of sand. Put them together and you will definitely find 2 billion grains of sand. The fundamental mathematical principles hidden behind the language hold true.
Andriyo t1_jeam834 wrote
There is nothing fundamental behind 1+1=2. It's just the language that we use to describe reality as we observe it as humans. And even beyond that, it's cultural: some tribes have "1", "2", "3", "many" math, and to them it is as "fundamental" as the integer number system is to us. The particular algebra of 1+1=2 was invented by humans (and some other species) because we are evolutionarily optimized to work with discrete objects, to detect threats and such.
I know Plato believed in the existence of numbers or "Ideas" in a realm that transcended the physical world but it's not verifiable so it's just that - a belief.
So children just learn the language of numbers and arithmetic as they do any other language - by training on examples, statistically. There might be some innate training that happened at the DNA level, so we're predisposed to learn about integers more easily, but that doesn't make "1+1=2" something that exists on its own to be discovered, like, say, gravity or fire.
theotherquantumjim t1_jearid2 wrote
That is one school of thought, certainly. There are plenty in academia who argue that maths is fundamental.
Andriyo t1_jedirnp wrote
It is certainly fundamental to our understanding of the world, but if we all forgot tomorrow that 1+1=2, and all math altogether, the world wouldn't stop existing :)
theotherquantumjim t1_jednh7n wrote
Whilst this is correct, 1+1=2 will still be true whether there is someone to observe it or not.
Andriyo t1_jeds606 wrote
Maybe it's my background in software engineering, but truthiness to me is just a property that can be assigned to anything :)

Say, the statement 60 + 2 = 1 is also true for people who are familiar with how we measure time.

Anyway, most children do rote-memorize 1+1=2, 1+2=3 - they even have posters with tables in school. They are also shown examples like "a car is one", "an apple is one", etc. - so, basically, what LLMs are doing. Long story short, LLMs are capable of doing long arithmetic if you ask them to do it step by step. The only limitation so far is the context length.
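A minimal sketch (in Python, purely illustrative - nothing from the thread) of the kind of digit-by-digit, carry-propagating procedure that "step by step" refers to here:

```python
def long_add(a: str, b: str) -> str:
    # Add two decimal numbers digit by digit, right to left, carrying as
    # needed - each loop iteration is one small "step" an LLM would be
    # asked to write out explicitly.
    a, b = a.zfill(len(b)), b.zfill(len(a))
    carry, digits = 0, []
    for da, db in zip(reversed(a), reversed(b)):
        total = int(da) + int(db) + carry
        digits.append(str(total % 10))
        carry = total // 10
    if carry:
        digits.append(str(carry))
    return "".join(reversed(digits))

print(long_add("987654321", "123456789"))  # 1111111110
```

Each iteration corresponds to one step the model would be asked to spell out, which is why the available context length becomes the practical limit.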
theotherquantumjim t1_jedspfh wrote
The language and the symbols are simply the tools used to learn the inherent truths. You can change the symbols, but the rules beneath will be the same. It doesn't matter if one is called "one" or "zarg" or "egg" - it still means one. With regard to LLMs, I am very interested to see how far they can extend the context windows and whether there are possibilities for long-term memory.
StevenVincentOne t1_jea66kw wrote
Your correction is correct