Submitted by RadioFreeAmerika t3_122ilav in singularity
MysteryInc152 t1_jdrpjd4 wrote
Reply to comment by ecnecn in Why is maths so hard for LLMs? by RadioFreeAmerika
Sorry I'm hijacking the top comment so people will hopefully see this.
Humans learn language and concepts through sentences, and in most cases semantic understanding can be built up just fine this way. It doesn't work quite the same way for math.
When I look at an arbitrary set of numbers, I have no idea whether they are prime or composite, because the numbers themselves carry little semantic content. Determining whether they have those properties requires stopping and performing some specific analysis on them, using rules internalized through a specialized learning process. Humans themselves don't learn math just by talking to one another about it; they actually have to do it in order to internalize it.
In other words, mathematics or arithmetic is not highly encoded in language.
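To make the primality point concrete, here is a minimal sketch (function name and test values are illustrative, not from the thread): nothing about the digits of a number tells you it is prime; you have to run a procedure.

```python
def is_prime(n: int) -> bool:
    """Trial division: primality is a property you compute, not read off."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

# 91 and 221 look no different from 97 and 223, but only the latter are prime.
print([n for n in [91, 97, 221, 223] if is_prime(n)])  # → [97, 223]
```

A model trained only on text has to either memorize such facts or learn to simulate this kind of procedure internally.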
The encouraging thing is that this does improve with scale: GPT-4 is much better at it than GPT-3.5.
ecnecn t1_jdruk43 wrote
Actually you can, with logic; Prolog wouldn't work otherwise. The basis of mathematics is logical equations. Propositional logic and predicate logic can express all mathematical rules and their application.
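As a rough illustration of the point about Prolog, arithmetic can be derived from pure rules. This sketch mimics Peano-style addition in Python, since Prolog itself isn't used in this thread (the encoding and names are my own, not from the comment):

```python
# Peano numerals: zero and a successor constructor, as nested tuples.
Z = ("z",)                      # represents 0
def S(n): return ("s", n)       # represents n + 1

def add(a, b):
    # Mirrors the Prolog clauses:
    #   add(z, B, B).
    #   add(s(A), B, s(C)) :- add(A, B, C).
    if a == Z:
        return b
    return S(add(a[1], b))

def to_int(n):
    """Convert a Peano numeral back to a Python int."""
    return 0 if n == Z else 1 + to_int(n[1])

print(to_int(add(S(S(Z)), S(S(S(Z))))))  # 2 + 3 → 5
```

The rules are tiny, but actually *using* them requires step-by-step computation, which is the distinction the parent comment is drawing.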
MysteryInc152 t1_jdruv58 wrote
I didn't say you couldn't. I said it's not highly encoded in language. Not everything that can be extracted from language can be extracted with the same ease.
ecnecn t1_jdrvfvr wrote
You're right, only parts of mathematics are encoded in language, like logic. It would need some hybrid system.