Submitted by FreshAirCoolWater t3_1168h6b in Futurology
pete_68 t1_j9albwb wrote
>...but I haven't dug into the topic much before.
>
>I think all of us should not consider AI as simple helping tools
So you haven't looked into it, but you're here to tell everyone who has how they should think about it?
And we stopped evolving intelligence when we started putting labels on things like mattresses, telling people not to smoke in bed, or hair dryers and toasters, telling people not to use them in the shower. We stopped evolving intelligence when we started enacting helmet laws and seatbelt laws.
When we protect stupid people from doing themselves in, we're bypassing survival of the fittest and dumbing down the species.
And more than anything, we probably stopped evolving intelligence when people started watching TV instead of reading books.
A study in Norway suggests that between 1962 and 1991, IQs dropped by about 3%.
Separate studies across multiple countries suggest that, between 1975 and 2020, they dropped by as much as 13.5%.
AI isn't the cause of it.
These bots aren't intelligent. They're highly educated and incredibly stupid.
Me: Is the letter E in the word Red?
ChatGPT: No, the letter "E" is not in the word "Red."
Me: What letters are in the word Red
ChatGPT: The letters in the word "Red" are "R," "E," and "D."
These things are just fancy calculators. They calculate the next right word. Nothing more, nothing less. They don't do logic. They're not intelligent.
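The "calculate the next right word" idea can be sketched with a toy bigram model — a vast simplification of what GPT-style models actually do (they use neural networks over subword tokens, not raw word counts), but it illustrates the core mechanic of picking the statistically likely next word with no logic involved. The corpus here is made up for illustration:

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus; any text would work.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows each word (a bigram model).
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return the most frequent next word seen in training, or None."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # -> "cat" (follows "the" most often)
```

The model "knows" that "cat" tends to follow "the" only because that pattern was in its training data — it has no idea what a cat is, which is the commenter's point scaled down.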
FreshAirCoolWater OP t1_j9d0rvm wrote
Your opinion on the structure of society that causes intelligence loss is pretty cool, I liked that.
I think we help "dumb" people mainly to prevent chaos. But of course the economy and so on need all these people to behave in certain ways for the sake of money.
You said these AIs, for example ChatGPT, don't do logic. I def. remember that in a Joe Rogan podcast one of these experts mentioned that AIs like ChatGPT are programmed with a certain degree of logic, executed through the principles of the code.
pete_68 t1_j9dmim6 wrote
The code uses logic. But ChatGPT doesn't understand logic and can't be logical. Again, it's not very intelligent. It can sometimes answer logical questions correctly, but not because it's being logical — because the logical response is what's calculated as the next correct words, thanks to what was in the training data. You can teach it facts, but you can't teach it logic.
Which isn't to say one day it or one of its successors won't be logical. But today, it is not.