hducug t1_j9nq6ku wrote
Reply to Can someone fill me in? by [deleted]
When computers get smarter than us, they can make themselves smarter much better and faster than we can. That smarter generation can then build an even smarter generation, and that one an even smarter one, and so on. Eventually you'll have an AI so smart it's basically a god, and you can do anything with it. That's why it's called the singularity.
hducug t1_j9ejw8a wrote
Absolutely, people have no idea how huge the singularity will be. Although since ChatGPT, a lot more people have become aware of what AI is capable of.
hducug t1_j8rzz81 wrote
It doesn't actually have the feelings of a 14-year-old, it's imitating them. The AI is trained by reading most of the internet, so its response is basically the most average human response on the internet. It really goes to show which age group is dominant on the internet, or at least how people behave there.
hducug t1_j6olh2v wrote
Reply to comment by turbospeedsc in What jobs will be one of the last remaining ones? by MrCensoredFace
Nah man, the whole idea of the AI singularity is that AI can do the jobs of scientists better than they can.
hducug t1_j6jxnhe wrote
The STEM jobs
hducug t1_j6875r5 wrote
Reply to I don't see why AGI would help us by TheOGCrackSniffer
Where does it get the motivation to do so?
hducug t1_j47wwcl wrote
Reply to Should AI receive a salary by flaming_dortos
What is an AI going to do with money? I'd recommend you Google what an AI actually is: it's a computer program that solves logical problems. An AI has precisely zero emotions, no happiness, no sadness, no anger, no fear. It has zero motivation to do anything with money.
hducug t1_j3hgim0 wrote
Great, but what does this have to do with the artificial intelligence singularity? This is more of an r/tech or r/futurology topic. It's like posting JWST photos on r/biology. People want to see AI development/entertainment on this sub, not healthcare development. This would only be a good post if AI had played a big role in it.
hducug t1_j28ed24 wrote
Reply to Is AGI really achievable? by Calm_Bonus_6464
Who cares if we don't know how neurons work. The AIs still show a form of intelligence; that's the whole point. Give Codex a coding problem and it can fix it with its problem-solving skills.
Quantum computers are probably our best chance of achieving AGI. Those computers will be waaaaaaaay more powerful than classical computers in 10 years. I assume you're comparing classical computers with the human brain.
Without consciousness there's no intelligence? So an AI that can beat you at chess with its incredible chess problem-solving skills is not intelligent?
Problem solving is the entire point of intelligence. Sorry, but your hypothesis really lacks logic.
hducug t1_izb9t3y wrote
Reply to comment by Adorable-Effective-2 in STEM careers: What are the less likely to be replaced by AI? by jazzmess
Yes, that is already happening today. What I meant was that the hardest jobs will last the longest, because AI isn't smart enough to do them yet, like inventing or discovering things.
hducug t1_izawztm wrote
All the STEM-field jobs will still exist in 5-10 years; the singularity won't have happened by then. I guess the most complicated jobs in the STEM field will be the last to get replaced, which makes sense to me.
hducug t1_iz64cu3 wrote
Reply to comment by [deleted] in What are your predictions for 2023? How did your predictions for 2022 turn out? by Foundation12a
Ok, that has got to be the dumbest response I’ve ever read.
hducug t1_iz63ak3 wrote
Reply to comment by [deleted] in What are your predictions for 2023? How did your predictions for 2022 turn out? by Foundation12a
That has got to be the dumbest argument I’ve ever read.
hducug t1_iz22ggk wrote
Reply to comment by PolymorphismPrince in What are your predictions for 2023? How did your predictions for 2022 turn out? by Foundation12a
The human brain has something called logic, which the language models don't have. Logic is literally what intelligence is all about. It doesn't matter that prediction models work the same way as our brain; that has nothing to do with GPT-4 being intelligent.
hducug t1_iz1idpl wrote
Reply to comment by Anomia_Flame in What are your predictions for 2023? How did your predictions for 2022 turn out? by Foundation12a
What does that have to do with anything? I'm just stating the fact that GPT-4 doesn't have thinking capacity or an IQ. It's just a language model that generates text based on what it learned from a large variety of data like books, Wikipedia, web articles, etc. Is this really all you have to say?
PS: I can't believe this community is so childish as to downvote me because I crushed their little optimism ego. Some of y'all really are just NPCs with no thinking capacity sometimes, a lot like GPT-4 actually.
hducug t1_iyzrg3f wrote
Reply to comment by Sashinii in What are your predictions for 2023? How did your predictions for 2022 turn out? by Foundation12a
GPT-4 has nothing to do with general intelligence. It's just a language model that predicts what to say and generates text based on that, not a problem-solving AI. It can't score 100 on an IQ test.
hducug t1_ixh0kjc wrote
Reply to what does this sub think of Elon Musk by [deleted]
Elon Musk co-founded OpenAI, btw. Also, things like self-driving cars and Neuralink are huge for AI, SpaceX is revolutionizing space travel, and Tesla made electric cars very popular. So I don't know what the people who say he isn't important for technological progress are smoking.
hducug t1_ixa0xg2 wrote
Reply to comment by [deleted] in How much time until it happens? by CookiesDeathCookies
A lot, but it's mostly things like machine learning, and none of it is AGI.
hducug t1_ix9vj73 wrote
Reply to comment by [deleted] in How much time until it happens? by CookiesDeathCookies
GPT-4 is a text generator. It predicts what to say in response to your questions because it has studied a lot of human text. It's not a problem-solving AGI, just an AI you can have a conversation with.
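To make the "it just predicts text" point concrete, here's a minimal sketch of how a language model continues a prompt one likely token at a time. It uses GPT-2 through the Hugging Face `transformers` pipeline purely as a stand-in, since GPT-4 isn't openly available; the model choice, prompt, and sampling parameters are illustrative assumptions, not how GPT-4 is actually served.

```python
# Minimal sketch: a language model just continues text, one likely token at a time.
# GPT-2 via Hugging Face transformers is used here as a stand-in for illustration.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = "How much time until the singularity happens?"
# The model doesn't "solve" the question; it samples a plausible continuation
# of the prompt based on patterns learned from human text.
result = generator(prompt, max_new_tokens=40, do_sample=True, num_return_sequences=1)
print(result[0]["generated_text"])
```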
hducug t1_ix9urop wrote
Reply to How much time until it happens? by CookiesDeathCookies
3 years ago I thought 2040 would be a possibility, but because of the incredible amount of progress we've made in the last 3 years, I now think something like 2033.
hducug t1_ix8xahy wrote
Reply to Would like to say that this subreddit's attitude towards progress is admirable and makes this sub better than most other future related discussion hubs by Foundation12a
Because the singularity is the most awesome thing that will ever happen to humanity, and we're all hyped about it and get happy just thinking about it.
hducug t1_ix8vq0p wrote
For as long as they want. If you have a superintelligent AGI, it could make a plan to keep it a secret. It's stupid to think you could outsmart a superintelligent AGI.
hducug t1_iwzwy3a wrote
Reply to 2023 predictions by ryusan8989
I guess GPT-4 will be the biggest thing in AI next year.
hducug t1_ja2vg86 wrote
Reply to How Far to the Technological Singularity? by FC4945
Morons out here honestly saying 2025-2030💀💀💀