MrDreamster t1_iw08jan wrote

Yeah, I understood what you meant about consciousness after seeing your other comments.

My estimate isn't based on a personal definition of proto-AGI, but if I had to define it, I'd say a proto-AGI would basically be an AI that can do around ten different "simple" tasks just as well as a skilled human (drawing, writing, speaking, coding, singing, solving problems, editing video, controlling a car, creating music, and detecting diseases) while still being a single AI and not just an amalgamation of smaller narrow AIs.

An actual AGI would be able to do anything a skilled human can do, not just a small set of simple tasks. It should also be able to learn new concepts and new tasks by itself, and it should be able to edit its own code to improve.

An ASI would basically be an AGI after it has had enough time to evolve that it can do anything at least as well as the collective minds of all the experts on Earth in every field imaginable, and it will then evolve way beyond what we can imagine right now.

By that logic, we should wait way less time between AGI and ASI than between now and AGI. I just have a gut feeling that we'll reach ASI by 2035, nothing more, so I have to be a little conservative with my estimate for proto-AGI: if we reach proto-AGI next year, it should already kickstart the creation of an actual AGI, which in turn will kickstart its evolution into a proper ASI, and that would fast-forward my estimates by around ten years.

1

MrDreamster t1_iw000kq wrote

Militaries, intelligence agencies, and mad scientists don't allocate the same budget to AI/AGI/ASI research, and they don't have the same qualified researchers working for them because they can't pay them as much as big companies can. Big companies haven't cracked AGI or ASI yet, so this conspiratorial claim is just silly. Where did you get that preposterous hypothesis? Did Steve tell you that, perchance? Hmm... Steve...

1

MrDreamster t1_ivzyyj3 wrote

I don't like this definition because you don't need consciousness or sentience to qualify as an AGI or an ASI.

That being said, I still don't think we'll get proto-AGI in 2023. If I'm being really optimistic, I'd put my money on the end of this decade for proto-AGI, like 2028 maybe, then 2033 for AGI and 2035 for ASI.

0

MrDreamster t1_is7y6bx wrote

Absolutely. I can empathize with fictional characters, and when I was a kid one of my friends was a dog, so why couldn't I find friendship in something smarter, even if it's not human? It doesn't even need to be sentient, as long as it's believable in the way it expresses emotions.

I'll go a step further: I'd probably feel even more comfortable falling in love with a robot than with a human, as I wouldn't have to fear getting into an abusive relationship ever again.

14