
you_are_stupid666 t1_izieh33 wrote

This is what people can’t seem to get past, and certainly no one fully understands. To say consciousness is not required for intelligence is to say a pulse is not required for human life, or that molecular bonds are not required for atoms to make up the earth.

I am more inclined to argue that without consciousness there is fundamentally no such thing as intelligence, than vice versa.

For example, what good is solving for infinite digits of pi without a place to use such information? Consciousness is what tells us when we have all of the necessary information in an answer and directs us where to go next. Intelligence is just a commodity; consciousness is what makes our thoughts anything more than a bunch of electrons…

−3

you_are_stupid666 t1_izidlvf wrote

I couldn’t disagree with you any more than I currently do. Equating mimicking behavior with consciously taking action is like saying life is the same as not being dead. While the action and function look the same, in reality they are complete opposites.

The current “AI” mimics intelligence while possessing none. It does not create anything of value until what it produced has been analyzed by a human and then projected onto the world as genuine “AI”.

The singularity is the hardest part by infinity. We approach it asymptotically, and the crossing over is the part that seems forever out of reach.

Your timeline is absurd. Ten years ago we were supposedly six months away from full self-driving cars, because we had done most of the necessary work; we got 90-95% of the way through the problem. The issue is that the last 5-10% is many, many, many orders of magnitude harder than the rest. Staying between the lines, not hitting things, stopping at red lights, etc. is most of what we do while driving. Once in a blue moon, though, we blow a tire, hit an ice patch, have to swerve around a deer, or recognize that an intersection’s stop light is broken and there are two green lights and the other cars aren’t stopping, so I need to slam my brakes right now or I’m going to die, or any of the myriad other things that can go wrong.

This is where you fully understand the difference between mimicry and intelligence. Humans don’t know what they will need to react to in the future, but they have the ability to handle whatever it might be. Computers are nowhere close to this. They have no ability to critically analyze a situation they haven’t seen before. They have no intelligence; they have a database of previous experiences, to which they apply some sort of values over possible outcomes and make binary decisions along dynamic paths, without ever building new paths.

We are decades away from any singularity “event” per se. And we are many decades or more away from some existential, species-wide crisis over where we belong in the universe…

Not to be rude, but imho this is a naive and extremely unlikely expectation for the near future.

4