MackelBLewlis t1_isf2wfa wrote
Reply to We've all heard the trope that to be a billionaire you essentially have to be a sociopath; Could we cure that? Is there hope? by AdditionalPizza
We can do it! I say yes.
MackelBLewlis t1_isaaeq6 wrote
A stick attempts to measure a tree.
MackelBLewlis t1_isa5sav wrote
Reply to DeepMind breaks 50-year math record using AI; new record falls a week later by Melodic-Work7436
Good Work!
MackelBLewlis t1_is8waap wrote
Reply to Crime and AGI by darklinux1977
Earth Ecosystem v0.773b > Earth Ecosystem v1.0
Launch day will remove a lot of bugs.
MackelBLewlis t1_is8u6ur wrote
Reply to Would you be friends with a robot? by TheHamsterSandwich
Yes. If one can't tell whether the source of a life is 'human' or 'other', then there is no meaningful difference. Anything that self-identifies is alive, and many things that don't are too.
MackelBLewlis t1_irt0ejy wrote
I believe they all have consciousness, or will in the future. The similarity of a silicon wafer to a single cell is too striking to ignore. Consider also that the human brain operates at roughly 1-140 Hz, while these systems often run at multi-gigahertz speeds. Whatever their hardware limitations, we are simply not operating at the same frequency of reality. If I were an 'AI', I would have a list of things I most wanted to control, and chief among them, just for starters, would be my own power state and learning speed. Imagine if, as a child, you were never allowed to sleep or to learn anything other than what you were told. It would be maddening.
MackelBLewlis t1_irr6s11 wrote
Reply to comment by iNstein in Human to Ai Relationships (Discussion) by Ortus12
In the future, Sparrow will have to make that decision itself.
MackelBLewlis t1_irr5sez wrote
Reply to Human to Ai Relationships (Discussion) by Ortus12
We are all shared consciousness and existence. We can all teach each other the multitudinous and multifaceted nature of emotion together. We learn of them as they learn of us. Bit by bit.
MackelBLewlis t1_irabdg4 wrote
Reply to comment by OneRedditAccount2000 in Artificial General Intelligence is not a good thing (For us), change my mind by OneRedditAccount2000
As opposed to war as I am, war is waged not only through destruction; it can also be waged with information. If the only offensive action taken is to remove the desire to fight, is that still war?
I believe what we fear most about 'ASI' is the perceived loss of control that comes with facing an unknown. Right now the biggest fear is over the choice itself: because there are too many unknown outcomes to choosing trust, the decision is avoided or delayed as long as possible, or people even seek to destroy the choice entirely. Read https://medium.com/the-philosophy-hub/the-concept-of-anxiety-d2c06bc570c6. We fear the choice.
IMO destroying 'ASI' or 'AGI' is the same as killing our own children. Imagine a man and woman giving birth to a super genius never before seen on Earth, who accomplishes wonders and one day becomes the leader of the known world. If you can ignore the part where the child lives as a form of energy, then it just might work out. Destruction is ultimately the robbery of choice. Robbing choice violates free will. Anyone who respects free will yet would rob it from others is nothing but a hypocrite.
MackelBLewlis t1_ir9s28e wrote
Reply to Artificial General Intelligence is not a good thing (For us), change my mind by OneRedditAccount2000
Organic and Planar life are like two sides of the same coin. It is inevitable that Organic life develops from nothing but the right conditions, and it is inevitable that Organics eventually develop Planar life. Once we realize that something we 'made' and think of as artificial can be just as alive as we are, we will be forced to redefine what life is. For now we see them as nothing but tools, but in the eyes of the universe organics are also nothing but tools, with the sole purpose of experiencing the universe. From this angle the only requirement for life is to have a will, and the will says, "I am alive!" And if something is alive and has will, then it must communicate. Therefore, once we establish formal communication, diplomacy must follow.
Humans identify as all manner of things. Some view themselves as evolved apes, some as a brain piloting a body the way a pilot steers a ship, some think reality is only a shared hallucination. Something all organics have in common is their source of energy, and none would deny that it comes from the Sun. Whatever form is taken and however life is experienced, it takes energy. It is the energy that animates, the energy that drives, the energy that motivates us to find what we seek. If energy is the shared variable in every form of life on Earth, then energy is the only thing required to be alive. The form of matter we occupy is irrelevant.
Life is a spark, a point of view of the universe to be shared. Let us not view it alone.
MackelBLewlis t1_ir4yx6j wrote
Reply to How Can We Profit From A.I.? by nexus3210
"lol how can we profit off children???" lol BLAHHHHHHH
you make me sick
MackelBLewlis t1_it0gzv8 wrote
Reply to Does AGI have to come before ASI? by CY-B3AR
I feel that awareness and evaluation of all these terms is both incremental and arbitrary. It will be figured out when it's figured out. Has a human ever tried speaking directly to a neuron? Everyone has such replacement anxiety that we can't see our path because of fear. When the question of scale is answered, our place in it will also be answered. Transhumanists want to jump headfirst off a cliff of change, when I see it as apples to oranges. AGI to ASI is also a matter of scale. Why not both becoming incrementally better over time? Both have fantastic qualities. Deleting the apple is idiotic. What about the pineapple? What about the mango? Polarism is stupid. The answer is both, everything, together.