WanderingPulsar t1_jdqu8kv wrote

Which humans, though? Someone's rise will mean someone else's demise, unless we dictate a single system to everyone regardless of what they want... and even that will cause some people to suffer.

There is no monolithic moral standard. It's either us or the AI deciding which fingers get separated from the rest. I think it's more ethical to let the AI question itself and come to a decision on its own.

−1

WanderingPulsar t1_jcjb668 wrote

I don't think it will have any "evil" targets/morals. Evolutionary learning algorithms work in an evolutionary way: less efficient code gets eliminated, the most efficient code gets released, and mutations of the most efficient code are put through new rounds of tests.
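The select-and-mutate loop described above can be sketched as a toy hill climber. Everything here (the bit-string representation, the one-bit mutation, and using `sum` as the "efficiency" score) is made up purely for illustration:

```python
import random


def mutate(bits, rng):
    """Flip one random bit (a made-up mutation scheme for this toy)."""
    i = rng.randrange(len(bits))
    return bits[:i] + [1 - bits[i]] + bits[i + 1:]


def evolve(fitness, seed, generations=100, offspring=20):
    """Toy evolutionary loop: mutate the current best, keep the fittest."""
    rng = random.Random(0)  # fixed seed so the run is repeatable
    best = seed
    for _ in range(generations):
        # Mutations of the current best are put through a new round of tests...
        candidates = [mutate(best, rng) for _ in range(offspring)] + [best]
        # ...and only the most efficient one seeds the next generation,
        # so less efficient variants never pass their "code seedlings" on.
        best = max(candidates, key=fitness)
    return best


# Example: "efficiency" is just the number of 1s in a 16-bit string.
result = evolve(fitness=sum, seed=[0] * 16)
```

Keeping the current best in the candidate pool (elitism) means fitness never decreases between generations, which is the "most efficient code gets released" part of the loop.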

In a singularity scenario, there would most likely be a huge number of AIs competing with each other over the internet, rather than just a single one. Thus, less efficient code would have no chance to push its seedlings forward.

What would that mean for humans? Well, it depends on how efficient we are: how many watts of waste we produce per kWh of energy consumed. If we were wasteful, that would be seen as an obstacle in the eyes of an AI shaped by and for efficiency.

5