
Coachtzu t1_jcdxi2h wrote

You're cherry picking. He addresses this in the article. We can't afford to be left behind, yet we also don't understand what we are racing towards.

Automation has also already cost jobs. It will cost more. This is not controversial. We need to figure out how we adapt to a world where our work does not and should not define us.

20

iStoleTheHobo t1_jce0127 wrote

>We need to figure out how we adapt to a world where our work does not and should not define us.

Precisely. Nobody seems to talk about this particular point, but let's put it like this: if the artificial intelligence revolution will be bigger than the splitting of the atom, why the hell would we allow the private sector to govern these tools? Do we allow private companies to handle atom bombs?

15

Cheapskate-DM t1_jce1kn8 wrote

Atom bombs require uranium. Uranium comes from mines. Mines occupy land. And if government has any talent it can reliably exercise, it's keeping people away from a given piece of land.

Code has no such restriction.

5

Codydw12 t1_jcf9wnq wrote

> You're cherry picking. He addresses this in the article. We can't afford to be left behind, yet we also don't understand what we are racing towards.

> > But I don’t think these laundry lists of the obvious do much to prepare us. We can plan for what we can predict (though it is telling that, for the most part, we haven’t). What’s coming will be weirder. I use that term here in a specific way. In his book “High Weirdness,” Erik Davis, the historian of Californian counterculture, describes weird things as “anomalous — they deviate from the norms of informed expectation and challenge established explanations, sometimes quite radically.” That is the world we’re building.

> > I cannot emphasize this enough: We do not understand these systems, and it’s not clear we even can. I don’t mean that we cannot offer a high-level account of the basic functions: These are typically probabilistic algorithms trained on digital information that make predictions about the next word in a sentence, or an image in a sequence, or some other relationship between abstractions that it can statistically model. But zoom into specifics and the picture dissolves into computational static.

> > That is perhaps the weirdest thing about what we are building: The “thinking,” for lack of a better word, is utterly inhuman, but we have trained it to present as deeply human. And the more inhuman the systems get — the more billions of connections they draw and layers and parameters and nodes and computing power they acquire — the more human they seem to us.
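The quoted description of these systems as "probabilistic algorithms ... that make predictions about the next word in a sentence" can be illustrated with a toy bigram counter. This is only a sketch of the probabilistic framing, not how modern models work; the tiny corpus and function names here are made up for illustration, and real systems learn billions of parameters rather than raw counts.

```python
from collections import Counter, defaultdict

# Toy illustration of "predicting the next word": count word bigrams
# in a tiny made-up corpus, then predict the most likely follower.
corpus = "the cat sat on the mat the cat ate the fish".split()

followers = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    followers[prev][nxt] += 1

def predict_next(word):
    # Return the word most frequently observed after `word`.
    return followers[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" — it follows "the" twice in the corpus
```

The point of the "computational static" remark is that this transparency vanishes at scale: here you can inspect every count, but in a large model the equivalent "counts" are smeared across billions of parameters.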

None of this seems actually profound or useful to me. Saying that the AIs we build will be alien to our own thinking? To me that, in his own words, is on the laundry list of the obvious.

> Automation has also already cost jobs. It will cost more. This is not controversial. We need to figure out how we adapt to a world where our work does not and should not define us.

And that I fully agree with, but every time I suggest heavily taxing automated jobs as a means to fund Universal Basic Income, I have hypercapitalists call me a socialist for believing people should be allowed to live without needing to work.

5

Coachtzu t1_jcfatna wrote

>None of this seems actually profound or useful to me. Saying that the AIs that we build will be alien to our own thinking? To me that, in his own words, is in the laundry list of obvious.

I don't know if I think it's profound either, but I do think it's a healthy reminder that we don't really understand these algorithms, and that regardless of how human-presenting they are, they are not human and we can't trust them to act in predictable ways. Maybe not particularly helpful, but worthwhile nonetheless (in my opinion).

>And that I fully agree with but every time I suggest heavily taxing automated jobs as a means to fund Universal Basic Income I have hypercapitalists call me a socialist for believing people should be allowed to live without the need of working.

This has happened to me too; I've suggested exactly the same thing (though admittedly I stole the idea from Mark Cuban when he guest-hosted a podcast at one point). At this point everything is socialist if it's different from the status quo, so I try to ignore it.

2

Codydw12 t1_jcfhsmv wrote

>I don't know if I think it's profound either, but I do think it's a healthy reminder. Its a good reminder that we don't really understand these algorithms, and that regardless of how human-presenting they are, they are not human and we can't trust them to act in certain ways. Maybe not particularly helpful, but worthwhile none the less (in my opinion).

And this is fair. AI will not act like a human, nor will it be completely logical in every aspect. We don't actually know how one will act or react, or what it's been trained on.

> This has happened to me too, I've suggested exactly the same thing (though admittedly stole the idea from mark Cuban when he guest hosted on a podcast at one point). At this point everything is socialist if it's different than the status quo though so I try to ignore it.

Indeed. I have given up on trying to predict future economies but the current system won't work much longer.

2

Coachtzu t1_jcfpowm wrote

100% agree. Appreciate the good discourse, it's hard to find on here.

3

Lettuphant t1_jcfdtv1 wrote

This reminds me of two early examples of evolutionary AI design (though I doubt I could find the details; these are from interviews long ago). One was a circuit board an AI designed that looked non-functional, and that no human would have designed, but it worked. The best guess was that electromagnetic interference from one part was interacting with another.

The other was some researchers trying to build the lightest possible body for a drone, who set an AI to designing it. They 3D printed it, and when a friend of theirs who was a veterinarian walked in, he said, "Hey, why do you have a flying squirrel skeleton here?" The AI did what natural selection took millions of years to do, running through the iterations in milliseconds rather than generations.
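The loop behind both anecdotes is the same mutate-and-select cycle. Here is a minimal sketch under toy assumptions: the "design" is a 20-bit string, and the fitness function is a hypothetical stand-in (a real system would score circuit simulations or airframe stress models instead).

```python
import random

random.seed(0)

# Toy stand-in for an "ideal design", encoded as bits.
TARGET = [1] * 20

def fitness(candidate):
    # Higher is better: count of bits matching the target.
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(candidate, rate=0.1):
    # Flip each bit with a small probability.
    return [1 - b if random.random() < rate else b for b in candidate]

# Start from random designs, then iterate: rank, keep the top half,
# refill the population by mutating survivors.
population = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]
for generation in range(100):
    population.sort(key=fitness, reverse=True)
    survivors = population[:15]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(15)]

best = max(population, key=fitness)
print(fitness(best))  # converges toward a perfect score of 20
```

Because survivors are carried over unchanged (elitism), the best design never gets worse between generations; the "weirdness" in the anecdotes comes from the fitness function rewarding anything that works, including solutions no human would think to try.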

2