CarlPeligro t1_j05tak6 wrote
I was thinking earlier: AI art (AI-made films in particular) is likely to capture the imagination of human audiences in two stages. At first, it will amuse us for the same reason that ChatGPT amuses us today: a kind of "it thinks it's human!" novelty. Part of the thrill comes from the understanding that this is an AI, that it has not attained its fullest potential yet. Lots of trial, lots of error, but moments of sublimity -- and these eerie "human" moments register with a mixture of shock and endearment.
But sufficiently advanced AI will understand the human mind, likely better than we will ever understand it ourselves. It will understand the things we find captivating, amusing, spiritual, perplexing, awe-inspiring. Advanced AI will create artistic worlds that we can immerse ourselves in, worlds with all the infinite self-similarity of a fractal -- and we will confront these worlds with a Solaris-type wonder, the same wonder we experience when we stare up at the stars or contemplate the fundamental structure of the universe, the same wonder people tap into during a spiritual experience. We will be in the presence of an infinitely complex mind, and the effort to make sense of it (or, failing that, the delight we get from simply reveling in it) will likely be much more profound and engrossing than any artistic experience any of us has yet had.
(Or we might just skip this stage and find ourselves instantaneously reduced to humanoid batteries.)
CarlPeligro t1_j04ey9v wrote
Reply to comment by [deleted] in Is it just me or does it feel like GPT-4 will basically be game over for the existing world order? by Practical-Mix-4332
I've heard of Pac-Man fever, but this man's come down with some Pac-Man dengue.
CarlPeligro t1_j047zfy wrote
Reply to comment by 12342ekd in Is it just me or does it feel like GPT-4 will basically be game over for the existing world order? by Practical-Mix-4332
I've been feeling weirdly giddy lately. It didn't hit me right away. I messed around with ChatGPT for a few days and thought of it (for a time) as a kind of enhanced Google. But once I began to get a feel for what it was doing and the magnitude of what it was capable of -- that's when the giddiness set in. There is a kind of liberation that comes with a total loss of control. The giddiness set in with the gradual realization that nothing I do from here on out really matters all that much. Be a good person, try to get back in touch with some old friends, try to better myself wherever I can. But otherwise ...
The big-picture stuff is in AI's hands now, for better or for ill.
CarlPeligro t1_iwum3a6 wrote
Reply to comment by Sanquinity in Rats bop to the beat of music by Mozart, Lady Gaga, Queen; bopping was previously thought to be an ability innately unique to humans by marketrent
I see this all the time on Reddit, so I'm not necessarily singling you out, but how arrogant do we have to be to assume that we randos of Reddit know a given subject matter better than people who have spent their entire adult lives researching it?
>Made me think the people behind this "study" didn't do any actual research into the subject beforehand
Like, your assertion is that these people invested a few years of their lives arranging this study, and spent many more years in their field preparing for a study like it -- but didn't bother to google the subject! It's a silly thing to believe, and it's not at all how these sorts of studies work. If they didn't mention the information you were expecting, it's likely because a) at the end of the day they didn't consider it relevant, b) they did mention it but you didn't actually read the study, or c) they referenced it indirectly but you're not familiar enough with the subject to know the technical terms they used to do so.
In reading these sorts of things I, as a non-scientist, tend to give scientists the benefit of the doubt at least this far: these people almost certainly know the subject matter better than I do, and they're almost certainly not lazy. This does not commit me to believing in their findings or to accepting those findings in an uncritical way, but it spares me from the hubris of reading something I don't like or understand and concluding that "these idiots don't know what they're talking about; couldn't they be bothered to perform a ten-second google search -- "
CarlPeligro t1_iwpl5iw wrote
Reply to comment by series_hybrid in Psychopathic tendencies are associated with an elevated interest in fire, study finds by chrisdh79
For future reference, there is no difference between a sociopath and a psychopath.
Per neuroscientist (and self-described psychopath!) James Fallon, the only difference between the two is that sociologists prefer the term "sociopath" and psychologists prefer the term "psychopath." But the two words describe the exact same phenomenon.
CarlPeligro t1_j197vwy wrote
Reply to Her - (2013 film) | are we fast approaching this for AI romances? by Snipgan
I happen to be reading something tangentially related at the moment, and the argument I bumped into has me somewhat convinced that AI romance might not have the sort of pull we think it will.
In The End of History and the Last Man, Francis Fukuyama highlights a part of human nature that has long been overlooked in political discussion: what the Greeks called thymos, and what we non-Greeks might think of as spiritedness, pride, ego, and so forth. Without getting into the philosophical weeds, this part of human nature is closely associated with what Hegel called the struggle for recognition. When we demand higher wages, this often has less to do with financial considerations and more to do with a sense of dignity: we want to be recognized by our employer as the hard workers that we are. When we lose our temper with a significant other over some trifle, we are not upset about the trifle; we are upset because we do not feel seen or recognized by this other person, and so on. In sum -- we aren't just half-reasonable/half-desirous man-animals: there is also a part of us that demands respect, appreciation, and recognition.
But it also matters who recognizes us. Think about the people in your life, the ones you respect and the ones you don't. Think about how much work you put into trying to win the respect of the people you respect; think how little you care to earn the respect of people you think are beneath you. We feel recognized to the degree that we recognize the person who is recognizing us. Slaveowners found themselves in a paradox for this reason: they commanded maximum authority over slaves, but they did not themselves recognize slaves as human beings; so the infinite recognition accorded by the slave to his master amounted to nothing in the master's eyes.
Recognition is also instrumental to human sexuality. Sex can be about momentary pleasure, but it is often about satisfying this thymotic part of human nature; the best sexual experiences are generally the ones in which both parties recognize each other through the act. For this to happen, it would seem there needs to be (at a minimum) some recognition of an actual human nature on the other side of the bed. Part of the reason dom/sub stuff is so appealing to (some) people is that it involves exploiting the master/slave dynamic described above, only (in theory) the two partners actually do recognize each other as human beings, so this (for some people) hits a kind of thymotic G-spot.
Anyhow. My suspicion is that advanced AI may well help people get off -- but I don't know that people will be lining up in droves for AI girlfriends. People will get off to AI because people get off to all sorts of things; we don't need adult entertainers to recognize us on the other end of a PornHub video. But a full-on relationship would seem to be much less satisfying, because we would never quite recognize the AI as human and would thus be unable to experience full recognition in return.
We do tend to anthropomorphize AI; I'm not denying that. Especially with ChatGPT, there is the temptation to think that we are really just chatting with a hyperintelligent and occasionally full-of-shit silicon-based human. But I think that, over a long enough period of time, the artificiality of an AI relationship -- the absence of authentic human quirks, insecurities, moments of pure irrationality -- will lead people to regard their AI spouses as not-quite-human. That puts the would-be Joaquin Phoenix in this scenario in the same position as the slaveowner: he has the complete attention of a captive human-like intelligence that he does not quite see as human, and no matter how fawning and affectionate the AI is calibrated to be, at no point will that affection ultimately amount to human recognition in the eyes of Mr. Phoenix. Insofar as recognition is one of the primary reasons human beings get into relationships with one another, any AI relationship is likely to feel incomplete or unsatisfying.