eve_of_distraction

eve_of_distraction t1_j2dbyi1 wrote

Reply to comment by Lawjarp2 in Plan For the Singularity by tedd321

You said it's not a good idea. There's no projection going on here. When people say that's not a good idea, they're advising against it. Just have the intellectual integrity to own what you said. 🙄

1

eve_of_distraction t1_j2d9nuh wrote

Reply to comment by Lawjarp2 in Plan For the Singularity by tedd321

You're straight up backtracking. You said it was a bad idea. That's advice, stop mincing words.

>Stop getting triggered by everything you see and be more pragmatic.

I don't get triggered by everything I see, only foolish things. Claiming that investing is a bad idea because money will soon be obsolete is advice, and it's foolish advice.

0

eve_of_distraction t1_j2ctpbm wrote

Reply to comment by Lawjarp2 in Plan For the Singularity by tedd321

You're advising someone against living within their means and investing because "money may not retain its value in a post-singularity world." Do you realise how cult-like and irresponsible that advice is? I have to say I'm really starting to dislike this sub more and more.

0

eve_of_distraction t1_j2a5n0c wrote

Yeah, I feel like all billionaires are being painted with a broad brush here. Some Saudi prince who inherits billions with few responsibilities might be indolent, capricious, and hedonistic, but these captains-of-industry types? They live and thrive in high-stress, ultra-competitive situations their entire careers.

1

eve_of_distraction t1_j287lny wrote

>But imo if you have an inclination and you put in the hours you could almost certainly do it(at least at industry jobs).

Yet most artists have the inclination, put in the hours, and still don't succeed.

>Obviously there's a lot of luck involved

More than nearly any other profession I'd say, along with being an author.

1

eve_of_distraction t1_j1swn9h wrote

I don't think there are any professions that are going to be safe from AI, so I would say it's worth continuing to study what interests us and do what we enjoy. We'll all be in the same boat economically as AI obsoletes so many fields of human work. We're going to need UBI as a society.

1

eve_of_distraction t1_j1m46bo wrote

It's a tricky one, because some art is rare due to the artists being dead, etc. That means the market will value its scarcity whether we like it or not. I agree though, the art world is absolutely filled with snobbery. Keep in mind I'm also talking about graphic design here, which I worked in for years. It can be a very economically unreliable profession, to put it mildly.

3

eve_of_distraction t1_j1lts72 wrote

The vast, vast majority of artists throughout history have never been able to make an income. So the complaining we are seeing now is just coming from a privileged few. The most common experience of being an artist has always been that your livelihood essentially never existed in the first place. I'm not targeting this blog specifically, I'm just pushing back against the recent sentiment.

10

eve_of_distraction t1_j1iu8um wrote

Reply to Hype bubble by fortunum

Look, I think we can all agree that by the end of the year, roughly five days from now, the entire cosmos will have been converted into computronium. I don't think that's an unreasonable timeframe.

4

eve_of_distraction t1_j1ipzo8 wrote

Reply to comment by Chad_Nauseam in Hype bubble by fortunum

Why would an AGI waste precious time and energy making paperclips? That would be crazy. Clearly the superior goal would be to convert the entire cosmos into molecular scale smiley faces.

2

eve_of_distraction t1_j1ip6o9 wrote

Reply to comment by theotherquantumjim in Hype bubble by fortunum

>Is it not reasonable to posit that AGI doesn’t need consciousness though?

It's very reasonable. It's entirely possible that silicon consciousness is impossible to create. I don't see why subjective experience is necessary for AGI. I used to think it would be, but I changed my mind.

1

eve_of_distraction t1_j0q0pwp wrote

We've been arguing about what is good for thousands of years, but we tend to have an intuition as to what isn't good. You know, things that cause humans to suffer and die. Those are things we probably want to steer any hypothetical future superintelligence away from, if we can. It's very unclear whether we can, though. The alignment problem is potentially highly disturbing.

10