crap_punchline

crap_punchline t1_j7qgpey wrote

So if the AI can't be used for a job application, is it ethical for the AI to help you do your work? How is that fair to other people who do your job? And the next time you write your resume, your achievements will have been AI-assisted, which is also unfair to other applicants.

Will AI be able to fucking help you with anything whatsoever? lol, so fucking shit

24

crap_punchline t1_j78w9ly wrote

They won't get automated. They work as farm labourers, carpenters, seamstresses, blacksmiths, etc., who work for each other on land they own. They never used the tech in the first place.

It's an interesting point though, as it points towards a great bifurcation of humanity: between humans and transhumans. I think you will see a new class of landowners who will want to perpetuate life roughly as it exists for humans today, a sort of Amish for the late 20th century, and lots of people who feel so aggrieved by automation that they want to continue living a life they recognise.

The rest of us will be more open to the rapid changes on the horizon and will view technological augmentation of humanity as an extension of our identity, or as a means of escaping the trappings of humanity altogether.

82

crap_punchline t1_j76zxow wrote

Integrating the next model of GPT into Search is literally the most boring application of this tech I can think of.

It is so low-yield it isn't even worth thinking about. It'll be useful for kids under 16 doing homework or for people looking up recipes. It'll probably save a tiny fraction of search users a few seconds over the answer they would have gotten from Wikipedia or another website.

Yawn. Yawn yawn yawn.

1

crap_punchline t1_j73s62t wrote

I think to understand the impacts of AI, we need to look at the earliest effects that we've already seen:

  • Both the proprietary models and the free open-source models have become powerful enough to put some people out of work (independent anime illustrators, for one). I doubt they have been able to walk straight into another job, because it takes a lot of time to learn new skills, and those skills are at near-term risk of automation too.
  • The way people have benefitted from the AIs so far is that they give everyone the power to create things immediately, for next to nothing. That also means those things are worth next to nothing.
  • The effect, then, is to drive the cost of goods and services close to zero while making them vastly more available and virtually instantaneous.

The only way to make money out of this is to own a slice of the means of production and that means having shares in Microsoft and Alphabet. If you don't own a slice of this then the only way you benefit is from the cost of goods and services being driven to zero.

This process increasingly closes the door on the idea of getting richer than other people. You will either be in the class of people who get wealthier by owning a slice of the means of production, capital that recursively sucks up the productivity gains, or in the class of people who can live increasingly affordably but won't be able to obtain power and luxury.

Ultimately, everybody will gain from this situation, but some will gain far more than others.

The battle will come when people say "OpenAI & Alphabet used all of our work output to create these new means of production" and demand that it be divided up across society. Politicians will probably not care in sufficient numbers, as many of them will be invested in these companies anyway. Plus, everybody's quality of life will be rapidly improving, so why rock the boat?

3

crap_punchline t1_j1j4l5g wrote

Reply to comment by fortunum in Hype bubble by fortunum

lol shit like this has been debated ad infinitum on this subreddit, imagine wading in here saying "HOLD ON EVERYBODY PhD COMING THROUGH - YES, I SAID PhD, YOU'RE DOING IT ALL WRONG" and then laying on some of the most basic bitch questions we shit out on the daily

get a job at OpenAI or DeepMind then get back to us

−2