Comments


bloxxed t1_j6ehpor wrote

More automation is ultimately a good thing, but at the same time I find this news somewhat disconcerting as a college senior who just switched out of nursing into comp sci to pursue a career in web development. Considering it'll be two years before I get my degree, will I be screwed by the time I graduate?

Then again, with the release of each new model, paper, etc. it seems more and more likely that all knowledge-based professions are at risk of being automated sooner rather than later. Here's hoping for UBI in the near future, I suppose.

39

Iamatomato445 t1_j6ejmvu wrote

Interesting. I’m pursuing counseling. With grad school, it’ll be 4-5 years before I’m done with school. I wonder what this tech will look like by the time I’m licensed. Surely it will find its way into the mental health field as chatbots and AI avatars/companions. I already saw one AI avatar that looks scarily realistic. It’s only a matter of time before this tech starts to compete with humans in every single field.

11

crua9 t1_j6fapk9 wrote

>will I be screwed by the time I graduate?

Doubt it. But 5 years into your job, if you get one, you might be.

>it seems more and more likely that all knowledge-based professions are at risk of being automated sooner rather than later.

Look at robots. Mix the two, and it's most jobs.

8

TheTomatoBoy9 t1_j6h9b5o wrote

Except robots will take a decade+ before being even remotely proficient. And that doesn't take into account the massive manufacturing network needed to be built from scratch to produce enough robots to even remotely affect the world.

So that's what? 20-30 years minimum before you see any visible change to manual jobs requiring fine motor skills starting to get some automation pressure? And that's probably an optimistic scenario.

1

Nervous-Newt848 t1_j6eyxz1 wrote

I would focus on data science if I were you. It's a subfield of computer science which works with neural networks and machine learning. There won't be much need for web developers in a few years, but there will be a large need for data scientists.

5

[deleted] t1_j6emqe2 wrote

[deleted]

4

madvanillin t1_j6f8kep wrote

The skeptics of AI have been losing hard the past several years. GPT4 is due out this year. We'll probably have GPT5 by 2026. GPT5 will likely be the first true AGI. We're almost certainly looking at self-improving ASI before the end of this decade. If I were a high school senior right now, I'd be looking into learning a trade, and preparing for robots to take my job in the next 20 or so years. People who work from desks will be replaced first. Not all desk jobs are going away in 10 years, but most of them will.

My big question is what the wealthy and powerful will do with us when a workforce of billions of people is no longer needed to sustain their desired lifestyles. Right now, rich people need poor and middle-class people to do work for them, and to design and create and build things for them. Then those people need millions more working to supply their needs, educate their children, and so on. But when rich people no longer need poor and middle-class people to keep them in luxury, they could just exterminate us. I'm hoping they'll give us a basic income in exchange for voluntary sterilization. But I believe the desire to prevent environmental disasters and stop the Anthropocene extinction will motivate them to drastically reduce the number of humans on Earth.

Maybe ASI will be too clever, too powerful, and too kind to allow them to continue their lives of extraordinary luxury. Maybe it will eliminate ideas like wealth and trade, and give us a star-trek-style post-scarcity world. Idk.

5

Ok_Homework9290 t1_j6fq2n6 wrote

>GPT4 is due out this year.

OpenAI's CEO said they're planning on holding on to it much longer than most techies would like, so I wouldn't be surprised if it isn't released this year.

>GPT5 will likely be the first true AGI. We're almost certainly looking at self-improving ASI before the end of this decade.

Doubt it. GPT5 will probably come out at some point later this decade, and I doubt we'll get AGI that quickly. The vast majority of AI/ML experts expect it to come later than this decade, with most expecting it to arrive in 2050 or beyond.

>If I were a high school senior right now, I'd be looking into learning a trade, and preparing for robots to take my job in the next 20 or so years. People who work from desks will be replaced first. Not all desk jobs are going away in 10 years, but most of them will.

Learning a trade is awesome and a good idea, but I don't think it's trade-or-bust (in regards to choosing something to learn/study after high school), because I don't think most desk jobs will have been automated in 10 years.

Knowledge work (in general) is a lot more than just crunching numbers, shuffling papers, etc. Anybody who works in a knowledge-based field (or is familiar with a knowledge-based field) knows this.

AI that's capable of fully replacing what a significant amount of knowledge workers do is still pretty far out, IMO, given how much human interaction, task variety/diversity, abstract thinking, precision, etc. is involved in much of knowledge work (not to mention legal hurdles, adoption, etc).

Will some of these jobs disappear over the next 10 years? 100%. There's no point in denying that, nor in denying that much of the rest of knowledge work will undoubtedly change over that time span, and even more so after that, but I'm pretty confident we're a ways away from it being totally disrupted by AI.

4

fhayde t1_j6eiz9v wrote

Don’t think about a career as something you do for money. Try to think of a career as something you get paid to do. It’s a subtle difference, but on one hand, it’s easy to grow tired of doing something you’re not interested in and don’t feel a pull towards. On the other, there may be some things in your life you would do even if no one were paying you to do them. Try to find a way to make the latter overlap with your day job, and it won’t matter what happens in these fields. A strong interest in something cultivates mastery and expertise, and that will be worth something to someone; if no one else, at least yourself.

3

superluminary t1_j6eygaa wrote

Tend to agree. A career is about finding a path through life that suits you which also brings in money. You move from place to place, ideally avoiding things you hate and finding what fulfilment you can.

2

Belostoma t1_j6erizs wrote

You'll be fine.

AI is not going to be able to really replace programmers for a long time, and the people saying it will just don't have much experience with programming in the real world. It's going to take widespread AGI before programmers are obsolete, and it's hard to predict when that will happen, but it's not 2 years.

Learning a programming language is pretty easy, and AI can do that. The hard, time-consuming part is figuring out what specifically you want the computer to do in the first place. Once you do that, expressing it precisely in terms of a programming language is no more difficult (and often easier) than expressing it in natural language. The other huge part of the job is debugging or optimizing code, which requires some deep understanding of how the many pieces of a complex system work together. This is an enormous leap beyond the capabilities of the AI tech that is currently impressing everyone. It's not impossible for theoretical future AI, but it's not just an incremental improvement over models like GPT. It's a whole different kind of thing.

I expect AI to provide increasingly impressive autocomplete features to help make programmers more productive. That's still exciting stuff. It might reduce the market for programmers a little bit by making each one more productive, so people don't need to hire as many of them. But it won't replace them until we have true AGI that can actually reason and understand things.

To be clear, I'm not dissing what OpenAI is doing here. I'm excited for it. But people on this sub especially are badly misjudging some of its implications.

1

nutidizen t1_j6g5nnm wrote

AGI can do everything a human can. And if AGI comes in 6 years...

5

Belostoma t1_j6gewjj wrote

I know AGI can. I'm skeptical that it's only 6 years out at all, let alone only 6 years out from being so widespread that just any employer can hire it at will. My main point is that not-general AI can't even begin to compete with human programmers in the complex jobs most of them actually spend most of their time on. I think humans who can leverage non-general AI to make themselves more productive will be the best programmers for a pretty long time.

1

SurroundSwimming3494 t1_j6gvkgv wrote

It likely won't, though. The vast majority of AI/ML researchers think it'll take longer than that, including the most prominent ones.

1

BootyPatrol1980 t1_j6esxcx wrote

You'll be in great shape, though I wouldn't cling too tightly to the role of a web developer. There will be a lot of work for developers and technology people of all stripes.

One thing I keep highlighting is the places that need people to help build AI and push it forward, and that's not just on the hard algorithmic side. We're going to need people to soothe itches in lots of places where these things can't scratch for years to come.

1

Seek_Treasure t1_j6ewf8h wrote

Come on, the role of "web developer" was automated away years ago.

−1

[deleted] t1_j6f0kvx wrote

[deleted]

3

Seek_Treasure t1_j6jowk5 wrote

Well, that's what I see in my company. No one is writing HTML or raw JS or building REST APIs anymore. Everything is either generated, or behind many layers of frameworks, or both.

2

BootyPatrol1980 t1_j6ftpht wrote

LOL no.

Maybe for small blogs, but premade templates and widgets don't really count when you're talking about real web applications. That's still where most of the dev work goes these days, perhaps more than ever.

2

Seek_Treasure t1_j6jpf6k wrote

I see very large web applications from the inside. They're not really premade, but we're usually several layers of abstraction above HTML.

1

LUNA_underUrsaMajor t1_j6g0cyb wrote

Doesn't this mean people can focus their energy on making coding and programs more advanced than anyone thought possible, if they're not wasting time on the basic stuff?

1

Ok_Homework9290 t1_j6eygwp wrote

>Then again, with the release of each new model, paper, etc. it seems more and more likely that all knowledge-based professions are at risk of being automated sooner rather than later.

I do agree that with the release of each new model we inch closer to the day when the world of knowledge work is greatly disrupted and changed beyond recognition, but I don't think that day is particularly close.

Knowledge work (in general) is a lot more than just crunching numbers, shuffling papers, etc. Anybody who works in a knowledge-based field (or is familiar with a knowledge-based field) knows this.

AI that's capable of fully replacing what a significant amount of knowledge workers do is still pretty far out, IMO, given how much human interaction, task variety/diversity, abstract thinking, precision, etc. is involved in much of knowledge work (not to mention legal hurdles, adoption, etc).

Will these upcoming models change knowledge work and make some white-collar jobs obsolete over the next 5-10 years? 100%. There's no point in denying that, nor in denying that much of the rest of knowledge work will undoubtedly change over that time span, and even more so after that, but I'm pretty confident we're a ways away from it being totally disrupted by AI.

My 2 cents 😊.

0

just-a-dreamer- t1_j6duzha wrote

Of course they hire outside of the US. It is way cheaper.

22

[deleted] t1_j6eswef wrote

[deleted]

1

just-a-dreamer- t1_j6ethlc wrote

They do pay a living wage.

Thing is, a living wage in the US might be $45,000 as opposed to $8,000 somewhere else.

10

[deleted] t1_j6j3e5l wrote

[deleted]

0

just-a-dreamer- t1_j6j4ov7 wrote

How so?

They are not your friend or family, they are a business. They also don't care about country or patriotism.

A living wage is different in every part of the world and that is what they pay.

1

TFenrir t1_j6e9idu wrote

This makes a lot of sense.

A lot of what instruct fine-tuning and RLHF show is that if you provide some high-quality, specifically created data to an LLM while it's being fine-tuned, you get a significant jump in results for the fine-tuned model, versus just giving it more of the same structured data.

In some of the papers I read, a lot of the conclusions are along the lines of "the next step is to see whether more instruction data will improve results."

One of the challenges with this instruction data is that we just don't have a lot of it. We don't have, for example, many recordings of people using computers to complete tasks, like keystrokes and screen recordings.

I don't think this sounds like they are getting "screen" recordings (AdeptAI, for example, is doing that with their model, but only with a browser for now). It sounds more like just accompanying the fine-tuning data with natural-language descriptions is enough to get an improvement, which makes sense from my limited experience with LLMs.
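For concreteness, here's a minimal sketch of what a single piece of that kind of instruction data might look like. The field names and the "### Instruction / ### Response" template are my own illustration, not any lab's actual schema:

```python
# Hypothetical instruction-tuning records: each pairs a natural-language
# task description with a demonstration of completing it. The field names
# and prompt template below are illustrative only.
instruction_examples = [
    {
        "instruction": "Write a Python function that reverses a string.",
        "response": "def reverse(s):\n    return s[::-1]",
    },
    {
        "instruction": "Explain what the shell command `ls -la` does.",
        "response": "Lists all files, including hidden ones, in long format.",
    },
]

def to_prompt(example):
    """Flatten one record into the text a fine-tuning pipeline would consume."""
    return (
        f"### Instruction:\n{example['instruction']}\n\n"
        f"### Response:\n{example['response']}"
    )

print(to_prompt(instruction_examples[0]))
```

The point being: the demonstrations themselves are cheap, but pairing them with good natural-language descriptions is the scarce part.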

Should be interesting. I imagine this is for fine tuning GPT4. The "Codex 2.0", better base model (GPT), better instruct tuning probably as well.

17

crua9 t1_j6fagpj wrote

:)

So I've been wanting to make a few apps for a LONG time. These aren't simple and require AR. I've tried many times to make them, and I did try with ChatGPT a month ago, but no luck. Since it isn't built for coding, I figured, well, it was a nice try. Like, I think it got me 90% there, but I have a ton of errors.

Anyways, hopefully this will make it possible.

5

NodeTraverser t1_j6flkbk wrote

Estimates on what month Sam will fire all the humans and let GPT handle all dev going forward? Could be substantial savings.

Estimates on when GPT will fire Sam? Just imagine, the last big expense cut out of the loop.

5

insectula t1_j6g7l46 wrote

...and if it becomes good enough at coding, it can start to work on programming a better version of itself...

5

lovetheoceanfl t1_j6f6hjb wrote

Can’t wait to hear all the programmers say they aren’t worried. That’s been the basic gist of this sub for a while.

4

Coolguy123456789012 t1_j6fiahx wrote

Yeah, they all claim that their job is much more complex than can be automated and then their explanation fails to describe anything that can't be automated. My guess is we see programming jobs halved at least. Maybe in the long term some of those jobs shift towards working with LLMs or AI but in the short term a lot of people are going to be out of work.

6

SurroundSwimming3494 t1_j6fs04j wrote

I mean, they can of course be biased, but at the end of the day they know the most about what their jobs entail and how easy/hard it is to automate certain programming tasks, not to mention that some are also at least somewhat familiar with AI, which makes their answers more credible.

2

raylolSW t1_j6fk6ty wrote

I mean, I have zero worries; in fact, I'm excited for the future. As tech keeps growing, we still have many things left to achieve using coding: smart cities, VR, AI, etc.

No one is really that worried about short term automation outside this sub which sometimes feels like the flat earth society of AI.

2

lovetheoceanfl t1_j6fozir wrote

No one is worried about short term automation except the people outside this sub? You do realize what you just implied right?

1

raylolSW t1_j6fpdsc wrote

Ya, billions of workers are just going on with their lives not knowing what AI even is.

Just go outside and you’ll see workers having dinners, owning houses, driving their cars and living their lives.

1

lovetheoceanfl t1_j6fq47p wrote

I misread what you wrote in your previous comment.

That said, I think a majority of people are well aware of AI and do have worries in the back of their minds. Give it a year for full-blown panic.

1

MassiveIndependence8 t1_j6om6v8 wrote

That’s not a good thing. If billions of workers don’t realize that they’re replaceable, it will hit them like a truck.

1

BUGFIX-66 t1_j6ei6ib wrote

These large language models can't write (or fix or "understand") software unless they have seen human solutions to a problem. They are essentially interpolators, interpolating human work (https://en.m.wikipedia.org/wiki/Interpolation).

Don't believe me? I built a site to demonstrate this, by testing OUTSIDE the training set. Try it:

https://BUGFIX-66.com

Copilot can solve 6 of these, and only the ones that appear in its training set. ChatGPT solves even fewer, maybe 3.

To test whether ChatGPT can code, you need to give it problems where it hasn't been trained on human solutions to similar or identical problems. Then you need to check the answer, because the language model is dishonest.

It's bogus.
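To make the interpolation analogy concrete: a linear interpolator can only return blends of the sample points it was given; it never produces anything outside them. That's the sense in which I'm saying these models interpolate human work. A minimal one-dimensional sketch:

```python
# Linear interpolation between two known samples. The estimate at any x
# between x0 and x1 is a weighted blend of y0 and y1; nothing outside the
# known values can appear. That's the analogy to a model trained on
# human-written code.
def lerp(x0, y0, x1, y1, x):
    """Estimate y at x on the straight line through (x0, y0) and (x1, y1)."""
    t = (x - x0) / (x1 - x0)
    return y0 + t * (y1 - y0)

print(lerp(0.0, 0.0, 10.0, 100.0, 5.0))  # halfway between the two known values: 50.0
```

If the function you actually want isn't a blend of the samples, interpolation gives you a confident wrong answer, which is exactly what you see when you test these models outside their training set.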

1

WikiSummarizerBot t1_j6ei8g4 wrote

Interpolation

>In the mathematical field of numerical analysis, interpolation is a type of estimation, a method of constructing (finding) new data points based on the range of a discrete set of known data points. In engineering and science, one often has a number of data points, obtained by sampling or experimentation, which represent the values of a function for a limited number of values of the independent variable. It is often required to interpolate; that is, estimate the value of that function for an intermediate value of the independent variable. A closely related problem is the approximation of a complicated function by a simple function.


1

epSos-DE t1_j6fo8z2 wrote

I would rather trust the GitHub coding AI (Copilot).

Tweak it to understand bug fixes and then we have a solid code-suggestion AI.

1

Sandbar101 t1_j6h6vch wrote

Good. Probably the most important job that could be automated; it will be a massive accelerant for everything else.

1

povlov0987 t1_j6j7zra wrote

If they can replace programmers, they can replace all jobs remotely related to computers. And then robots will finish off hard labor.

3

[deleted] t1_j6dxq2i wrote

[deleted]

−7

rainy_moon_bear t1_j6e0f9v wrote

With a few examples, the model can generate a dataset and fine-tune itself to perform the task without examples.

I'm not saying it is a clear path to AGI, but it's definitely not obvious where this technology will lead to when progressed further.

19

TinyBurbz t1_j6elvz4 wrote

>I'm not saying it is a clear path to AGI, but it's definitely not obvious where this technology will lead to when progressed further.

It's pretty obvious this is the next stage of coding macros and auto-completion.

3

CubeFlipper t1_j6efpxn wrote

This article makes no such claims, and I don't see anyone in this thread making such claims either, so who are you responding to?

7