
apyrexvision OP t1_iyjcmf4 wrote

11

ChronoPsyche t1_iyjk9ld wrote

Software engineer too. I tried to use ChatGPT to create a web application using a style library I've never used before. Took the code it gave me, plugged it in, and was given dozens of errors. Turns out ChatGPT was using a deprecated version of the library. I then had to go in and manually alter the code to match with the current syntax. By the time I was done doing that, I had basically learned the library from scratch the same as I would have without ChatGPT.
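To give a concrete flavor of that kind of breakage (the style library I was actually using isn't named here, so this sketch stands in with a real Python deprecation instead):

```python
# Model-generated code often targets an API that has since moved or been
# removed. A real example: the abstract base classes were removed from
# `collections` in Python 3.10, but older tutorials (and code trained on
# them) still import from the old location.

try:
    from collections import Mapping       # deprecated path, removed in 3.10
except ImportError:
    from collections.abc import Mapping   # current path, the manual fix

# Once patched to the current location, the code works as intended:
print(issubclass(dict, Mapping))          # True
```

Same pattern as my experience: the generated code was valid for an old version, and the fix was going through it call by call against the current docs.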

Our jobs are safe. While the tools surely will become more advanced, at the end of the day you always need someone who actually understands the code and understands the business requirements. AI is just a tool, and as the tools get more advanced, the requirements will become more complex and everything will balance out.

Just make sure you are always learning the latest tech and keeping on top of things. Even before large language models, software engineering has always been a job that requires life-long learning. The people programming with punchcards probably thought their jobs were gone. Those who kept on top of things still retained the necessary skills to grow with the technology.

You don't need to learn machine learning unless you want to create the tools yourself, but you do need to know how to use them.

17

AsuhoChinami t1_iylcqic wrote

"AI can't do this thing perfectly in 2022 (which nobody expected it to be able to do perfectly in 2022 anyway) so it will never be good at that thing ever." I don't know if that's very good logic.

8

ChronoPsyche t1_iyldr7d wrote

True, it's a good thing that wasn't what I was arguing. I was pretty clearly talking about pre-singularity AI in the near/medium term. Once/if the singularity happens, all assumptions and predictions go out the window. There are just too many unknown variables to even begin to fathom what the status of our jobs will be, much less if the concept of jobs will even be relevant anymore.

By the way, AI doesn't do software engineering "less than perfectly"; it doesn't do it at all. What's being discussed here is programming snippets of code or very small programs. If you ask it how to make large enterprise applications, it will give you general guidelines that you could get off Google, and that's it.

Programming is to software engineering what construction is to civil engineering. One difference is that software engineers also tend to be programmers, but programming is just a tool for building software; knowing how to code doesn't mean you know anything about how to actually build commercial software applications.

EDIT:

It's so difficult for an AI to do because there simply isn't enough training data for such a task. Beyond the fact that most commercial-grade software applications don't have publicly available repos to train on, there is so much more to software engineering for which there is almost nothing to train on.

How do you train an AI to engage in requirements gathering, requirements analysis, client communication, requirements-specific design and architecture, testing, deployment, maintenance, etc.? These aren't things that are neatly encapsulated in trainable datasets. It gets even iffier when we're talking about software that needs to follow any sort of regulations, especially safety regulations.

It will be possible eventually, but not until much more general capabilities, such as human-level logical reasoning and communication, are developed. Basically, software engineering is safe until we have competent AGI, and the singularity comes not long after that. (I say "competent" because nobody is replacing software engineers on large, enterprise-level applications with an AI that does the job poorly.)

5

AsuhoChinami t1_iylfgvz wrote

Was it obvious that you only meant the short term? "While they surely will become more advanced, you always need someone" made it sound as though you meant... well, always.

1

ChronoPsyche t1_iylg7zx wrote

Well, now you know what I meant. No job is safe once/if the singularity happens, but whether the concept of jobs will even matter at that point is anyone's guess. I'd wager it won't. Whether that's because we're all slaves to a master AI or living in utopia is the question.

2

acaexplorers t1_iymkkrn wrote

You don't need some special "singularity" event to happen.

Your job definitely isn't safe. The updates are happening lightning-fast and there are already tons of examples of people posting perfectly usable code.

Slowly but surely, software engineers will only need to have conversations in English with an AI to program. So fewer and fewer jobs available. It won't happen all at once, but there will very quickly be WAY less need to hire programmers.

Already for basic tasks it makes no sense to hire a programmer. And this is an AI that isn't even connected to the internet.

3

thePsychonautDad t1_iyk60ku wrote

I did the same with a C utility I'd wanted for a while but was always too lazy to build. The first version didn't compile. I gave it the error and asked it to fix it. The second version worked. I then asked for 7 different updates to the CLI. It compiled and worked every single time.

I then tried to generate some utilities in Python. Nothing worked.

Just like you said, it was using deprecated versions and antiquated structures and methods.

Once out of beta, the model will be kept up to date regularly and will apparently (unconfirmed, but there are many clues pointing to it) be able to run search queries to look for data and learn from it if it requires knowledge or data it doesn't have yet. So it's gonna get better and better.

In a couple of years at most, it'll probably be able to deal with large multi-file projects too.

4

ChronoPsyche t1_iykljnl wrote

The ability to query the internet will be a game changer for large language models in many ways.

3

Superduperbals t1_iylet9f wrote

Maybe, but ChatGPT isn't adapted for code at all. It's adapted to be a better information-retrieving conversationalist. In the future there will certainly be AIs that are specifically adapted to write code, parse errors, bug-test themselves, etc.

1

ChronoPsyche t1_iylfryt wrote

Certainly. Read my other reply on this thread. Coding is not the same as software engineering. These are the general steps in the software development life cycle.

  1. Requirements Gathering
  2. Software Design
  3. Implementation
  4. Testing
  5. Integration
  6. Deployment
  7. Maintenance

Coding only applies to step #3. It's also the easiest step; any professional software engineer will tell you this. In fact, a lot of coding jobs in developed countries are already outsourced to cheap labor markets (reducing demand for coders domestically). Here in the US, for example, it's very common for software engineers to remotely collaborate with contract-to-hires from India to help speed up implementation.

In general, it's very easy to train AI to program because of how many publicly available repos there are online to train on. In the end, though, those repos are mostly for open-source software and personal projects. Commercial-grade applications usually have private repos that can't be trained on, which limits the applicability of these tools, and even then we're still only talking about the implementation step.

All the other steps are and will remain much more difficult for AI to accomplish because there are no datasets that perfectly encapsulate those processes that can be trained on. It will take AI with much more generalist capabilities in order to be anywhere near competent enough to entirely replace software engineers. We basically need competent AGI before we get to that point.

2

turntable_server t1_iyllv1q wrote

Very good answer. I do think AI will impact all the stages of the lifecycle, some more profoundly than others, but the principle is always the same: it provides suggestions, and it is the human's job to select from them.

I believe lots of software engineering will become test-driven: given some code template, write unit tests and let the AI come up with multiple implementations, then review them. This will affect the outsourcing, but at the same time it will also create new types of jobs both home and abroad.

1

ChronoPsyche t1_iylms0f wrote

>This will affect the outsourcing, but at the same time it will also create new types of jobs both home and abroad.

And that's really the thing. Software engineering as a discipline has always been rapidly changing. It's faster now than ever, but it's been evolving at a disruptive pace ever since Fortran was developed more than six decades ago.

My grand-uncle was among the first software engineers, using Fortran in the 1950s. Nowadays he knows very little about the current state of software engineering. That's mostly due to his choice not to keep current with things, but it just goes to show how fast the field has already been changing.

1

cootiecatchers t1_iylrwnv wrote

For how long? The value of being a software dev will decrease as the job becomes easier to do with the help of AI, and so will the pay scales and the number of hires required to operate effectively. It's only getting worse, not better, for the average dev, but of course the top 10% in their respective fields should have no reason to worry, imo.

1

ChronoPsyche t1_iylwfak wrote

Well, that's assuming the complexity of software doesn't scale up as things get easier. To me it seems that it absolutely would. Software engineering has been getting easier from the very start, yet demand for software engineers has only increased, because the complexity of software has been increasing and the uses for software have exploded through the roof. I see no reason for that trend to change. The nature of the job will certainly change, and nobody should expect to be doing the exact same thing in ten years for the same amount of money, but if they keep up with the tech, their skills will still be needed, unless we have AGI by then.

1

DyingShell t1_iyn5kb5 wrote

Yeah, AI is just a tool, the same way machines are just tools, and machines replaced tens of millions of people in the span of a century.

1

rixtil41 t1_iylalha wrote

I disagree with always needing humans to understand complex code, and that it will never be easy. We could eventually make an AI that understands the code. Let's come back in 2039 and see who's right.

0

TheKrunkernaut t1_iyjf4un wrote

AI interface personnel. That's your value. I won't interface with this shit; I'd interface with you. You define the problems and processes, and you wrangle your AI to handle them. I contract you; you have the AI produce the output.

Recommend: David S. Landes' The Wealth and Poverty of Nations.

4

apyrexvision OP t1_iykwtys wrote

Thanks for the recommendation, just added it to my list.

1

Etonet t1_iyk4rep wrote

The conversion between languages is really cool, but the rest looks about as useful as looking up boilerplate code from "getting started" docs. The pro: fewer clicks to get to what you're looking for. The con: you have no idea whether it's giving you correct information, as opposed to official docs or a trusted Stack Overflow reply.

Interested in looking at more examples of ChatGPT's code tho

2

apyrexvision OP t1_iykw4ww wrote

Yeah, I agree it may not be an immediate replacement, but the number of steps to get to that point seems very small.

1

Etonet t1_iyky6fx wrote

Welp time to be a local town baker

2