Submitted by nick7566 t3_z5yqw9 in singularity
Comments
dasnihil t1_ixyt4hr wrote
It's safe for about a decade, imo. I work in this field and am aware of the challenges in automation. I actively use GPT-3 and other automation libraries at work. Last week I gave it an AMR/discount calculation method and asked it to explain it, which probably saved me a solid hour. I also use it for writing complex SQL or other code, and I just clean up the output it gives, which is usually accurate or at least time-saving. I give it 10 years before my enterprise clients are OK with code coming off a machine going to prod without human code review.
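For anyone curious, this is roughly the shape of that workflow, sketched in Python against the OpenAI completions API; the API key, model name, SQL snippet, and prompt wording here are all placeholders for illustration, not anything from real client work.

```python
import openai

openai.api_key = "sk-..."  # placeholder key

snippet = """
SELECT c.id, SUM(o.total) AS total_spend
FROM customers c
JOIN orders o ON o.customer_id = c.id
GROUP BY c.id
HAVING SUM(o.total) > 1000;
"""

response = openai.Completion.create(
    model="text-davinci-002",   # any of the Davinci completion models
    prompt="Explain, step by step, what this SQL query does:\n" + snippet,
    max_tokens=256,
    temperature=0,              # keep the explanation deterministic-ish
)
print(response["choices"][0]["text"].strip())
```

The same call with a prompt like "Write a SQL query that ..." covers the generation side; the output still gets reviewed like anything else.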
amranu t1_ixyuyr0 wrote
GPT-3 is a solid 2 years old; current LLMs are significantly better at writing and improving code. Software engineering is going to be a small-business-only field in the future, if it exists at all, once these are made widely available. Start learning to craft prompts.
_gr4m_ t1_ixzctmy wrote
The need to carefully craft prompts will be automated away by AI almost immediately. I don't understand how people can think that coding will be obsolete but prompting won't be.
amranu t1_ixzd3fd wrote
Yes and no. Crafting prompts will still be useful for creating your own stuff with AI. But certainly you can use AI to craft prompts for you as necessary.
gantork t1_ixzhmhp wrote
I doubt it. Prompts are a very primitive way of communicating; at some point you will be able to have natural, human-level conversations with AIs instead of prompts.
amranu t1_ixzjzpz wrote
> at some point you will be able to have natural human level conversations with AIs instead of prompts.
You already can, but such conversations are also prompts.
gantork t1_ixzn2oo wrote
I guess you can define that as prompts, but I mean that there will be no such thing as prompt engineering/crafting; you'll just talk to it the same way you'd talk to an artist when commissioning something, in the case of generative AIs, for example.
dasnihil t1_ixzitn8 wrote
That's what I meant by human biases and ambiguities, which span multiple cultures, languages, and human history. We just have to train it better; the data sets seem more primitive than the algorithms.
Artanthos t1_iy1d9hr wrote
Natural human communication is horribly imprecise when designing anything.
Blueprints, flowcharts, artist renditions, etc. are all tools for telling the builders what to build and how it should look.
Autonomous coding will replace the builders.
Altruistic_Rate6053 t1_iy5ftvn wrote
What if we get to a level where AI can sense how you feel and know what you want intuitively, without much prompting? Kinda like the opposite of a "tricky genie" that gives you what you technically want but still leaves you unsatisfied.
Artanthos t1_iy1b35l wrote
You still have to precisely explain what you want done.
Higher level engineers tell coders what to code, but don’t write much code themselves.
Automation will replace most of the coders, but it won’t replace the people telling them what to build.
dasnihil t1_ixyvduq wrote
I've already figured out some of its quirks, since I've been asking it for C#/SQL stuff, even Azure Pipelines YAML and some basic Ansible scripts. Plus I play with a lot of image generation tools. Prompt skill is an intuition that will be taught in schools; we're just a bit early. Eventually, when AI embraces human biases and ambiguities, we won't have to learn much about prompts.
[deleted] t1_iy0uxgr wrote
[deleted]
imnos t1_ixyvu0j wrote
I'm not sure if it's safe for a decade but I've been using Copilot for over a year and it's a huge time saver like you say. 90% of the time it gets what I need with at least 90% accuracy. Really helpful for writing repetitive code.
I find it works best if your codebase is consistent and well structured.
With GPT-4 on the horizon, and DeepMind also having a model that can compete with "the average developer", I'm super interested to see how things change in the next few years.
Polend2030 t1_ixzwbck wrote
Just curious - how much time do you spend coding yourself versus leaning on Copilot? Can a beginner use it without much knowledge of coding?
imnos t1_iy0wz76 wrote
I've been coding for 5+ years professionally so I mostly use it to save the effort of typing.
You need to be able to look at what Copilot spits out and understand if it's correct or not, so I'd say it wouldn't be much use for a newbie. You have to know how to structure your code and then let Copilot fill things out whilst keeping an eye on what it's doing.
Sometimes it saves me from Googling something I don't know how to do, if I can instruct it correctly - then I just test that what it wrote actually works. So you may be able to learn some things from it, but you'll only be able to use it to its full potential once you have some knowledge/experience under your belt.
Frumpagumpus t1_iy4hspr wrote
I think it would still be useful for a newbie (maybe even more useful than for someone experienced). It could function as a somewhat unreliable mentor, lol. Or a learning buddy, I guess.
dasnihil t1_iycrluq wrote
For someone who actually wants to learn coding, it'd be more than a mentor. GPT-3 can explain the problems with a piece of code, or explain the logic behind writing it; it's not just a tool that hands you code. It's a tool with a plethora of knowledge, capable of coherent conversations with its students :)
Polend2030 t1_iy7c7f8 wrote
thanks for the answer - very interesting
VisibleSignificance t1_iy0gi66 wrote
> writing repetitive code
... how much repetitive code do you even need to write anyway?
turbofisherman t1_iy0ochj wrote
Unit tests, for one, are much easier to write with Copilot. Huge time saver!
Pixelmixer t1_iy0qky7 wrote
Dude, generating unit tests directly from properly written specs would be a godsend. Then again, properly written specs would also be a godsend.
imnos t1_iy0xy2v wrote
Er, a lot? If your codebase is well structured, most of it will follow similar patterns which you just need to repeat with different class/variable names etc. That makes it ripe for automation with Copilot.
Then there are the unit tests that cover the above - if you keep the structure similar, Copilot can fly through them.
Likewise for generating things like seed data. Need to create some seeds for your database? Just write out what you need in a comment and Copilot gets it mostly right (rough sketch of that workflow below).
So much time saved.
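To make the comment-driven part concrete, here's a rough, hypothetical Python sketch; the model, the seed fields, and the create_user helper are all invented for illustration, since in practice Copilot completes against your real codebase.

```python
# Stand-in application code that the seeds and tests below refer to.
from dataclasses import dataclass


@dataclass
class User:
    name: str
    email: str
    role: str = "member"


def create_user(name: str, email: str) -> User:
    """Hypothetical app function under test."""
    return User(name=name, email=email)


# You type the comment, Copilot proposes the block under it, you review it:
# seed data: three test accounts, one admin and two members
SEED_USERS = [
    User("Alice Example", "alice@example.com", role="admin"),
    User("Bob Example", "bob@example.com"),
    User("Carol Example", "carol@example.com"),
]


# Unit tests follow the same shape as the ones already in the file,
# which is exactly the repetitive pattern Copilot fills in well.
def test_create_user_defaults_to_member_role():
    user = create_user("Dave Example", "dave@example.com")
    assert user.role == "member"


def test_create_user_keeps_given_email():
    user = create_user("Erin Example", "erin@example.com")
    assert user.email == "erin@example.com"
```

The point isn't that any of this is hard to write by hand; it's that Copilot types the boring 80% for you and you just review it.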
imnos t1_iy0ynmm wrote
I spoke about repetitive code but it's also helpful for when you don't know something - saves you the step of Googling or checking documentation.
gobbo t1_iy07odr wrote
Automation is a two-edged sword. It puts management-level decisions into everyday life where they didn't exist before.
An example I use a few times a week is talking to boomers about how computers are not the automation appliances they had hoped for yet. They still have to figure out how to do many of the things the automation is not fully capable of.
For example, if you are over 45 or so, you probably remember being a young person and seeing white-collar jobs listed like "filing clerk, level one" or "level three".
Those jobs practically don't exist anymore. Everyone does their own filing, and the filing cabinet is a hard drive that they carry around with them or are primarily responsible for. In a large corporation, you might have a sysadmin who is doing some of the filing cabinet maintenance, but you still have to file your own shit.
If you told someone in 1970 that one of the unacknowledged but dominant effects of computers in 2020 would be that more people have to learn to become filing clerks, they would've looked at you funny.
AI fully integrated will be similarly replacing jobs and throwing unexpected responsibilities on us.
Frumpagumpus t1_iy4i893 wrote
Not just AI. I use an operating system called NixOS, which gives you somewhat unprecedented levels of control over building your operating system - or rather, it makes that level of control far more accessible than it had been before (when you would rely exclusively on your distro's package maintainers to build your software). I think NixOS will only get more widespread in industry, semi-usurping Docker (in some of its roles) in some environments. There are probably other examples (maybe YubiKeys/password managers, somewhat?).
I had never recompiled my kernel myself, for example, before using it.
gobbo t1_iy4lfvd wrote
Updoot for mentioning NixOS, which I'm still holding out hope will get a user-friendly-enough package manager.
TampaBai t1_ixzoj89 wrote
10 years ain't that far off mate.
ohnonotmynono t1_iy036dt wrote
GPT-4 will change quite a lot of that.
DyingShell t1_ixz4t49 wrote
So you are already replacing people by using those tools, since that means companies require fewer programmers to do the same work. The tool does most of the heavy lifting and will continue to automate more work each year, making the number of job positions decline with it.
dasnihil t1_ixz58di wrote
Actually, it's the opposite in my company: we hire more, because each person can do more and make more. But I'm the only dude I know of who's doing this.
DyingShell t1_ixz5ih2 wrote
So you simply churn out application after application? You don't think coding jobs will decline as AI automates more and more? Also, you don't know when AI might replace you; nobody does.
dasnihil t1_ixz7w0q wrote
We don't care to count how many applications we can produce. I do my 8 hours and take money home. If I'm replaced, I'm replaced lol.
DyingShell t1_ixz7yp3 wrote
Oh, so you don't actually want to do this? Interesting.
dasnihil t1_ixz8azu wrote
I like doing new things. I've done computers for 20-something years now, but every few years there's a new thing within it. Now I want to build furniture and learn about various disciplines of science, particularly relating biology to computation. Why do you ask?
DyingShell t1_ixz92nk wrote
Oh I didn't realize you were THAT old, sorry.
everything_in_sync t1_ixz9lcj wrote
Yet everyone realized you are extremely young maturity wise.
DyingShell t1_ixza2um wrote
There is nothing better than being young, I'll make sure to visit your grave in the coming decades!
everything_in_sync t1_ixzacnz wrote
What? I'm 30. Whatever.
FTRFNK t1_iy0j6ys wrote
Ah but idiots tend to die first anyways by living an ignorant life filled with mistakes. So maybe he'll be visiting your early grave 🤷♂️
DyingShell t1_iy0jcs0 wrote
that's so rude 😔
FTRFNK t1_iy0jqr7 wrote
You get what you give.
dasnihil t1_ixz9adi wrote
yep, 36 now.
[deleted] t1_iy0o1b7 wrote
[deleted]
dasnihil t1_iy0uv26 wrote
You can just sign up for the beta and use the OpenAI Playground for that. The Davinci model is still dirt cheap imo.
uniquely_Darkly t1_ixyzt9e wrote
Software developers:
Let’s create AI. It’ll be cool, just like in the sci-fi movies we watched growing up.
America:
It took our jerbs
Software developers:
Embrace the future, mate…
AI:
We can program, too. We won’t be needing you anymore.
Software developers:
wait, hold up…?!
funky2002 t1_iy0crsn wrote
Literally me lmao
botfiddler t1_iy1m953 wrote
I think it will only cut out the middle part. The best ones will still be needed, and the occasional coder and generalist will use the new opportunities to be creative with it. Some people might also get protection through human-based certification requirements.
TinyBurbz t1_ixzzi4v wrote
Cue the "project managers" saying "lets just use an AI" for everything now.
[deleted] t1_iy0o6ba wrote
[deleted]
Johnny_Glib t1_ixz9wsz wrote
Lost your job to automation?
"Learn to code" they told us.
But what will they do when we don't need coders anymore?
[deleted] t1_ixzcjxn wrote
[deleted]
CaptainBlocker t1_ixzwano wrote
the only good ending
Lint_baby_uvulla t1_iy0uvdh wrote
Hey, it’s me, skin cancer.
Lemme look at that tasty pasty white skin for ya.
ThePrankMonkey t1_iy18ynz wrote
If we are getting the good ending, AI can cure cancer too.
j_dog99 t1_iy3c8xp wrote
It should be able to cause cancer too, we need balance in the universe
VisibleSignificance t1_iy0glr1 wrote
Yeah, what should we invest in to have passive income when the automation finally hits?
Pixelmixer t1_iy0q7ez wrote
Automation
plinkoplonka t1_iy24y8y wrote
While we hunt our food like in the hunger games.
AvgAIbot t1_iy04so5 wrote
Costco greeters
StandartUser6745 t1_iy19ary wrote
Farming and poultry... in short, husbandry.
[deleted] t1_iy2kn3i wrote
[removed]
_gr4m_ t1_ixze3wu wrote
I, as a software developer, cannot wait! Software eats the world, but we have seen nothing yet. Software is the path to automating everything else; remember that AI/ML and all its different subfields are also software.
When automation of creating high quality software really gets going it will revolutionize the world.
RoboticPro t1_ixyqkny wrote
Just amazing to me how devs were laughing at the other people's jobs they were going to automate, mockingly saying "learn to code or get left behind." Now it's them being attacked and suddenly they don't like it.
[deleted] t1_ixz45lt wrote
I like it a lot, because I haven't been doing it long enough to get paid for it yet.
Honestly, I'd like to work with metal crafts (especially airguns of various kinds, everything from custom NERFs and airsoft to lightweight hunting rifles), but my current finances really don't allow it. So programming it became.
High-pressure hydrogen rifles would be very interesting to make too.
PersonOfInternets t1_iy07fs6 wrote
Well, if we cross into the dimension that includes UBI and human rights for all people after AI takes all the jobs, you'll be able to pursue all of your passions. Otherwise you might be moved down to the server room to sanitize the facility manager's cum-covered keyboard and dust the server racks.
SirDidymus t1_ixzhnpj wrote
As an artist…
Affectionate_Ear_778 t1_iy03eda wrote
AI is generating art now, unless you make physical stuff like frames or woodwork. Seems like the trades will be some of the few jobs left.
SirDidymus t1_iy0d3kd wrote
To be fair, using AI is a daily occurrence in my job as an artist. Without AI, I couldn't even have gotten where I am now…
Business_Royal_201 t1_ixyzwao wrote
That wasn't devs, that was culture war stuff. The right wing made fun of left-wing journos being made redundant. Most devs at Google would be left wing, so again it will be the left losing very good-paying jobs.
This won't affect me at all, since Google would never hire the likes of me. I will continue coding the old way and collect UBI. :)
strangeelement t1_ixzbzku wrote
The depiction of AI in popular culture is almost always based on some piece of hardware: it's always a machine, more often than not humanoid. But that obviously comes later; the robotics aspect of AI will not come first. The first AIs will basically be software services. They will deal in information.
It's not blue collar jobs that will be the easiest to automate, it's the service and administrative jobs. No need to build hardware for most of this, it's IT infrastructure for the most part. The hardware for the rest is already available in people's hands, literally.
The information jobs will be far easier to automate than the manual labor. It's not the janitors who will go first, it's the accountants and other jobs that deal in information.
Then again, it will mostly shift software development jobs away from chasing obscure bugs and toward what's actually important: their usefulness to people, their design and function.
[deleted] t1_ixze4cg wrote
[deleted]
FeepingCreature t1_iy0jpwf wrote
Sorry, who exactly doesn't like it?
TampaBai t1_iy0w382 wrote
Yep, if you ever read Yuval Noah Harari's books, he's that arrogant and contemptuous snob who coined the phrase "useless class" to refer to all of us plebes who don't have degrees from Caltech, MIT, or Stanford, and who don't code or otherwise work in a tech field. Or Ray Kurzweil, who is constantly admonishing the rest of us to "keep up" and learn new trades. The group of people whose jobs are safe and secure will continue to shrink, save a few at the very top, who will continue to consolidate their power and lord it over us with ever-increasing contempt.
[deleted] t1_iy18osn wrote
[deleted]
visarga t1_iy3ljki wrote
> Now it’s them being attacked and suddenly they don’t like it
Hahahaha. You're missing the big picture. Software has been cannibalising itself for 50 years. Every new open source package or library removes a bit of work from everyone else. You'd think we would be out of work by now, but in reality it's one of the hottest jobs. I mean, Wordpress alone automated/eliminated the work of a whole generation of web devs, but there was so much more work coming up that it wasn't a problem.
Work is not a zero sum game. If I could do 1000 units of work, I would plan something. If I could do 100,000 units of work, I would make a different plan. Not just scaled up linearly, but a different strategy. My prediction is that companies are going to take the AI and keep the people as well, and we'll be very very busy. Nothing expands faster than human desires/aspirations, not even automation.
AlwaysF3sh t1_iyaixwr wrote
The guys who didn’t see this coming are idiots.
Sandbar101 t1_ixyrx5v wrote
Thank god
bartturner t1_ixyt80d wrote
Makes sense. It sure has taken a long time for this type of thing.
I was hired in the late 1980s at Contel/GTE to investigate tools that would generate code. That was almost 40 years ago!
outlawsix t1_iy11fww wrote
We're not going to make Skynet; we'll let AI do it for us.
User1539 t1_ixz9u3x wrote
Yeah, I'm a developer, and my 13-year-old daughter, naturally, does some coding and stuff, and people ask her if she'll be a developer like I am when she grows up.
Honestly, I think I'm the last generation of developer. Sure, there will be technical people who create things in the future. But, it won't look like what I'm doing, and it'll probably involve a lot of very high level tools that handle low level work.
I expect my job to be obsolete before I retire.
TinyBurbz t1_ixzzw4n wrote
>Honestly, I think I'm the last generation of developer.
Python is a standard course in Chinese elementary schools; development isn't going anywhere.
User1539 t1_iy0fzjt wrote
What use will that be when you can describe your needs in a natural language to an AI and it will create the application for you?
At that point, why have specialized applications at all? For instance, why would you have a web page that displays a student's grades when an AI can simply tell you what those grades are? Why have an application for entering them when your AI can do that automatically? Why have a map application when your AI can simply tell you where you are, and where you're going?
The AI becomes not only the creator of the interface, but the entire application as well.
What use is Python in that world?
TinyBurbz t1_iy0imex wrote
>What use is Python in that world?
I'll let you know when we get there.
User1539 t1_iy0mnes wrote
I suspect it'll be like Latin. Not used, or useful, but still practiced for fun.
TinyBurbz t1_iy2hj83 wrote
>Not used, or useful
Latin is used heavily in our modern vernacular.
User1539 t1_iy3ce5s wrote
No one is speaking it as a language. I'm sure people will still describe things to AI in terms from computer science.
TinyBurbz t1_iy4b5xy wrote
>I'm sure people will still describe things to AI in terms from computer science.
Unless they are building the AI, or the language the AI is built on, or building a dataset, or creating codeblocks for AI libraries.
User1539 t1_iy4itij wrote
I'm not sure what you're getting at. I mean, obviously we're talking about after a certain level of machine learning has taken place, but we're already seeing experiments in self-coding AI, and Copilot, and I'm sure some people using Copilot are working on machine learning algorithms.
Besides, a lot of machine learning requires massively parallel systems, like CUDA, that are already only reached through library abstractions as it is.
As I've explained in earlier comments, we're already so far abstracted from the bare metal workings of a modern processor, we're closer to what I'm describing than what non-coders imagine we're doing.
We use increasingly high-level languages to describe behavior that's abstracted away by virtual machines and operating systems, and all of that is handled by layers of compilers and interpreters.
There's already very little difference between saying 'Hey, AI, I need an application that will take in a username and password, encrypt the password, and safely store both in a database for later' and a modern boilerplate code generation system.
We're almost there for CRUD apps and basic web interfaces. We can already explain the intricate pieces to something like Copilot.
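To make that concrete, here's a minimal sketch of the boilerplate that one-sentence request expands into, using only the Python standard library; PBKDF2 hashing stands in for "encrypt the password" (which is what you'd actually want in practice), and the file path, table layout, and function names are all hypothetical.

```python
import hashlib
import os
import sqlite3

DB_PATH = "users.db"  # hypothetical path


def _hash_password(password: str, salt: bytes) -> bytes:
    # Salted PBKDF2 rather than reversible encryption: you never need the
    # plain-text password back, only the ability to verify it later.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)


def store_user(username: str, password: str) -> None:
    salt = os.urandom(16)
    digest = _hash_password(password, salt)
    with sqlite3.connect(DB_PATH) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS users "
            "(username TEXT PRIMARY KEY, salt BLOB, digest BLOB)"
        )
        conn.execute(
            "INSERT INTO users (username, salt, digest) VALUES (?, ?, ?)",
            (username, salt, digest),
        )


def check_user(username: str, password: str) -> bool:
    with sqlite3.connect(DB_PATH) as conn:
        row = conn.execute(
            "SELECT salt, digest FROM users WHERE username = ?", (username,)
        ).fetchone()
    return row is not None and _hash_password(password, row[0]) == row[1]
```

None of it is hard; it's just the kind of plumbing that gets written over and over, whether by a person or a generator.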
Tools like this already exist, already using current levels of machine learning to push us towards the next iteration of tools that will use more advanced levels of machine learning, and so on.
We probably won't be completely aware of when we've passed the threshold, because it'll look like just another really neat plugin for your favorite IDE.
TinyBurbz t1_iy4kx0v wrote
>There's already very little difference to saying 'Hey, AI, I need an application that will take in a username and password, encrypt the password, and safely store both in a database for later', and a modern boiler-plate code generation system.
That's the thing, though. Why use an AI when all the AI does is spit out code from a generator and then add whatever modifications you specified from a library of human code? "Self-coding" AIs aren't programming from scratch; they all use libraries. Why call it AI?
User1539 t1_iy4tr9y wrote
I think you're conflating two different aspects of the argument.
You seem to be suggesting that if the code produced is, ultimately, just adding to, modifying, or using existing codebases, then it's not 'AI', or that if it's not 'from scratch' then it's not 'AI'.
There are a few things to break down here. First, the code generated isn't the AI, and if the AI is just stitching together libraries to achieve a goal, well, that's what humans are doing too.
Most libraries will be re-written, by humans, over time, because new languages are invented and newer design patterns are accepted, etc ... and those new libraries, right now, are being written with the help of machine learning.
So, the 'produced code' not being wholly original isn't really any different than what people are doing now.
The 'AI' part of the process is where the pattern recognition abilities of machine learning are leveraged to generate working 'code' from human spoken language.
A computer without a trained natural language processor couldn't be told 'I need a webpage, that you log into, that will display results of a test where the database of the results are ...'
So, you would tell that to a developer, and count on his years of experience to understand how to pull the results of the test into a database, write a simple application to provide some system of logging in, displaying data, etc ...
If a human were doing that, he would likely use something like Spring Boot to generate boilerplate code, then something like Keycloak to handle the security features, and ultimately a front-end JavaScript framework to handle displaying the data.
So, where the AI comes in, is that it can recognize what the human wants from a natural language description and build it without the need for any more input than a human would have to give.
We're almost there, too. We can already describe fairly low-level logic, like sorting through a set of data and retrieving a record based on criteria, then using that record to perform a task, with machine learning systems like copilot.
If we see a broadening of something like that, to allow for the high-level description of complex algorithms, it'll become the de facto standard for creating future AI, and that AI will just be turned right around and used on the problem of understanding natural language and generating code, like a feedback loop.
When the AI is good enough, I'm sure someone will say 'rewrite all these libraries, but find any bugs (and there are plenty), and fix them'.
Then we'll see the tables turn. We'll have AI using code written by AI, to produce applications as described to it from humans speaking natural language.
The compiler is already doing some optimization too. If you code something in a human readable, but ultimately inefficient, way the compiler will likely just re-organize that to be more efficient when it generates machine code.
A good example of where things may go is that AI is starting to find some interesting algorithms in pure math. An important one to pay attention to is matrix multiplication, because it's something that computers have to do all the time, and it's very tedious, and difficult to optimize. In general, there is one good way to do it, and that's what any human will code when asked.
However, under certain circumstances, for specific sizes of matrices, you can optimize the algorithm and save the computer a ton of resources.
Almost no developer, today, even knows these algorithms exist. They're basically an AI curiosity. Even knowing they exist, I'll bet practically no one is using them, because the time and effort to study and code them is more than the general performance gain from implementing them would be worth.
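For the curious, the grandfather of this family of tricks is Strassen's identity, which multiplies two 2x2 matrices (or 2x2 blocks of larger ones) with seven multiplications instead of eight; the AI-discovered algorithms (presumably the recent AlphaTensor results) extend the same idea to other shapes. A quick sketch:

```python
def strassen_2x2(A, B):
    """Multiply two 2x2 matrices with 7 multiplications (Strassen, 1969)."""
    (a11, a12), (a21, a22) = A
    (b11, b12), (b21, b22) = B
    m1 = (a11 + a22) * (b11 + b22)
    m2 = (a21 + a22) * b11
    m3 = a11 * (b12 - b22)
    m4 = a22 * (b21 - b11)
    m5 = (a11 + a12) * b22
    m6 = (a21 - a11) * (b11 + b12)
    m7 = (a12 - a22) * (b21 + b22)
    return [
        [m1 + m4 - m5 + m7, m3 + m5],
        [m2 + m4, m1 - m2 + m3 + m6],
    ]


# Same result as the textbook row-by-column method, one multiplication cheaper.
assert strassen_2x2([[1, 2], [3, 4]], [[5, 6], [7, 8]]) == [[19, 22], [43, 50]]
```

Nobody hand-writes this for ordinary application code, which is exactly the point: it's the sort of special-case optimization a compiler or an AI can apply for you.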
What we'll see, and are frankly already starting to see, is that an AI will recognize those rare, special, conditions under which it can optimize something, and will generate the code to do so.
So, it really won't be long before we see a re-implementation of a lot of those libraries and stuff.
Then we'll all be stitching together AI code ... except, probably not, because we probably won't be coding at all. We'll just be describing our needs in natural language, and the AI platform will do the development.
visarga t1_iy3l7zy wrote
> What use will that be when you can describe your needs in a natural language to an AI and it will create the application for you?
Same thing happened to learning English - it used to be the smart choice, but now translation software removed that barrier.
User1539 t1_iy3psgp wrote
Yeah, and the thing is, we're nearly at the natural language stage for computers to 'understand' what we're describing. On the other end of things, compilers are creating machine language that's so far removed from our 'code', that you couldn't recognize your compiled code anyway.
So, what 'programming' is, in 2022, is using a specialized language to describe to a compiler what machine code to produce.
If you just had a little machine learning on both ends of that, things would change dramatically. The 'code' you describe could be less precise (no more missing semicolon), and the machine code produced might be much more compact and efficient (recognizing specific cases of matrix multiplication that can be optimized).
We're already basically doing this; it's just one little step to say 'why can't I just tell the computer I need to read through all the records and get the one with the user ID I'm working with?'
With Copilot you basically can say that, and as that improves there won't be a need for 'programming languages', just natural-language descriptions of what each piece does. Then, as AI gets better, those pieces will get bigger and bigger until you're just describing applications at a high level.
Eventually you won't even have 'applications', you'll just describe to the AI what you need, and it'll provide it in something it knows is intuitive to you.
Polend2030 t1_ixzx9e1 wrote
I read the same about truck drivers and self-driving vehicles - but they are not that safe, and they will not replace drivers this decade. How good are AI tools compared to a developer?
User1539 t1_iy0g5ni wrote
Eh, we'll see where both self driving and self coding are in a decade. I won't retire for 2 decades or more, so I'm not suggesting it'll happen tomorrow.
NefariousNaz t1_iy2r01e wrote
Is there going to be an issue with a gap in knowledge due to the lack of low-end work?
User1539 t1_iy3d99g wrote
Well, how many people write code in assembly now?
I can tell you that I used to do a lot of assembly. I even wrote an X86 compiler for fun, played with the Z80 to write Gameboy games, and did PIC Microcontroller stuff for a while when working on an engineering team.
I don't think I could write anything meaningful for any new processor, really. I could write enough to get a driver working or something, I'm sure ... but memory management and multi-threading?
Truth be told, we already have that gap. No one is writing much of anything substantial without tools that handle the low-end stuff. Most new languages don't even allow the developer to manage memory, and even when you do, it's mostly 'virtual memory' as presented to the program by the operating system, because the 'real' memory is being managed a layer below most running programs.
We keep abstracting things away, for different reasons.
Most developers have never written a simple 'hello world' in assembly, and even computer engineering students, whose entire purpose is to understand that level of computing, probably haven't written anything that really uses the full feature set of a modern processor.
genshiryoku t1_ixzjw1w wrote
Every IT specialist I know, including myself, is looking into switching careers. The writing is on the wall for the end of the IT sector. This is like being a specialist working in a car factory in 1970s Detroit.
I feel bad for the couple of IT workers that seem to be in denial because they are going to be the ones hit the hardest as they have no Plan B set up.
Polend2030 t1_ixzwyvh wrote
Wow. Is this Copilot stuff really that good? Can a beginner use it?
I really thought that IT jobs would be secure; I can't imagine that everyone could just automate programming software.
AvgAIbot t1_iy04xud wrote
It’s already good now. 10 years from now, it will be insanely good.
vikaramusic t1_iy0a342 wrote
Oh, come on. If you are indeed an IT specialist, you will know that so much of the job lies outside of simply "writing and fixing code".
I'm in IT and I know I will have to keep up and evolve, but your comment is ridiculously fatalistic.
AvgAIbot t1_iy055u0 wrote
What are the careers you're looking into? Off the top of my head, things like electrician, plumber, and HVAC technician will be hard to replace, right?
cirkamrasol t1_iy09z1o wrote
there will always be jobs in security...
Noname_FTW t1_ixzwatp wrote
As someone working in the software development field, I am genuinely curious what the future version of the job is going to be. I can't imagine anything an AI couldn't do equally well or better than a human.
It could be that future software devs do design and technical supervision. Even if an AI can write code 100% bug-free (unlikely) and test it perfectly (unlikely), someone has to tell the AI what the specification of any process should be. And if there is a problem the AI can't solve, it will likely require human intervention to fix or improve things.
So while we will be able to produce a piece of software 10-100x faster, iterating through the versions will likely still require SOME technical personnel.
jugalator t1_iy0ajhd wrote
Yes, I'm not that convinced of an imminent "end" to human software development. Sure, programming may become less manual, but I think software architecture/design will remain manual for the foreseeable future.
I can compare it to me getting an awesome oil painting out of Midjourney already. It feels like anything is possible with a ton of power on my fingertips and the text prompt I give it.
BUT! That's not helpful at all when it comes to matching a client specification. Let's say a new tool is supposed to integrate with the output files of a piece of financial software that was made obsolete a few years ago but still has a decade before being phased out, so they need something to bridge it. This is a quite normal scenario where I work.
An AI won't help you there just like Midjourney won't help me perfectly creating a drawing that matches a client spec to the letter. It'll create something, sure, but it's only going to impress under the assumption there is no clearly defined spec and it has a ton of leeway in what it creates. If it can handwave something out for you, and that is all you ask from it, then sure it's a great tool. If not, it's awful. I can tell Midjourney to recreate Mona Lisa but only because it's been trained on that popular painting specifically. Instead try to give instructions to recreate her without her name and you're facing hell even if Midjourney is fantastic at painting.
So, I think these jobs will involve a ton of guidance but sure, jobs will disappear. Not the field of software development involving humans though. And a current programmer that keeps reasonably on top of things will probably naturally transition into similar roles, maybe only on a slightly higher level. But you can rest assured not just any guy will start whipping together custom AI-guided Python apps anytime soon, even as AI guidance exists. You'll still need to know Python to deal with AI quirks left behind and fill in the gaps, to begin with. Packaging, distribution, client contacts and bug reports, updates, dreadful long meetings etc etc. The entire lifecycle is still there.
Noname_FTW t1_iy0c675 wrote
I agree. I think there will be companies that use AIs to create simple software solutions in a Lego kind of way, where anyone can make their own software, like you already can with homepages today.
But once you get into very complex and specific specifications, you will need skilled humans who can guide AIs to the correct result.
Anything else would require AGI, and at that point we have basically >human intelligence competing against humans. At that point we can no longer sustain the current concept of a labor market.
gbersac t1_iy2yrsj wrote
On one hand I agree with you; on the other hand, we all thought that creativity would be very hard for AI. Result: AI is better at automating creativity (DALL-E) than it is at automating driving.
It seems to be pretty hard to forecast what AI will be good at.
jugalator t1_iy339dx wrote
Yeah sure, my argument isn't that it will be poor at creativity. It's already great at that. But how it can match fluctuating client specs (which depend on the business situation, which boss they just hired and the vision he/she has) and work within their lifecycle policy is still unproven, and this introduces a ton of human, illogical factors.
Or if you don’t work as a consultant like me and maybe write iOS games, the tricky bits instead turn into market analysis and understanding what your gamers want.
The act of programming is sort of the easiest problem in software development, lol
But yeah, if that's all you do and you're told what to do by someone "higher up" in one-way communication from the top, those jobs are probably most at risk?
My experience is that this is however often only a part of our jobs. I transitioned from that role alone within my first three years or something.
CaptainBlocker t1_ixzw835 wrote
everyone gangsta til AI takes away their job that they thought it would never automate
Juicecalculator t1_iy0cedc wrote
Can they please fix their search engine? I feel like it used to be good; now everything that comes up is garbage. I have to put Reddit at the end of everything to get good info.
VeryOriginalName98 t1_ixzjkww wrote
"had a secret..."
LudovicoSpecs t1_iy10948 wrote
One step closer to a world where no one knows how to do anything or fix what gets broken.
t0f0b0 t1_iy0t324 wrote
When AI can rewrite and improve its code, the end will have begun.
Nervous-Newt848 t1_iy099yy wrote
There will be fewer software engineers because of AI coding assistants, but they won't be replaced completely.
Data scientists and AI/ML engineers will definitely still be needed to improve neural network models.
ihateshadylandlords t1_iy0pzce wrote
It’s no surprise they’re trying to make their own version of Copilot.
Brangible t1_iy1k0pv wrote
Good, we can fire all this influx of mediocre programmers that expect promotions for shit code additions
[deleted] t1_iy2lgzs wrote
[removed]
stupidimagehack t1_iy1a544 wrote
World’s safest bet: the pitch decks to CFOs are already happening. AI accelerates to market. Seasonal executive: “Who cares if it’s shit code if it makes money. I get my bonus and I’m out.”
immersive-matthew t1_iy2n77w wrote
Wonderful. Please come fix the bug I have been banging my head against the wall to solve so I can get on to the more creative bits.
Foo-Bar-n-Grill t1_iybgyxy wrote
OK, 800 billion lines of COBOL are waiting.
LevelWriting t1_iy25jnk wrote
It's so cute how some people always think a particular domain is safe from AI lol
visarga t1_iy3ooc0 wrote
As a programmer, I had to learn a new language every 5-7 years or so. Paradigm changes come one after another. We'll just add AI to the toolbox and use it to write code. Even when AI code works well, there is a need to trust it and decide on the various trade-offs. Someone has got to get up close and personal with the code. By the time it can solve everything by itself, we'll be well into AGI, but we'll still get involved to express our goals.
Down_The_Rabbithole t1_ixzg548 wrote
Every serious software developer knows our jobs will be gone within 5-10 years' time. Most of my smarter colleagues are already teaching themselves people skills or going back to college part-time to learn things like psychology, because most STEM jobs will be gone very soon and the humanities might be the only jobs around in a decade.
raylolSW t1_iy080qh wrote
The moment you fully automate software development every other profession will be automated within months
turbofisherman t1_iy0qhes wrote
Heh, I disagree. There's an immense amount of freely available data on the internet that can help anyone (both humans and AIs) become a great programmer. Just think of GitHub, YouTube tutorials, blogs... and programming is, at its core, a text-output activity, so you don't even need multimodal models. Medicine and most forms of engineering are very different. Automating engineering will be a particularly long-tail problem, because it has a much longer feedback loop than programming and not as much training data.
gofyourselftoo t1_ixyovd7 wrote
I’m certain this will work out fine, with zero negative repercussions.
imnos t1_ixyq8xw wrote
Cue the software developers and naysayers saying how bad it will be and how development jobs are safe for many years to come.