jugalator

jugalator t1_jeh12mh wrote

GPT-3 was released three years ago, and it took another three years to get to GPT-4, so maybe it will be yet another three years. It feels like the advancements have come mere months apart, but that isn't true. They just happened to launch the ChatGPT site with conversation tuning shortly before GPT-4, but GPT-3 is not "new".

I don't expect some sort of exponential speed here. They're already running into hardware roadblocks with GPT-4 and probably have their hands full right now trying to pull off a GPT-4 Turbo, since that situation is quite desperate. As for exponentials, it looks like resource demand increases exponentially too...

Then there is the political situation as AI awareness spreads. For any progress there need to be very real financial motives (preferably without overly high running costs) and low political risk. Is that what the horizon looks like today?

Also, there is the question of when diminishing returns hit LLMs of this kind. If we're looking at 10x the cost once more for a 20% improvement, it's probably not going to be deemed justified; the innovation will instead be in exactly how much you can do at a given parameter size. The Stanford dudes kind of opened some eyes there.

My guess is that the next major advancement will stay at roughly GPT-4's size.

1

jugalator t1_jd427y8 wrote

I guess it's a pretty good version of DALL-E 2, but the generated images still look like they share the DALL-E DNA to me. I think it's far behind Midjourney V5 and maybe even V4. It did pass my five-fingers test pretty well, though.

Given the recent trend with Bing Chat and GPT-4, I'm a little surprised they broke their streak of underpromising and overdelivering here. Enthusiasts probably won't turn to this one, and it'll be more of a gimmick for now.

15

jugalator t1_jcxw5k8 wrote

> Teach the concepts, then teach a better, more efficient way of doing things.

You just stated why they don't want them in elementary school.

Also, it's hardly punishment to teach young kids why they arrive at certain answers. The calculator skips the steps, so you're doing them a disservice in the long run by not "punishing" them. It's a much harsher punishment to try to catch up on understanding later, because math is really unforgiving about that. The biggest problem in math is generally that students can't follow along because they lack understanding from preceding courses.

10

jugalator t1_jcfcy6v wrote

Yes, I'm just now trying Whisper out. Yesterday evening I was writing a little tool to use its speech transcription, feed the transcribed text to the ChatGPT API, and then have the response spoken back to me via Microsoft Azure Neural Voices.
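
Conceptually it's just three calls chained together. Here's a minimal sketch of one round trip, assuming the pre-1.0 openai Python package and the Azure Speech SDK; the keys, region, voice name and recording.wav path are placeholders, not my actual setup:

```python
# Rough sketch: Whisper transcription -> ChatGPT -> Azure neural voice.
# Assumes openai<1.0 and azure-cognitiveservices-speech are installed;
# keys, region, voice and the audio file name below are placeholders.
import openai
import azure.cognitiveservices.speech as speechsdk

openai.api_key = "YOUR_OPENAI_KEY"

# 1. Transcribe the recorded question with Whisper.
with open("recording.wav", "rb") as audio_file:
    transcript = openai.Audio.transcribe("whisper-1", audio_file)
question = transcript["text"]

# 2. Feed the transcribed text to the ChatGPT API.
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": question}],
)
answer = response["choices"][0]["message"]["content"]

# 3. Speak the answer back through an Azure neural voice (default speaker output).
speech_config = speechsdk.SpeechConfig(
    subscription="YOUR_AZURE_SPEECH_KEY", region="YOUR_AZURE_REGION"
)
speech_config.speech_synthesis_voice_name = "en-US-JennyNeural"
synthesizer = speechsdk.SpeechSynthesizer(speech_config=speech_config)
synthesizer.speak_text_async(answer).get()
```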

I didn't quite get it done yet, but I got most of the way there in a few hours. It'll feel funny to leapfrog Google Home, Alexa and Siri like this, a rare opportunity, lol.

It's easy to make though, so there's no money in it. It's already been done anyway, so this is just hacking for fun. There's already a Siri shortcut for ChatGPT too.

It's pretty wild that an amateur can build this, though, while none of these big billion-dollar companies have a product that does it.

Very strange feeling and moment in science.

1

jugalator t1_j9xznxx wrote

Today's AI is already useful not as an "answer machine" (which is unfortunate, because it will mislead a lot of people using Bing AI now, since both Microsoft and the AI itself give a different impression) but as a very powerful guidance tool.

It may not write software for me, but what it can do is give me large chunks of nearly correct code that I just need to do some quality assurance on. So I don't have to do as much problem solving myself and can focus on bug fixing instead. Guess which part of software development consumes more time?

This is just one example.

We're also looking at it from other angles in my company. Midjourney is making professional logotypes for our internal and external projects, we're looking into using AI for remote sensing science, and so on.

So, I think criticism like this often boils down to an overly simple worldview with no greys, only blacks and whites: if AI can't solve it all, it's useless.

It's like in politics when you only look for simple solutions and quick fixes. We have plenty of parties directly courting those folks because it's well known they're out there. They just don't know they're being exploited; politicians play them like fiddles, presenting quick fixes in time for elections.

AI won't do some single thing that makes a company go "Welp, that's that. Now we can sit on our asses and cash in!" Instead, it's about identifying the places where it can aid your processes.

Taken together, yes, at a large company and depending on the kind of business, the time savings may well be worth $1 million in a year. Engineering salaries aren't cheap, for example.

2

jugalator t1_j9jadh0 wrote

I think there is still a ton to learn about the usefulness of the training data itself, and about how to find the optimal "fit" for an LLM. Right now, the big LLMs simply have the kitchen sink thrown at them. Who's to say that will automatically outperform a leaner, high-quality data set? And again, what counts as "high quality" for us may be different for an AI.

3

jugalator t1_iy339dx wrote

Yeah sure, my argument isn't that it will be poor at creativity; it's already great at that. But whether it can match fluctuating client specs, which shift with the business situation, which boss they just hired and the vision he or she has, and whether it can work within their lifecycle policy, is still unproven, and that introduces a ton of human, illogical factors.

Or, if you don't work as a consultant like me and instead, say, write iOS games, the tricky bits turn into market analysis and understanding what your gamers want.

The act of programming is sort of the easiest problem in software development, lol

But yeah, if that's all you do and you're simply told what to do by someone "higher up" in one-way, top-down communication, those jobs are probably the most at risk.

In my experience, however, that is often only a part of our jobs. I moved beyond that role alone within my first three years or so.

1

jugalator t1_iy0ajhd wrote

Yes, I'm not that convinced of an imminent "end" to human software development. Sure, programming may become less manual, but I think software architecture/design will remain manual for the foreseeable future.

I can compare it to how I can already get an awesome oil painting out of Midjourney. It feels like anything is possible, with a ton of power at my fingertips through the text prompt I give it.

BUT! That's not helpful at all when you need to match a client specification. Let's say a new tool is supposed to integrate with the output files of a piece of financial software that was made obsolete a few years ago but still has a decade left before being phased out, so they need something to bridge the gap. This is a quite normal scenario where I work.

An AI won't help you there, just like Midjourney won't help me create a drawing that matches a client spec to the letter. It'll create something, sure, but it's only going to impress as long as there is no clearly defined spec and it has a ton of leeway in what it creates. If it can handwave something out for you, and that is all you ask of it, then sure, it's a great tool. If not, it's awful. I can tell Midjourney to recreate the Mona Lisa, but only because it's been trained on that popular painting specifically. Try instead to give it instructions to recreate her without using her name and you're facing hell, even though Midjourney is fantastic at painting.

So, I think these jobs will involve a ton of AI guidance, and sure, some jobs will disappear, but not the field of human software development. A current programmer who keeps reasonably on top of things will probably transition naturally into similar roles, maybe just at a slightly higher level. But you can rest assured that not just any guy will start whipping together custom AI-guided Python apps anytime soon, even with AI guidance available. You'll still need to know Python to deal with the quirks the AI leaves behind and to fill in the gaps, for a start. Then there's packaging, distribution, client contacts and bug reports, updates, dreadful long meetings, and so on. The entire lifecycle is still there.

7